WorldWideScience

Sample records for evolutionary computing electronic

  1. Topics of Evolutionary Computation 2001

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    This booklet contains the student reports from the course Topics of Evolutionary Computation, Fall 2001, given by Thiemo Krink, Rene Thomsen and Rasmus K. Ursem.

  2. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    ...evolutionary algorithms, such as memetic algorithms, which have emerged as a very promising tool for solving many real-world problems in a multitude of areas of science and technology. Moreover, parallel evolutionary combinatorial optimization is presented. Search operators, which are crucial in all...

  3. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces, ...

  4. Scalable computing for evolutionary genomics.

    Science.gov (United States)

    Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert

    2012-01-01

    Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel by running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster and pipeline in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, whenever required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project...
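
    The "poor man's parallelization" described here is simple to sketch. Below is a minimal, hypothetical Python illustration: each input file becomes one whole-program job executed as a separate OS process, the same pattern a cluster job scheduler automates at larger scale. The muscle command line and the data/ layout are assumptions of this sketch, not something prescribed by BioNode.

```python
# Hypothetical sketch: parallelize a legacy tool by running whole programs
# as separate processes. The "muscle" invocation (one of the packages the
# abstract lists) and the data/ layout are illustrative assumptions.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def run_job(fasta: Path) -> int:
    # One job = one whole-program run on one input file.
    cmd = ["muscle", "-align", str(fasta), "-output", str(fasta.with_suffix(".aln"))]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    inputs = sorted(Path("data").glob("*.fasta"))
    # Threads suffice here: the heavy lifting happens in the child processes.
    with ThreadPoolExecutor(max_workers=4) as pool:
        codes = list(pool.map(run_job, inputs))
    print(f"{codes.count(0)}/{len(codes)} jobs finished cleanly")
```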

  5. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful approaches to applying machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Hot current research...

  6. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals, or several groups of individuals, can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly, so as to achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamentals from this perspective. This paper is a first step towards that objective, implementing a strategy equilibrium solution (such as a Nash equilibrium) in an evolutionary computation algorithm.
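
    A toy illustration of the mechanism-design view: treat two individuals as self-interested agents, each choosing an operator "strategy", and check which strategy profiles are Nash equilibria (no agent gains by deviating alone). The strategy names and payoff numbers below are invented for illustration; they are not from the paper.

```python
# Minimal, hypothetical Nash-equilibrium check for operator-strategy selection.
# A profile (s1, s2) is a Nash equilibrium if neither agent can improve its
# payoff by unilaterally switching strategy. All payoffs are made up.
import itertools

strategies = ["mutation-heavy", "crossover-heavy"]
# payoff[(s1, s2)] = (payoff to agent 1, payoff to agent 2)
payoff = {
    ("mutation-heavy", "mutation-heavy"): (2, 2),
    ("mutation-heavy", "crossover-heavy"): (1, 3),
    ("crossover-heavy", "mutation-heavy"): (3, 1),
    ("crossover-heavy", "crossover-heavy"): (2, 2),
}

def is_nash(s1, s2):
    u1, u2 = payoff[(s1, s2)]
    best1 = all(payoff[(alt, s2)][0] <= u1 for alt in strategies)
    best2 = all(payoff[(s1, alt)][1] <= u2 for alt in strategies)
    return best1 and best2

for s1, s2 in itertools.product(strategies, repeat=2):
    print(f"{s1:15s} vs {s2:15s}:", "Nash" if is_nash(s1, s2) else "-")
```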

  7. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China, sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world, in both industry and academia, to share state-of-the-art results, to explore new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  8. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up...

  9. Computation Environments (2) Persistently Evolutionary Semantics

    OpenAIRE

    Ramezanian, Rasoul

    2012-01-01

    In the manuscript titled "Computation environment (1)", we introduced a notion called a computation environment as an interactive model for computation and complexity theory. In this model, Turing machines are not autonomous entities and find their meanings through the interaction between a computist and a universal processor; thus, due to the evolution of the universal processor, the meanings of Turing machines can change. In this manuscript, we discuss persistently evolutionary intensions. W...

  10. Electronics and computer acronyms

    CERN Document Server

    Brown, Phil

    1988-01-01

    Electronics and Computer Acronyms presents a list of almost 2,500 acronyms related to electronics and computers. The material for this book is drawn from a number of subject areas, including electrical, electronics, computers, telecommunications, fiber optics, microcomputers/microprocessors, audio, video, and information technology. The acronyms also encompass avionics, military, data processing, instrumentation, units, measurement, standards, services, organizations, associations, and companies. This dictionary offers a comprehensive and broad view of electronics and all that is associated with...

  11. From computers to cultivation: reconceptualizing evolutionary psychology

    Science.gov (United States)

    Barrett, Louise; Pollet, Thomas V.; Stulp, Gert

    2014-01-01

    Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on “cognitive integration” or the “extended mind hypothesis” in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human “mind-making” within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach. PMID:25161633

  12. From computers to cultivation: reconceptualizing evolutionary psychology

    Directory of Open Access Journals (Sweden)

    Louise eBarrett

    2014-08-01

    Full Text Available Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behaviour, and the rigour with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on cognitive integration or the extended mind hypothesis in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human ‘mind-making’ within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach.

  13. From computers to cultivation: reconceptualizing evolutionary psychology.

    Science.gov (United States)

    Barrett, Louise; Pollet, Thomas V; Stulp, Gert

    2014-01-01

    Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on "cognitive integration" or the "extended mind hypothesis" in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human "mind-making" within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach.

  14. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  15. Evolutionary Games and Computer Simulations

    CERN Document Server

    Huberman, Bernardo A.; Glance, Natalie S.

    1993-01-01

    The prisoner's dilemma has long been considered the paradigm for studying the emergence of cooperation among selfish individuals. Because of its importance, it has been studied through computer experiments as well as in the laboratory and by analytical means. However, there are important differences between the way a system composed of many interacting elements is simulated by a digital machine and the manner in which it behaves when studied in real experiments. In some instances, these disparities can be marked enough to cast doubt on the implications of cellular-automata-type simulations for the study of cooperation in social systems. In particular, if such a simulation imposes space-time granularity, then its ability to describe the real world may be compromised. Indeed, we show that the results of digital simulations regarding territoriality and cooperation differ greatly when time is discrete as opposed to continuous.
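
    The discrete-versus-continuous contrast is easy to reproduce in miniature. The sketch below (all parameters are assumed, not the authors') runs a spatial prisoner's dilemma on a torus with imitate-the-best dynamics, and the synchronous flag switches between cellular-automaton-style updating of every site at once and one-site-at-a-time asynchronous updating.

```python
# Toy spatial prisoner's dilemma contrasting synchronous vs asynchronous
# updates. Lattice size, payoffs (weak dilemma, T > R > P = S) and the
# imitate-the-best rule are illustrative assumptions, not the paper's setup.
import random

N, T, R, P, S = 20, 1.4, 1.0, 0.0, 0.0

def payoffs(grid):
    def u(i, j):
        total = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            me = grid[i][j]
            other = grid[(i + di) % N][(j + dj) % N]
            total += R if me and other else S if me else T if other else P
        return total
    return [[u(i, j) for j in range(N)] for i in range(N)]

def imitate_best(grid, score, i, j):
    # Copy the strategy of the highest-scoring von Neumann neighbour (or self).
    best = (score[i][j], grid[i][j])
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = (i + di) % N, (j + dj) % N
        best = max(best, (score[ni][nj], grid[ni][nj]))
    return best[1]

def step(grid, synchronous=True):
    score = payoffs(grid)
    if synchronous:  # every site updates at once, as in a cellular automaton
        return [[imitate_best(grid, score, i, j) for j in range(N)]
                for i in range(N)]
    i, j = random.randrange(N), random.randrange(N)  # one random site at a time
    grid[i][j] = imitate_best(grid, score, i, j)
    return grid

random.seed(1)
grid = [[random.random() < 0.6 for _ in range(N)] for _ in range(N)]
for _ in range(50):
    grid = step(grid, synchronous=True)
print("cooperators after synchronous updates:", sum(map(sum, grid)))
```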

  16. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexity into simple formulations, thus largely reducing development effort. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  17. Evolutionary computation techniques a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted in order to compare its performance to other popular EC methods.

  18. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved effective in solving several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  19. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above-mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high-performance "green computing" systems. The recent evolutionary and general metaheuristic-based solutions ...

  20. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting-edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: covers all the aspect...

  1. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques as more than simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  2. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Full Text Available Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem that is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may easily be trapped at a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolution strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, the immune algorithm, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptation of fuzzy systems is detailed.
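
    For readers outside the field, the kind of genetic algorithm this paper introduces fits in a few lines. The sketch below solves the classic one-max problem (maximize the number of 1-bits); the population size, rates, and tournament size are illustrative defaults, not recommendations from the paper.

```python
# Minimal genetic algorithm (tournament selection, one-point crossover,
# bit-flip mutation) on the one-max toy problem. All settings are assumed.
import random

def genetic_algorithm(bits=30, pop_size=40, generations=60,
                      crossover_rate=0.9, mutation_rate=1 / 30):
    rng = random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    fitness = sum  # one-max: fitness is simply the count of 1-bits

    def tournament():
        return max(rng.sample(pop, 3), key=fitness)

    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            a, b = tournament(), tournament()
            if rng.random() < crossover_rate:      # one-point crossover
                cut = rng.randrange(1, bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                # bit-flip mutation: XOR each gene with a rare random flip
                children.append([g ^ (rng.random() < mutation_rate)
                                 for g in child])
        pop = children[:pop_size]
    return max(pop, key=fitness)

best = genetic_algorithm()
print(sum(best), "of", len(best), "bits set")
```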

  3. Computational and evolutionary aspects of language

    Science.gov (United States)

    Nowak, Martin A.; Komarova, Natalia L.; Niyogi, Partha

    2002-06-01

    Language is our legacy. It is the main evolutionary contribution of humans, and perhaps the most interesting trait that has emerged in the past 500 million years. Understanding how Darwinian evolution gives rise to human language requires the integration of formal language theory, learning theory and evolutionary dynamics. Formal language theory provides a mathematical description of language and grammar. Learning theory formalizes the task of language acquisition; it can be shown that no procedure can learn an unrestricted set of languages. Universal grammar specifies the restricted set of languages learnable by the human brain. Evolutionary dynamics can be formulated to describe the cultural evolution of language and the biological evolution of universal grammar.

  4. Conversion Rate Optimization through Evolutionary Computation

    OpenAIRE

    Miikkulainen, Risto; Iscoe, Neil; Shagrin, Aaron; Cordell, Ron; Nazari, Sam; Schoolland, Cory; Brundage, Myles; Epstein, Jonathan; Dean, Randy; Lamba, Gurmeet

    2017-01-01

    Conversion optimization means designing a web interface so that as many users as possible take a desired action on it, such as registering or purchasing. Such design is usually done by hand, testing one change at a time through A/B testing, or a limited number of combinations through multivariate testing, making it possible to evaluate only a small fraction of designs in a vast design space. This paper describes Sentient Ascend, an automatic conversion optimization system that uses evolutionary op...

  5. Advanced computing in electron microscopy

    CERN Document Server

    Kirkland, Earl J

    2010-01-01

    This book features numerical computation of electron microscopy images as well as multislice methods. High-resolution CTEM and STEM image interpretation are included in the text. This newly updated second edition brings the reader up to date on new developments in the field since the 1990s. It is the only book that specifically addresses computer simulation methods in electron microscopy.

  6. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  7. Evolutionary Computation Techniques for Predicting Atmospheric Corrosion

    Directory of Open Access Journals (Sweden)

    Amine Marref

    2013-01-01

    Full Text Available Corrosion occurs in many engineering structures such as bridges, pipelines, and refineries and leads to the destruction of materials in a gradual manner, thus shortening their lifespan. It is therefore crucial to assess the structural integrity of engineering structures which are approaching or exceeding their design lifespan in order to ensure their correct functioning, for example, carrying ability and safety. An understanding of corrosion and an ability to predict the corrosion rate of a material in a particular environment play a vital role in evaluating the residual life of the material. In this paper we investigate the use of genetic programming and genetic algorithms in the derivation of corrosion-rate expressions for steel and zinc. Genetic programming is used to automatically evolve corrosion-rate expressions, while a genetic algorithm is used to evolve the parameters of an already engineered corrosion-rate expression. We show that both evolutionary techniques yield corrosion-rate expressions with good accuracy.
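
    The second technique, evolving the parameters of an already engineered expression, can be sketched compactly. The snippet below fits a power-law corrosion-loss curve C = A·t^n, a form commonly used for atmospheric corrosion; the data points and the simple mutate-and-keep loop are assumptions of this sketch, not the paper's setup.

```python
# Hypothetical (1+1)-style evolutionary loop tuning the parameters (A, n)
# of a pre-chosen corrosion-rate expression C = A * t**n against synthetic
# exposure data. Neither the data nor the settings come from the paper.
import random

data = [(1, 21.0), (2, 29.5), (4, 41.2), (8, 57.8), (16, 80.9)]  # (years, loss)

def error(params):
    A, n = params
    return sum((A * t ** n - c) ** 2 for t, c in data)  # sum of squared errors

rng = random.Random(42)
best = [20.0, 0.5]                       # rough initial guess
for _ in range(5000):                    # mutate, keep the child if better
    trial = [p + rng.gauss(0, 0.05) for p in best]
    if error(trial) < error(best):
        best = trial
print("A = %.2f, n = %.3f, SSE = %.3f" % (best[0], best[1], error(best)))
```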

  8. Coevolution of Artificial Agents Using Evolutionary Computation in Bargaining Game

    Directory of Open Access Journals (Sweden)

    Sangwook Lee

    2015-01-01

    Full Text Available Analysis of the bargaining game using evolutionary computation is an essential issue in the field of game theory. This paper investigates the interaction and coevolutionary process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. In particular, the game performance with regard to payoff through the interaction and coevolution of agents is studied. We present three kinds of EC-based agents (EC-agents) participating in the bargaining game: genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE). The agents’ performance with regard to changing conditions is compared. From the simulation results it is found that the PSO-agent is superior to the other agents.

  9. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as...

  10. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for multivariate probability distribution random generation, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
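
    As a concrete instance of the first statistical application listed, the sketch below uses a bit-string genetic algorithm to select regression variables, scoring each subset by AIC. The simulated data (only x0 and x2 truly enter the response) and every GA setting are illustrative assumptions of this sketch.

```python
# Hypothetical GA for variable selection in linear regression, with AIC as
# the fitness to minimise. Data are simulated; settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=n)

def aic(mask):
    cols = np.flatnonzero(mask)
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    beta = np.linalg.lstsq(Xs, y, rcond=None)[0]
    rss = float((y - Xs @ beta) @ (y - Xs @ beta))
    return n * np.log(rss / n) + 2 * Xs.shape[1]

pop = rng.integers(0, 2, size=(30, p))          # random bit-string subsets
for _ in range(40):
    scores = np.array([aic(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:10]]      # truncation selection
    children = parents[rng.integers(0, 10, 30)].copy()
    flips = rng.random(children.shape) < 1 / p  # bit-flip mutation
    children[flips] ^= 1
    pop = children
best = pop[np.argmin([aic(ind) for ind in pop])]
print("selected variables:", np.flatnonzero(best))  # expect 0 and 2
```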

  11. Studying Collective Human Decision Making and Creativity with Evolutionary Computation.

    Science.gov (United States)

    Sayama, Hiroki; Dionne, Shelley D

    2015-01-01

    We report a summary of our interdisciplinary research project "Evolutionary Perspective on Collective Decision Making" that was conducted through close collaboration between computational, organizational, and social scientists at Binghamton University. We redefined collective human decision making and creativity as the evolution of ecologies of ideas, where populations of ideas evolve via continual applications of evolutionary operators such as reproduction, recombination, mutation, selection, and migration of ideas, each conducted by participating humans. Based on this evolutionary perspective, we generated hypotheses about collective human decision making, using agent-based computer simulations. The hypotheses were then tested through several experiments with real human subjects. Throughout this project, we utilized evolutionary computation (EC) in non-traditional ways: (1) as a theoretical framework for reinterpreting the dynamics of idea generation and selection, (2) as a computational simulation model of collective human decision-making processes, and (3) as a research tool for collecting high-resolution experimental data on actual collaborative design and decision making from human subjects. We believe our work demonstrates untapped potential of EC for interdisciplinary research involving human and social dynamics.

  12. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the...

  13. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    ...are well-suited for multi-objective task scheduling in heterogeneous environments. The two multi-objective evolutionary ... A task without any parent is called an entry task, and a task without any child is called an exit task. In the directed acyclic ...

  14. Introduction to electronic analogue computers

    CERN Document Server

    Wass, C A A

    1965-01-01

    Introduction to Electronic Analogue Computers, Second Revised Edition is based on the ideas and experience of a group of workers at the Royal Aircraft Establishment, Farnborough, Hants. This edition is almost entirely the work of Mr. K. C. Garner, of the College of Aeronautics, Cranfield. As various advances have been made in the technology involving electronic analogue computers, this book presents discussions on that progress, including some acquaintance with the capabilities of electronic circuits and equipment. This text also provides a mathematical background including simple differential...

  15. Computer electronics made simple computerbooks

    CERN Document Server

    Bourdillon, J F B

    1975-01-01

    Computer Electronics: Made Simple Computerbooks presents the basics of computer electronics and explains how a microprocessor works. Various types of PROMs, static RAMs, dynamic RAMs, floppy disks, and hard disks are considered, along with microprocessor support devices made by Intel, Motorola and Zilog. Bit-slice logic and some AMD bit-slice products are also described. Comprised of 14 chapters, this book begins with an introduction to the fundamentals of hardware design, followed by a discussion of the basic building blocks of hardware (NAND, NOR, AND, OR, NOT, XOR); tools and equipment that...

  16. Fast and Deterministic Computation of Fixation Probability in Evolutionary Graphs

    Science.gov (United States)

    2012-11-07

    Shakarian, Paulo (Dept. of Electrical Engineering and Computer Science, United States Military Academy, West Point, NY; email: paulo.shakarian@usma.edu); Roos, Patrick (Dept. ...)

  17. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT), an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algorithms...

  18. Multiple von Neumann computers: an evolutionary approach to functional emergence.

    Science.gov (United States)

    Suzuki, H

    1997-01-01

    A novel system composed of multiple von Neumann computers and an appropriate problem environment is proposed and simulated. Each computer has a memory to store the machine-instruction program, and when a program is executed, a series of machine codes in the memory is sequentially decoded, leading to register operations in the central processing unit (CPU). By means of these operations, the computer can not only handle its general-purpose registers but also read and write the environmental database. Simulation is driven by genetic algorithms (GAs) performed on the population of program memories. Mutation and crossover create program diversity in the memory, and selection facilitates the reproduction of appropriate programs. Through these evolutionary operations, advantageous combinations of machine codes are created and fixed in the population one by one, and the higher function, which enables the computer to calculate an appropriate number from the environment, finally emerges in the program memory. In the latter half of the article, the performance of GAs on this system is studied. Under different sets of parameters, the evolutionary speed, which is determined by the time until the domination of the final program, is examined and the conditions for faster evolution are clarified. At an intermediate mutation rate and at an intermediate population size, crossover helps create novel advantageous sets of machine codes and evidently accelerates optimization by GAs.
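
    A much-reduced rendition of this architecture: each individual is a linear program of machine-like codes for a tiny accumulator machine, and a GA over the program memories searches for a program computing a target function of the environment value. The instruction set, the target function 2x+1, and all GA settings below are invented for illustration; the paper's machine is far richer.

```python
# Toy GA over "program memories" for a one-accumulator machine. Everything
# here (opcodes, target, rates) is an assumption of this sketch.
import random

OPS = ["read", "inc", "dec", "double", "nop"]
rng = random.Random(3)

def run(program, env):
    # Decode the machine codes sequentially, as the simulated CPU would.
    acc = 0
    for op in program:
        if op == "read":
            acc = env
        elif op == "inc":
            acc += 1
        elif op == "dec":
            acc -= 1
        elif op == "double":
            acc *= 2
    return acc

def fitness(program):
    # Reward programs whose output matches the target 2*x + 1 on test inputs.
    return -sum(abs(run(program, x) - (2 * x + 1)) for x in range(6))

def breed(a, b):
    cut = rng.randrange(1, len(a))                 # one-point crossover
    child = a[:cut] + b[cut:]
    return [op if rng.random() > 0.1 else rng.choice(OPS)  # point mutation
            for op in child]

pop = [[rng.choice(OPS) for _ in range(8)] for _ in range(60)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == 0:                       # exact program found
        break
    elite = pop[:20]
    pop = elite + [breed(rng.choice(elite), rng.choice(elite))
                   for _ in range(40)]
print("generation", gen, "best:", pop[0], "fitness:", fitness(pop[0]))
```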

  19. Computer simulation of electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)

    1994-04-14

    Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).

  20. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Science in Taiwan, Fujian University of Technology in China and VSB-Technical University of Ostrava. ICGEC 2015 was held from 26-28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city engulfed by its rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning Shwedagon Pagoda is the centerpiece of Yangon city, which itself is famous for the best British colonial-era architecture. Of particular interest in many shops of Bogyoke Aung San Market,...

  1. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Waseda University in Japan, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world, whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route, running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the res...

  2. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang is the capital of Jiangxi Province in southeastern China, located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains and on the east by Poyang Lake, it is famous for its scenery, rich history and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in Southern China. The conference is intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  3. An Evolutionary Computational Approach to Humanoid Motion Planning

    Directory of Open Access Journals (Sweden)

    Dhammika Suresh Hettiarachchi

    2012-11-01

    Full Text Available The theme of our work centres on humanoid motion planning and balancing using evolutionary computational techniques. Evolutionary techniques, inspired by the Darwinian evolution of biological systems, make use of the concept of the iterative progress of a population of solutions with the aim of finding an optimally fit solution to a given problem. The problem we address here is that of asymmetric motion generation for humanoids, with the aim of automatically developing a series of motions to resemble certain predefined postures. An acceptable trajectory and stability are of the utmost concern in our work. In developing these motions, we utilize genetic algorithms coupled with heuristic knowledge of the problem domain. Unlike other types of robots, humanoids are complex in both construction and operation due to their myriad degrees of freedom and the difficulty of balancing on one or more limbs. The work presented in this paper includes the adopted methodology, experimental setup, results and an analysis of the outcome of a series of evolutionary experiments conducted for generating the said asymmetric motions.

  4. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications, with an emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, the bat algorithm, discrete cuckoo search, the firefly algorithm, particle swarm optimization, and harmony search, as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single-row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference for...

  5. Evolutionary Computation for Sensor Planning: The Task Distribution Plan

    Directory of Open Access Journals (Sweden)

    Dunn Enrique

    2003-01-01

    Full Text Available Autonomous sensor planning is a problem of interest to scientists in the fields of computer vision, robotics, and photogrammetry. In automated visual tasks, a sensing planner must make complex and critical decisions involving sensor placement and the sensing task specification. This paper addresses the problem of specifying sensing tasks for a multiple-manipulator workcell given an optimal sensor placement configuration. The problem is conceptually divided into two phases: activity assignment and tour planning. To solve such problems, an optimization methodology based on evolutionary computation is developed. Operational limitations originating from the workcell configuration are considered using specialized heuristics as well as a floating-point representation based on the random-keys approach. Experiments and performance results are presented.
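
    The random-keys idea mentioned here deserves a tiny illustration: a tour (a permutation) is encoded as a vector of floats and recovered by sorting, so any real-valued evolutionary operator keeps the encoding feasible. The task names and key values below are hypothetical.

```python
# Random-keys encoding in miniature: floats encode a permutation, and
# sorting decodes it. Task names are purely illustrative.
import random

tasks = ["weld", "inspect", "drill", "paint", "pack"]

def decode(keys):
    # The tour is the order of the tasks when sorted by their keys.
    return [t for _, t in sorted(zip(keys, tasks))]

random.seed(7)
keys = [random.random() for _ in tasks]
print("keys: ", [round(k, 2) for k in keys])
print("tour: ", decode(keys))

# Any real-valued operator stays feasible, e.g. a 50/50 blend crossover:
other = [random.random() for _ in tasks]
child = [0.5 * a + 0.5 * b for a, b in zip(keys, other)]
print("child:", decode(child))
```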

  6. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  7. Protein 3D Structure Computed from Evolutionary Sequence Variation

    Science.gov (United States)

    Sheridan, Robert; Hopf, Thomas A.; Pagnani, Andrea; Zecchina, Riccardo; Sander, Chris

    2011-01-01

    The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7–4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of protein...
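
    A deliberately simplified stand-in for the inference step: score alignment column pairs by mutual information and rank them as candidate contacts. EVfold itself infers global maximum-entropy couplings precisely because plain mutual information mixes direct and indirect correlations; MI is used below only to keep the sketch short, and the toy alignment is invented.

```python
# Simplified co-evolution scoring: mutual information between columns of a
# toy multiple sequence alignment, as a noisy proxy for the paper's global
# maximum-entropy couplings. The alignment is invented for illustration.
from collections import Counter
from itertools import combinations
from math import log

msa = ["ACDEAC", "ACDFAC", "GCDEAG", "GCHFAG", "ACHEAC"]

def mutual_information(i, j):
    n = len(msa)
    pi = Counter(s[i] for s in msa)
    pj = Counter(s[j] for s in msa)
    pij = Counter((s[i], s[j]) for s in msa)
    return sum((c / n) * log((c / n) / (pi[a] / n * pj[b] / n))
               for (a, b), c in pij.items())

L = len(msa[0])
scores = sorted(((mutual_information(i, j), i, j)
                 for i, j in combinations(range(L), 2)), reverse=True)
for mi, i, j in scores[:3]:                      # top-ranked candidate contacts
    print(f"columns {i}-{j}: MI = {mi:.3f}")
```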

  8. Optimization and Assessment of Wavelet Packet Decompositions with Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Schell Thomas

    2003-01-01

    Full Text Available In image compression, the wavelet transformation is a state-of-the-art component. Recently, wavelet packet decomposition has received considerable interest. A popular approach for wavelet packet decomposition is the near-best-basis algorithm using nonadditive cost functions. In contrast to additive cost functions, the wavelet packet decomposition of the near-best-basis algorithm is only suboptimal. We apply methods from the field of evolutionary computation (EC) to test the quality of the near-best-basis results. We observe a phenomenon: the results of the near-best-basis algorithm are inferior in terms of cost-function optimization but are superior in terms of rate/distortion performance compared to EC methods.

  9. Evolutionary Computing Based Area Integration PWM Technique for Multilevel Inverters

    Directory of Open Access Journals (Sweden)

    S. Jeevananthan

    2007-06-01

    Full Text Available The existing multilevel carrier-based pulse width modulation (PWM) strategies have no special provisions to offer quality output; besides, lower-order harmonics are introduced in the spectrum, especially at low switching frequencies. This paper proposes a novel multilevel PWM strategy to capture the advantages of low-frequency switching and reduced total harmonic distortion (THD). The basic idea of the proposed area integration PWM (AIPWM) method is that the area of the required sinusoidal (fundamental) output and the total area of the output pulses are made equal. An attempt is made to incorporate two soft computing techniques, namely evolutionary programming (EP) and the genetic algorithm (GA), in the generation and placement of switching pulses. Experimental results from a prototype seven-level cascaded inverter operated with the novel PWM strategies are presented.
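
    The area-equalisation idea can be shown numerically. Under the simplest possible assumptions (equal windows per half-cycle, one centred pulse per window, unit pulse height), each pulse width is chosen so its area matches the integral of the reference sine over that window; none of the constants below come from the paper.

```python
# Equal-area pulse placement for one half-cycle of a sine reference.
# Window count and pulse height are illustrative assumptions.
import math

m, Vdc = 10, 1.0                     # windows per half-cycle, pulse height
width = math.pi / m
for k in range(m):
    a, b = k * width, (k + 1) * width
    sine_area = math.cos(a) - math.cos(b)   # integral of sin(t) over [a, b]
    pulse_width = sine_area / Vdc           # pulse area == sine area
    centre = (a + b) / 2
    print(f"window {k}: pulse width {pulse_width:.3f} rad, centred at {centre:.2f} rad")
```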

  10. A Multi Agent System for Flow-Based Intrusion Detection Using Reputation and Evolutionary Computation

    Science.gov (United States)

    2011-03-01

    Surveying the modern digital expanse of the computer network for entities nefarious and profane is the work of an Intrusion...

  11. Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation

    Science.gov (United States)

    Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred

    2008-01-01

    Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems) is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is discussed.

  12. Solving multi-objective water management problems using evolutionary computation.

    Science.gov (United States)

    Lewis, A; Randall, M

    2017-12-15

    Water as a resource is becoming increasingly valuable given the changes in global climate. In an agricultural sense, the role of water is vital to ensuring food security. Therefore the management of water has become a subject of increasing attention, and the development of effective tools to support participative decision-making in water management will be a valuable contribution. In this paper, evolutionary computation techniques and Pareto optimisation are incorporated in a model-based system for water management. An illustrative test case modelling optimal crop selection across dry, average and wet years, based on data from the Murrumbidgee Irrigation Area in Australia, is presented. It is shown that sets of trade-off solutions that provide large net revenues or minimise environmental flow deficits can be produced rapidly, easily and automatically. The system is capable of providing detailed information on optimal solutions to achieve desired outcomes, responding to a variety of factors including climate conditions and economics. Copyright © 2017 Elsevier Ltd. All rights reserved.
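
    The Pareto machinery at the core of such a system fits in a few lines: keep exactly those plans that no other plan beats on both objectives at once. The plan names and objective values below (net revenue to maximise, environmental-flow deficit to minimise) are invented for illustration.

```python
# Non-dominated (Pareto) filter over invented crop-plan objective values:
# (net revenue: higher is better, flow deficit: lower is better).
plans = {
    "mostly rice":  (9.1, 6.0),
    "mixed":        (7.4, 3.1),
    "mostly wheat": (5.2, 1.2),
    "fallow-heavy": (2.0, 1.0),
    "rice+fallow":  (6.8, 3.5),
}

def dominates(a, b):
    # a dominates b if it is no worse on both objectives and better on one.
    (ra, da), (rb, db) = plans[a], plans[b]
    return ra >= rb and da <= db and (ra > rb or da < db)

front = [p for p in plans
         if not any(dominates(q, p) for q in plans if q != p)]
print("trade-off (Pareto) set:", front)  # "rice+fallow" is dominated by "mixed"
```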

  13. Application of evolutionary computation on ensemble forecast of quantitative precipitation

    Science.gov (United States)

    Dufek, Amanda S.; Augusto, Douglas A.; Dias, Pedro L. S.; Barbosa, Helio J. C.

    2017-09-01

    An evolutionary computation algorithm known as genetic programming (GP) has been explored as an alternative tool for improving the ensemble forecast of 24-h accumulated precipitation. Three GP versions and six ensemble languages were applied to several real-world datasets over southern, southeastern and central Brazil during the rainy period from October to February of 2008-2013. According to the results, the GP algorithms performed better than two traditional statistical techniques, with errors 27-57% lower than the simple ensemble mean and the MASTER super model ensemble system. In addition, the results revealed that the GP algorithms outperformed the best individual forecasts, reaching an improvement of 34-42%. On the other hand, the GP algorithms had similar performance with respect to each other and to Bayesian model averaging, but the former are far more versatile techniques. Although the results for the six ensemble languages are almost indistinguishable, our most complex linear language turned out to be the best overall proposal. Moreover, some meteorological attributes, including the weather patterns over Brazil, seem to play an important role in the prediction of daily rainfall amount.

  14. Computer-Aided Design for Electron Microscopy

    Czech Academy of Sciences Publication Activity Database

    Lencová, Bohumila

    2004-01-01

    Vol. 6, No. 1 (2004), pp. 51-53. ISSN 1439-4243. Institutional research plan: CEZ:AV0Z2065902. Keywords: magnetic electron lenses * accuracy of computation * computer-aided design. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering

  15. An Analog Computer for Electronic Engineering Education

    Science.gov (United States)

    Fitch, A. L.; Iu, H. H. C.; Lu, D. D. C.

    2011-01-01

    This paper describes a compact analog computer and proposes its use in electronic engineering teaching laboratories to develop student understanding of applications in analog electronics, electronic components, engineering mathematics, control engineering, safe laboratory and workshop practices, circuit construction, testing, and maintenance. The…

  16. Speeding up ecological and evolutionary computations in R; essentials of high performance computing for biologists.

    Science.gov (United States)

    Visser, Marco D; McMahon, Sean M; Merow, Cory; Dixon, Philip M; Record, Sydne; Jongejans, Eelke

    2015-03-01

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1-S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research.
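
    The paper's examples and the aprof package are written in R; as a language-neutral illustration of the same two-step workflow the review emphasizes (profile to find the bottleneck, then parallelise it), here is an analogous sketch in Python. All names in it are made up for the example.

        # Python analogue of the review's workflow: profile first, then
        # parallelise the hot loop across processes.

        import cProfile
        from multiprocessing import Pool

        def slow_kernel(x):
            # Stand-in for an expensive per-element computation.
            return sum(i * i for i in range(x))

        def serial(xs):
            return [slow_kernel(x) for x in xs]

        def parallel(xs, workers=4):
            with Pool(workers) as pool:
                return pool.map(slow_kernel, xs)

        if __name__ == "__main__":
            data = [20000] * 32
            cProfile.run("serial(data)")  # step 1: confirm slow_kernel dominates runtime
            results = parallel(data)      # step 2: spread the bottleneck over processes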

  18. Computer Applications: Using Electronic Spreadsheets.

    Science.gov (United States)

    Riley, Connee; And Others

    This instructional unit is intended to assist teachers in helping students learn to use electronic spreadsheets. The 11 learning activities included, all of which are designed for use in conjunction with Multiplan Spreadsheet Software, are arranged in order of increasing difficulty. An effort has been made to include problems applicable to each of…

  19. Combined electronic structure and evolutionary search approach to materials design

    DEFF Research Database (Denmark)

    Johannesson, Gisli Holmar; Bligaard, Thomas; Ruban, Andrei

    2002-01-01

    We show that density functional theory calculations have reached an accuracy and speed making it possible to use them in conjunction with an evolutionary algorithm to search for materials with specific properties. The approach is illustrated by finding the most stable four-component alloys out of...

  20. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    Science.gov (United States)

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost by averaging heterogeneous voxels; therefore, functional relationships between an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair, simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in the strength

  2. Computational Nanotechnology Molecular Electronics, Materials and Machines

    Science.gov (United States)

    Srivastava, Deepak; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    This presentation covers research being performed on computational nanotechnology, carbon nanotubes and fullerenes at the NASA Ames Research Center. Topics covered include: nanomechanics of nanomaterials, nanotubes and composite materials, molecular electronics with nanotube junctions, kinky chemistry, and nanotechnology for solid-state quantum computers using fullerenes.

  3. Investigation on Evolutionary Computation Techniques of a Nonlinear System

    Directory of Open Access Journals (Sweden)

    Tran Trong Dao

    2011-01-01

    The main aim of this work is to show that a tool as powerful as evolutionary algorithms (EAs) can in practice be used for the simulation and optimization of a nonlinear system. A nonlinear mathematical model is required to describe the dynamic behaviour of a batch process, which justifies the use of EAs to deal with this process. Four algorithms from the field of artificial intelligence are used in this investigation: differential evolution (DE), the self-organizing migrating algorithm (SOMA), the genetic algorithm (GA), and simulated annealing (SA). The results show that EAs can be used successfully in the optimization of this process.
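
    As an illustration of one of the four algorithms compared above, here is a minimal DE/rand/1/bin loop on a stand-in objective. The paper's batch-process model and parameter settings are not reproduced; this is only a sketch of the algorithm's structure.

        # Minimal DE/rand/1/bin sketch: mutate with a scaled difference of two
        # random population members, crossover, then greedy replacement.

        import random

        def de(objective, bounds, pop_size=20, F=0.8, CR=0.9, generations=100):
            dim = len(bounds)
            pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
            fit = [objective(x) for x in pop]
            for _ in range(generations):
                for i in range(pop_size):
                    a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                    j_rand = random.randrange(dim)   # guarantees one mutated gene
                    trial = []
                    for j in range(dim):
                        if random.random() < CR or j == j_rand:
                            v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                        else:
                            v = pop[i][j]
                        lo, hi = bounds[j]
                        trial.append(min(max(v, lo), hi))  # clamp to bounds
                    f = objective(trial)
                    if f <= fit[i]:                  # greedy selection
                        pop[i], fit[i] = trial, f
            best = min(range(pop_size), key=lambda i: fit[i])
            return pop[best], fit[best]

        sphere = lambda x: sum(v * v for v in x)     # placeholder objective
        print(de(sphere, [(-5.0, 5.0)] * 3))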

  4. Progress in Computational Electron-Molecule Collisions

    Science.gov (United States)

    Rescigno, T. N.

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  5. Using Evolutionary Computation to Solve the Economic Load Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Samir SAYAH

    2008-06-01

    This paper reports on an evolutionary-algorithm-based method for solving the economic load dispatch (ELD) problem. The objective is to minimize a nonlinear function, the total fuel cost of the thermal generating units, subject to the usual constraints. The IEEE 30-bus test system was used for testing and validation purposes. The results obtained demonstrate the effectiveness of the proposed method for solving the economic load dispatch problem.
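
    A common way to cast ELD for an evolutionary solver is a penalised fitness: quadratic fuel-cost curves plus penalties for violating the power balance and generator limits. The sketch below uses illustrative coefficients and demand, not the paper's IEEE 30-bus data.

        # Penalised ELD fitness sketch: cost_i(P) = a_i + b_i*P + c_i*P^2 ($/h),
        # subject to sum(P) = demand and per-unit limits. Coefficients are
        # illustrative only.

        UNITS = [  # (a, b, c, P_min, P_max)
            (100.0, 2.00, 0.0080, 50.0, 200.0),
            (120.0, 1.80, 0.0095, 40.0, 180.0),
            (80.0,  2.10, 0.0070, 30.0, 150.0),
        ]
        DEMAND = 300.0    # MW to be met
        PENALTY = 1000.0  # weight on constraint violation

        def fitness(dispatch):
            cost = sum(a + b * p + c * p * p
                       for (a, b, c, _, _), p in zip(UNITS, dispatch))
            balance_violation = abs(sum(dispatch) - DEMAND)
            bound_violation = sum(max(lo - p, 0.0) + max(p - hi, 0.0)
                                  for (_, _, _, lo, hi), p in zip(UNITS, dispatch))
            return cost + PENALTY * (balance_violation + bound_violation)

        print(fitness([120.0, 100.0, 80.0]))  # feasible point: penalty term is zero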

  6. [The history of development of evolutionary methods in St. Petersburg school of computer simulation in biology].

    Science.gov (United States)

    Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F

    2010-01-01

    The history of the rise and development of evolutionary methods in the St. Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplars for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases the simulated evolutionary process was used to solve an inverse problem, i.e., to estimate uncertain life-history parameters of a population. Evolutionary computation is one more aspect of the application of this approach in a great many fields. The problems and prospects of ecological and evolutionary modelling in general are discussed.

  7. Practical Applications of Evolutionary Computation to Financial Engineering Robust Techniques for Forecasting, Trading and Hedging

    CERN Document Server

    Iba, Hitoshi

    2012-01-01

    “Practical Applications of Evolutionary Computation to Financial Engineering” presents state-of-the-art techniques in financial engineering using recent results in machine learning and evolutionary computation. This book bridges the gap between academics in computer science and traders, and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge of either field. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field, describing each problem in detail and focusing on solutions based on evolutio...

  8. EVOLVE : a Bridge between Probability, Set Oriented Numerics and Evolutionary Computation

    CERN Document Server

    Tantar, Alexandru-Adrian; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; Coello, Carlos; Schütze, Oliver; EVOLVE 2011

    2013-01-01

    The aim of this book is to provide strong theoretical support for understanding and analyzing the behavior of evolutionary algorithms, as well as to create a bridge between probability, set-oriented numerics and evolutionary computation. The volume comprises a collection of contributions that were presented at the EVOLVE 2011 international workshop, held in Luxembourg, May 25-27, 2011, coming from invited speakers and also from selected regular submissions. The aim of EVOLVE is to unify the perspectives offered by probability, set-oriented numerics and evolutionary computation. EVOLVE focuses on challenging aspects that arise at the passage from theory to new paradigms and practice, elaborating on the foundations of evolutionary algorithms and theory-inspired methods merged with cutting-edge techniques that ensure performance guarantee factors. EVOLVE is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. The chapters enclose challenging theoret...

  9. Simplified Drift Analysis for Proving Lower Bounds in Evolutionary Computation

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2011-01-01

    Drift analysis is a powerful tool used to bound the optimization time of evolutionary algorithms (EAs). Various previous works apply a drift theorem going back to Hajek in order to show exponential lower bounds on the optimization time of EAs. However, this drift theorem is tedious to read...... involving the complicated theorem can be redone in a much simpler and clearer way. In some cases even improved results may be achieved. Therefore, the simplified theorem is also a didactical contribution to the runtime analysis of EAs....
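
    For orientation, drift theorems connect the expected per-step progress of a stochastic process to its expected hitting time. A standard additive-drift statement (a simpler relative of the Hajek-style theorem discussed above, given here under the usual boundedness and integrability assumptions; the exact conditions treated in the paper may differ) reads:

        Let $(X_t)_{t \ge 0}$ be a non-negative stochastic process and let
        $T = \min\{t \ge 0 : X_t = 0\}$ be its hitting time of $0$. If there is a
        $\delta > 0$ such that
        \[ \mathbb{E}[X_t - X_{t+1} \mid X_t = x] \;\ge\; \delta \quad \text{for all } x > 0, \]
        then $\mathbb{E}[T \mid X_0] \le X_0/\delta$. Conversely, if
        \[ \mathbb{E}[X_t - X_{t+1} \mid X_t = x] \;\le\; \delta \quad \text{for all } x > 0, \]
        then $\mathbb{E}[T \mid X_0] \ge X_0/\delta$, which is the direction used for
        lower bounds on the optimization time.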

  10. Toward an alternative evolutionary theory of religion: looking past computational evolutionary psychology to a wider field of possibilities.

    Science.gov (United States)

    Barrett, Nathaniel F

    2010-01-01

    Cognitive science of the last half-century has been dominated by the computational theory of mind and its picture of thought as information processing. Taking this picture for granted, the most prominent evolutionary theories of religion of the last fifteen years have sought to understand human religiosity as the product or by-product of universal information processing mechanisms that were adaptive in our ancestral environment. The rigidity of such explanations is at odds with the highly context-sensitive nature of historical studies of religion, and thus contributes to the apparent tug-of-war between scientific and humanistic perspectives. This essay argues that this antagonism stems in part from a deep flaw of computational theory, namely its notion of information as pre-given and context-free. In contrast, non-computational theories that picture mind as an adaptive, interactive process in which information is jointly constructed by organism and environment offer an alternative approach to an evolutionary understanding of human religiosity, one that is compatible with historical studies and amenable to a wide range of inquiries, including some limited kinds of theological inquiry.

  11. Single electron tunneling based arithmetic computation

    NARCIS (Netherlands)

    Lageweg, C.R.

    2004-01-01

    In this dissertation we investigate the implementation of computer arithmetic operations with Single Electron Tunneling (SET) technology based circuits. In our research we focus on the effective utilization of the SET technology's specific characteristic, i.e., the ability to control the transport of

  12. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    The task scheduling problem in heterogeneous distributed computing systems is a multiobjective optimization problem (MOP). In heterogeneous distributed computing systems (HDCS), there is a possibility of processor and network failures and this affects the applications running on the HDCS. To reduce the impact of ...

  13. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions hold that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; accordingly, the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations over each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  15. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Within the mind there are myriad ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a "virtual" student to solve a Piagetian task. Using the STAC-M, a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks after cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.

  16. Neuro-Inspired Computing with Stochastic Electronics

    KAUST Repository

    Naous, Rawan

    2016-01-06

    The extensive scaling and integration within electronic systems have set the stage for what is referred to as stochastic electronics. Individual components increasingly deviate from reliable behavior and produce nondeterministic outputs. This stochastic operation closely mimics the biological medium within the brain. Hence, building on the inherent variability, particularly within novel non-volatile memory technologies, paves the way for unconventional neuromorphic designs. Neuro-inspired networks with brain-like structures of neurons and synapses allow for computation and levels of learning for diverse recognition tasks and applications.

  17. An evolutionary computational approach for the dynamic Stackelberg competition problems

    Directory of Open Access Journals (Sweden)

    Lorena Arboleda-Castro

    2016-06-01

    Stackelberg competition models are an important family of economic decision problems from game theory, in which the main goal is to find optimal strategies between two competitors taking into account their hierarchical relationship. Although these models have been widely studied in the past, very few works deal with uncertainty scenarios, especially those that vary over time. In this regard, the present research studies this topic and proposes a computational method for efficiently solving dynamic Stackelberg competition models. The computational experiments suggest that the proposed approach is effective for problems of this nature.

  18. Intelligent Financial Portfolio Composition based on Evolutionary Computation Strategies

    CERN Document Server

    Gorgulho, Antonio; Horta, Nuno C G

    2013-01-01

    The management of financial portfolios or funds constitutes a well-known problem in financial markets, one that normally requires rigorous analysis in order to select the most profitable assets. The subject is becoming popular among computer scientists, who try to adapt known intelligent computation techniques to the market's domain. This book proposes a system based on genetic algorithms which aims to manage a financial portfolio by using technical analysis indicators. The results are promising, since the approach clearly outperforms the remaining approaches during the recent market crash.

  19. Reducing the Computational Cost in Multi-objective Evolutionary Algorithms by Filtering Worthless Individuals

    OpenAIRE

    Pourbahman, Zahra; Hamzeh, Ali

    2014-01-01

    The large number of exact fitness function evaluations makes evolutionary algorithms computationally costly. In some real-world problems, reducing the number of these evaluations is much more valuable, even at the price of increased computational complexity and longer runtimes. To fulfill this target, we introduce an effective factor, in place of the factor applied in Adaptive Fuzzy Fitness Granulation with Non-dominated Sorting Genetic Algorithm-II, to filter out worthless individuals more precisely. Our ...

  20. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment

    Science.gov (United States)

    2014-01-01

    Background: To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse-engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. Results: This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Conclusions: Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel
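
    The core of such a parallelisation is that fitness evaluations are independent and can be farmed out as "map" tasks. Below is a minimal local stand-in for the Hadoop MapReduce setup described above, using Python processes instead of cluster nodes; the placeholder objective is not the gene-network model.

        # Stand-in for the "map" step: independent fitness evaluations are
        # distributed to workers (here local processes; in the paper, MapReduce
        # tasks), while selection plays the role of the "reduce" step.

        from multiprocessing import Pool

        def fitness(candidate):
            # Placeholder: in gene-network inference this would simulate the
            # network parameterized by `candidate` and score it against
            # measured expression profiles.
            return sum((x - 0.5) ** 2 for x in candidate)

        def evaluate_population(population, workers=4):
            with Pool(workers) as pool:
                return pool.map(fitness, population)

        if __name__ == "__main__":
            import random
            population = [[random.random() for _ in range(10)] for _ in range(64)]
            scores = evaluate_population(population)
            print(min(scores))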

  1. Unscented Sampling Techniques For Evolutionary Computation With Applications To Astrodynamic Optimization

    Science.gov (United States)

    2016-09-01

    constrained optimization problems. The second goal is to improve computation times and efficiencies associated with evolutionary algorithms. The last goal is ... to both genetic algorithms and evolution strategies to achieve these goals. The results of this research offer a promising new set of modified ... Keywords: evolutionary computation, parallel processing, unscented sampling.

  2. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    Science.gov (United States)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.

  3. Investigating the Multi-memetic Mind Evolutionary Computation Algorithm Efficiency

    Directory of Open Access Journals (Sweden)

    M. K. Sakharov

    2017-01-01

    In solving practically significant global optimization problems, the objective function is often of high dimensionality and computational complexity, and of nontrivial landscape as well. Studies show that a single optimization method is often not enough to solve such problems efficiently; hybridization of several optimization methods is necessary. One of the most promising contemporary trends in this field is memetic algorithms (MAs), which can be viewed as a combination of population-based search for a global optimum and procedures for local refinement of solutions (memes), working in synergy. Since there are relatively few theoretical studies concerning which MA configuration is advisable for black-box optimization problems, many researchers turn to adaptive algorithms, which select the most efficient local optimization methods for particular domains of the search space. This article proposes a multi-memetic modification of the simple SMEC algorithm using random hyper-heuristics. It presents the software implementation and the memes used (the Nelder-Mead method, the random hyper-sphere surface search method, and the Hooke-Jeeves method), and reports a comparative study of the efficiency of the proposed algorithm depending on the set and number of memes. The study was carried out using the Rastrigin, Rosenbrock, and Zakharov multidimensional test functions; computational experiments covered all possible combinations of memes and each meme individually. According to the results of the study, conducted with the multi-start method, combinations of memes comprising the Hooke-Jeeves method were successful. These results reflect the rapid convergence of that method to a local optimum in comparison with the other memes, since all methods perform at most a fixed number of iterations. Analysis of the average number of iterations shows that using the most efficient sets of memes allows us to find the optimal
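
    Here is a minimal multi-memetic loop in the spirit of the scheme above: a population search whose leaders are refined by a meme chosen at random by a simple hyper-heuristic. It assumes SciPy, whose built-in Nelder-Mead and Powell methods stand in for the memes; the Hooke-Jeeves and random hyper-sphere-surface memes used in the paper are not reproduced, so this sketches the scheme rather than SMEC itself.

        # Multi-memetic sketch: global population step + randomly chosen local
        # meme applied to the current leaders.

        import random
        import numpy as np
        from scipy.optimize import minimize

        def rastrigin(x):
            return 10 * len(x) + sum(v * v - 10 * np.cos(2 * np.pi * v) for v in x)

        MEMES = ["Nelder-Mead", "Powell"]  # stand-ins for the paper's memes

        def memetic_search(objective, dim, pop_size=20, generations=30):
            pop = [np.random.uniform(-5.12, 5.12, dim) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=objective)
                meme = random.choice(MEMES)            # random hyper-heuristic
                for i in range(3):                     # refine the three leaders
                    res = minimize(objective, pop[i], method=meme,
                                   options={"maxiter": 50})
                    pop[i] = res.x
                # crude global step: resample the worst half around the leader
                for i in range(pop_size // 2, pop_size):
                    pop[i] = pop[0] + np.random.normal(0.0, 0.5, dim)
            return min(pop, key=objective)

        best = memetic_search(rastrigin, dim=4)
        print(best, rastrigin(best))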

  4. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  5. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    This manuscript illustrates controller design for a doubly fed induction generator (DFIG) based variable-speed wind turbine using a bio-inspired scheme. The methodology exploits two proficient swarm-intelligence-based evolutionary soft computational procedures: the particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller for the small-damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are discussed in this paper. The controller designs for the DFIG-based WECS using PSO and BFO are described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active and reactive power, and DC-link voltage improve slightly with the evolutionary soft computational procedures. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
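
    For reference, below is a minimal PSO loop of the kind used for such gain tuning. The two decision variables stand in for hypothetical PI gains (Kp, Ki), and the objective is a toy surrogate rather than a DFIG model.

        # Minimal PSO sketch: inertia + cognitive + social velocity update,
        # tuning two stand-in controller gains.

        import random

        def objective(g):
            kp, ki = g
            return (kp - 1.5) ** 2 + (ki - 0.3) ** 2  # toy surrogate for a cost like ITAE

        def pso(obj, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            dim = len(bounds)
            pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]
            pbest_f = [obj(p) for p in pos]
            g = min(range(n_particles), key=lambda i: pbest_f[i])
            gbest, gbest_f = pbest[g][:], pbest_f[g]
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        lo, hi = bounds[d]
                        pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                    f = obj(pos[i])
                    if f < pbest_f[i]:
                        pbest[i], pbest_f[i] = pos[i][:], f
                        if f < gbest_f:
                            gbest, gbest_f = pos[i][:], f
            return gbest, gbest_f

        print(pso(objective, [(0.0, 10.0), (0.0, 2.0)]))  # -> near (1.5, 0.3)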

  6. Power optimization of wind turbines with data mining and evolutionary computation

    Energy Technology Data Exchange (ETDEWEB)

    Kusiak, Andrew; Zheng, Haiyang; Song, Zhe [Department of Mechanical and Industrial Engineering, The University of Iowa, 3131 Seamans Center, Iowa City, IA 52242-1527 (United States)

    2010-03-15

    A data-driven approach for maximization of the power produced by wind turbines is presented. The power optimization objective is accomplished by computing optimal control settings of wind turbines using data mining and evolutionary strategy algorithms. Data mining algorithms identify a functional mapping between the power output and controllable and non-controllable variables of a wind turbine. An evolutionary strategy algorithm is applied to determine control settings maximizing the power output of a turbine based on the identified model. Computational studies have demonstrated meaningful opportunities to improve the turbine power output by optimizing blade pitch and yaw angle. It is shown that the pitch angle is an important variable in maximizing energy captured from the wind. Power output can be increased by optimization of the pitch angle. The concepts proposed in this paper are illustrated with industrial wind farm data. (author)

  7. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES-2015) held at Velammal Engineering College (VEC), Chennai, India during 22-23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the fields of communication, computing and power technologies.

  8. Discovering Unique, Low-Energy Transition States Using Evolutionary Molecular Memetic Computing

    DEFF Research Database (Denmark)

    Ellabaan, Mostafa M Hashim; Ong, Y.S.; Handoko, S.D.

    2013-01-01

    be accurately identified through the transition states. Transition states describe the paths of molecular systems in transiting across stable states. In this article, we present the discovery of unique, low-energy transition states and showcase the efficacy of their identification using the memetic computing...... paradigm under a Molecular Memetic Computing (MMC) framework. In essence, the MMC is equipped with the tree-based representation of non-cyclic molecules and the covalent-bond-driven evolutionary operators, in addition to the typical backbone of memetic algorithms. Herein, we employ genetic algorithm...

  9. Evolutionary computation for the design of a stochastic switch for synthetic genetic circuits.

    Science.gov (United States)

    Hallinan, Jennifer S; Misirli, Goksel; Wipat, Anil

    2010-01-01

    Biological systems are inherently stochastic, a fact which is often ignored when simulating genetic circuits. Synthetic biology aims to design genetic circuits de novo, and cannot therefore afford to ignore the effects of stochastic behavior. Since computational design tools will be essential for large-scale synthetic biology, it is important to develop an understanding of the role of stochasticity in molecular biology and to incorporate this understanding into computational tools for genetic circuit design. We report on an investigation into the combination of evolutionary algorithms and stochastic simulation for genetic circuit design, aiming to design regulatory systems based on the Bacillus subtilis sin operon.

  10. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

    The volume is a collection of high-quality peer-reviewed research papers presented in the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016) held at SRM University, Chennai, Tamilnadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements and explore the future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the field of communication, computing and power technologies.

  11. Optimizing the configuration of magnetic confinement devices with evolutionary algorithms and grid computing

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Iglesias, A.; Vega-Rodriguez, M. A.; Castejon Mangana, C.; Rubio del Solar, M.; Cardenas Montes, M.

    2007-07-01

    In this paper we present a proposal for enhancing the configuration of a stellarator device in order to improve the performance of these magnetic fusion devices. To achieve this goal, we propose the use of grid computing with genetic and evolutionary algorithms. Grid computing allows many experiments to be performed in parallel. Genetic algorithms avoid having to explore the whole solution space, because the number of parameters involved in the configuration of these devices, and the number of combinations of their values, make it impossible to explore all the possibilities. (Author)

  12. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas such as artificial intelligence, computability, heuristics and pattern formation.  Nowadays at the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in areas of artificial intelligence, evolutionary computation and metaheuristics, and all these areas can be traced back to Turing's pioneer work. Topics include Turing test, Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, nature-inspired algorithms such as bat algorithm and cuckoo sear...

  13. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    Science.gov (United States)

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server is available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  15. A great disappearing act: the electronic analogue computer

    OpenAIRE

    Bissell, C. C.

    2004-01-01

    One historian of technology has called the analogue computer 'one of the great disappearing acts of the Twentieth Century'. This paper will look briefly at the origins, development and decline of the electronic analogue computer ..

  16. Management and Valorization of Electronic and Computer Wastes in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Because of their undeveloped condition, African countries receive tonnes of second-hand computers and electronic equipment from more advanced countries. More and more voices are calling for legislative and regulatory provisions to deal with the electronic and computer waste thus generated. So far, little is known about ...

  17. Chief Editor's column/The First Electronic Computer

    Indian Academy of Sciences (India)

    1946-02-14

    1996 is the fiftieth anniversary of the birth of the first electronic computer. On February 14, 1946 the Electronic Numerical Integrator and Computer (ENIAC) was formally switched on at the Moore School of Electrical Engineering at the University of Pennsylvania, U.S.A. ENIAC, designed by a team headed by John W ...

  18. Multi-memetic Mind Evolutionary Computation Algorithm for Loosely Coupled Systems of Desktop Computers

    Directory of Open Access Journals (Sweden)

    M. K. Sakharov

    2015-01-01

    This paper deals with the development and software implementation of a hybrid multi-memetic algorithm for distributed computing systems. The main algorithm is based on a modification of the MEC algorithm proposed by the authors. The multi-memetic algorithm utilizes three different local optimization methods. The software implementation was developed using MPI for Python and tested on a grid network made of twenty desktop computers. The performance of the proposed algorithm and its software implementation was investigated using multi-dimensional multi-modal benchmark functions from CEC'14.

  19. Hybrid evolutionary computing model for mobile agents of wireless Internet multimedia

    Science.gov (United States)

    Hortos, William S.

    2001-03-01

    The ecosystem is used as an evolutionary paradigm of natural laws for distributed information retrieval via mobile agents, allowing computational load to be shifted to server nodes of wireless networks while reducing traffic on communication links. Based on the Food Web model, a set of computational rules of natural balance forms the outer stage that controls the evolution of mobile agents providing multimedia services with a wireless Internet protocol (WIP). The evolutionary model shows how mobile agents should behave with the WIP; in particular, how mobile agents can cooperate, compete and learn from each other, based on an underlying competition for radio network resources to establish the wireless connections that support the quality of service (QoS) of user requests. Mobile agents are also allowed to clone themselves, propagate and communicate with other agents. A two-layer model is proposed for agent evolution: the outer layer is based on the law of natural balancing, while the inner layer is based on a discrete version of a Kohonen self-organizing feature map (SOFM) that distributes network resources to meet QoS requirements. The former is embedded in the higher OSI layers of the WIP, while the latter is used in the resource management procedures of Layers 2 and 3 of the protocol. Algorithms for the distributed computation of mobile agent evolutionary behavior are developed by adding a learning state to the agent evolution state diagram. When an agent is in an indeterminate state, it can communicate with other agents, and computing models can be replicated from other agents. The agent then transitions to the mutating state to wait for a new information-retrieval goal. When a wireless terminal or station lacks a network resource, an agent in the suspending state can change its policy to submit to the environment before it transitions to the searching state. The agents record agent state information in an external database as they learn. In the cloning process, two
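
    The inner layer above is a Kohonen self-organizing feature map. The sketch below shows the core SOFM update rule (find the best-matching unit, then pull its neighbourhood toward the input) on toy two-dimensional data; the mapping of radio resources to QoS requests described in the paper is not modelled.

        # Core Kohonen SOFM update rule on toy 2-D data.

        import numpy as np

        def train_sofm(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0):
            rows, cols = grid
            weights = np.random.rand(rows, cols, data.shape[1])
            coords = np.array([[(r, c) for c in range(cols)] for r in range(rows)], float)
            for epoch in range(epochs):
                lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
                sigma = sigma0 * (1 - epoch / epochs) + 0.5  # shrinking neighbourhood
                for x in data:
                    # best-matching unit
                    d = np.linalg.norm(weights - x, axis=2)
                    bmu = np.unravel_index(np.argmin(d), d.shape)
                    # neighbourhood-weighted pull toward the input
                    dist2 = np.sum((coords - np.array(bmu, float)) ** 2, axis=2)
                    h = np.exp(-dist2 / (2 * sigma ** 2))
                    weights += lr * h[..., None] * (x - weights)
            return weights

        data = np.random.rand(200, 2)
        w = train_sofm(data)
        print(w.shape)  # (5, 5, 2): a topology-preserving map of the inputs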

  20. Evolutionary Game-Theoretic Solution for Virtual Routers with Padding Misbehavior in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xia-an Bi

    2015-01-01

    With the development of cloud computing and virtualization, a physical router can be multiplexed as a large number of virtual routers. TCP-based interactive applications have an incentive to improve their performance by padding "junk packets" into the network among real communication packets. This padding misbehavior upgrades short TCP flows from "mice" to "elephants" and consequently leads to network congestion and breakdown. This paper presents a detailed solution and analysis describing the normal behavior and padding misbehavior of virtual routers. In particular, a system model for analyzing the behavior of virtual routers is based on an evolutionary game model, and, by analyzing the stability of the equilibrium points, the stable point is identified as the solution to the problem. The evolutionary path of network applications exhibiting normal behavior and padding misbehavior is illustrated in the corresponding graphs. The paper then gives behavior-control suggestions to effectively restrain the padding misbehavior and maintain stable high throughput for the router. The simulation results demonstrate that, compared with classical queue management, our solution can effectively restrain the padding misbehavior while maintaining stable high throughput.
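
    The evolutionary-game analysis above rests on replicator dynamics. A minimal sketch for two strategies (normal behaviour versus padding) follows; the payoff matrix is illustrative only (padding pays off against normal flows, while mutual padding is punished by congestion) and is not the paper's model.

        # Replicator dynamics for a two-strategy game: fraction x of flows
        # behave normally, 1 - x pad junk packets.

        import numpy as np

        # payoff[i][j]: payoff to strategy i (0 = normal, 1 = padding) against j
        A = np.array([[3.0, 1.0],
                      [4.0, 0.0]])

        def replicator(x0, steps=2000, dt=0.01):
            x = x0                       # share of normally behaving flows
            for _ in range(steps):
                p = np.array([x, 1 - x])
                f = A @ p                # expected payoffs of the two strategies
                phi = p @ f              # population-average payoff
                x += dt * x * (f[0] - phi)
            return x

        print(replicator(0.2))  # -> approaches the stable mixed equilibrium x = 0.5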

  1. Electronic digital computers their use in science and engineering

    CERN Document Server

    Alt, Franz L

    1958-01-01

    Electronic Digital Computers: Their Use in Science and Engineering describes the principles underlying computer design and operation. The book describes the various applications of computers, the stages involved in using them, and their limitations. The machine is composed of hardware which is run by a program. The text describes the use of the magnetic drum for storage of data and some computing. The functions and components of the computer include automatic control, memory, input of instructions by means of punched cards, and output of the resulting information. Computers operate by using numbers ...

  2. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set-oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance-guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo across different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 make a contribution to this goal.

  3. Evolutionary Game Analysis of Competitive Information Dissemination on Social Networks: An Agent-Based Computational Approach

    Directory of Open Access Journals (Sweden)

    Qing Sun

    2015-01-01

    Social networks are formed by individuals, in which personalities, utility functions, and interaction rules are made as close to reality as possible. Taking competitive product-related information as a case study, we propose a game-theoretic model for competitive information dissemination in social networks. The model explains how human factors impact competitive information dissemination, described as the dynamics of a coordination game in which players' payoffs are defined by a utility function. We then design a computational system that integrates the agents, the evolutionary game, and the social network. The approach can help to visualize the evolution of the percentage of competitive information adoption and diffusion, grasp the dynamic features of the information adoption game over time, and explore micro-level interactions among users in different network structures under various scenarios. We discuss several scenarios to analyze the influence of various factors on the dissemination of competitive information, ranging from the personalities of individuals to the structure of the network.

  4. An evolutionary examination of telemedicine: a health and computer-mediated communication perspective.

    Science.gov (United States)

    Breen, Gerald-Mark; Matusitz, Jonathan

    2010-01-01

    Telemedicine, the use of advanced communication technologies in the healthcare context, has a rich history and a clear evolutionary course. In this paper, the authors identify telemedicine as operationally defined, the services and technologies it comprises, and the direction telemedicine has taken, along with its increased acceptance in the healthcare communities. The authors also describe some of the key pitfalls confronted by researchers and activists in advancing telemedicine to its full potential and in enabling an unobstructed community of technicians to identify telemedicine's diverse utilities. A discussion and future-directions section is included to provide fresh ideas to health communication and computer-mediated communication scholars wishing to delve into this area and make a difference in enhancing public understanding of this field.

  5. National electronic medical records integration on cloud computing system.

    Science.gov (United States)

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is an emerging technology that has been used in other industries with great success. Despite its attractive features, it has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed system applies cloud computing technology to the EHR to provide a comprehensive, integrated EHR environment.

  6. Resolution Versus Error for Computational Electron Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Luzi, Lorenzo; Stevens, Andrew; Yang, Hao; Browning, Nigel D.

    2017-07-01

    Images collected via scanning transmission electron microscopy (STEM) can be undersampled to avoid damage to the specimen while maintaining resolution [1, 2]. We have used BPFA to impute missing data and reduce noise [3]. The reconstruction is typically evaluated using the peak signal-to-noise ratio (PSNR). We argue that this measure is too conservative for STEM images and propose that the Fourier ring correlation (FRC) be used instead to evaluate the reconstruction. We are not concerned with exact reconstruction of the truth image, for which PSNR is a conservative estimate of reconstruction quality; instead, we are concerned with the visual resolution of the image and whether atoms can be distinguished. We have evaluated the reconstruction of a simulated STEM image using the FRC and compared the results with the PSNR measurements. The FRC captures the resolution of the image and is not affected by a large MSE if the atom peaks are still distinguishable. The noisy and reconstructed images are shown in Figure 1. The simulated STEM image was sampled at 100%, 80%, 40%, and 20% of the original pixels to simulate an undersampled scan. The reconstruction was done using BPFA with a patch size of 10 x 10 and no overlapping patches; not using overlapping patches produces inferior but still acceptable results. The dictionary size is 64 and 30 iterations were completed during each reconstruction. The 100% image was denoised instead of reconstructed. Poisson noise was applied to the simulated image with λ values of 500, 50, and 5 to simulate lower imaging doses; the original simulated STEM image was generated using a dose of 1000. The simulated STEM image is 100 by 100 pixels and has essentially no high-frequency components. The image reconstruction tends to smooth the data, also resulting in no high-frequency components. This causes the FRC of the two images to be large at higher resolutions and may be
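
    The two quality measures contrasted above can be computed as follows; this is a generic NumPy sketch with an illustrative ring binning and synthetic test images, not the authors' evaluation code.

```python
# Minimal illustrations of PSNR and the Fourier ring correlation (FRC):
# FRC correlates the Fourier coefficients of two images within rings of
# equal spatial frequency, giving one correlation value per ring.
import numpy as np

def psnr(ref, img):
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return 10 * np.log10(ref.max() ** 2 / mse)

def frc(img1, img2, n_rings=50):
    f1 = np.fft.fftshift(np.fft.fft2(img1))
    f2 = np.fft.fftshift(np.fft.fft2(img2))
    ny, nx = img1.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2)      # radial spatial frequency
    edges = np.linspace(0, r.max(), n_rings + 1)
    curve = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        ring = (r >= lo) & (r < hi)
        num = np.abs(np.sum(f1[ring] * np.conj(f2[ring])))
        den = np.sqrt(np.sum(np.abs(f1[ring]) ** 2) *
                      np.sum(np.abs(f2[ring]) ** 2))
        curve.append(num / den if den > 0 else 0.0)
    return np.array(curve)

truth = np.random.rand(100, 100)
noisy = truth + 0.1 * np.random.randn(100, 100)
print("PSNR:", psnr(truth, noisy))
print("FRC (lowest rings):", frc(truth, noisy)[:5])
```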

  7. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and ...... unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n^2)....
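
    For contrast with the O(n log n) algorithm of the paper, the quartet distance can be stated as a brute-force O(n^4) reference computation; the following sketch, with toy trees as adjacency lists, is illustrative only.

```python
# Brute-force quartet distance: for each set of 4 leaves, compare the quartet
# topology induced by the two trees (via the four-point condition on BFS
# distances) and count disagreements.
from itertools import combinations
from collections import deque

def dists_from(tree, src):
    """BFS distances (edge counts) from src to every node."""
    d, q = {src: 0}, deque([src])
    while q:
        u = q.popleft()
        for v in tree[u]:
            if v not in d:
                d[v] = d[u] + 1
                q.append(v)
    return d

def quartet_topology(tree, a, b, c, d):
    """Return the leaf pairing split off by the inner edge, e.g. ab|cd."""
    D = {x: dists_from(tree, x) for x in (a, b, c)}
    sums = {(a, b): D[a][b] + D[c][d],
            (a, c): D[a][c] + D[b][d],
            (a, d): D[a][d] + D[b][c]}
    return min(sums, key=sums.get)   # smallest sum wins (four-point condition)

def quartet_distance(t1, t2, leaves):
    return sum(quartet_topology(t1, *q) != quartet_topology(t2, *q)
               for q in combinations(sorted(leaves), 4))

# Two 5-leaf binary trees differing by one nearest-neighbour interchange.
t1 = {'a': ['x'], 'b': ['x'], 'x': ['a', 'b', 'y'], 'c': ['y'],
      'y': ['x', 'c', 'z'], 'd': ['z'], 'e': ['z'], 'z': ['y', 'd', 'e']}
t2 = {'a': ['x'], 'c': ['x'], 'x': ['a', 'c', 'y'], 'b': ['y'],
      'y': ['x', 'b', 'z'], 'd': ['z'], 'e': ['z'], 'z': ['y', 'd', 'e']}
print(quartet_distance(t1, t2, 'abcde'))
```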

  8. Electron Gun for Computer-controlled Welding of Small Components

    Czech Academy of Sciences Publication Activity Database

    Dupák, Jan; Vlček, Ivan; Zobač, Martin

    2001-01-01

    Roč. 62, 2-3 (2001), s. 159-164 ISSN 0042-207X R&D Projects: GA AV ČR IBS2065015 Institutional research plan: CEZ:AV0Z2065902 Keywords: Electron beam-welding machine * Electron gun * Computer-controlled beam Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 0.541, year: 2001

  9. Computer Friends and Foes: Content of Undergraduates' Electronic Mail.

    Science.gov (United States)

    McCormick, Naomi B.; McCormick, John W.

    1992-01-01

    Discussion of computer-mediated communication focuses on a study of undergraduates in computer science courses that used observational and self-report techniques to examine the content of electronic mail messages. Work-related and various social uses are described, and examples of messages are included. (42 references) (LRW)

  10. An evolutionary computational theory of prefrontal executive function in decision-making.

    Science.gov (United States)

    Koechlin, Etienne

    2014-11-05

    The prefrontal cortex subserves executive control and decision-making, that is, the coordination and selection of thoughts and actions in the service of adaptive behaviour. We present here a computational theory describing the evolution of the prefrontal cortex from rodents to humans as gradually adding new inferential Bayesian capabilities for dealing with a computationally intractable decision problem: exploring and learning new behavioural strategies versus exploiting and adjusting previously learned ones through reinforcement learning (RL). We provide a principled account identifying three inferential steps optimizing this arbitration through the emergence of (i) factual reactive inferences in paralimbic prefrontal regions in rodents; (ii) factual proactive inferences in lateral prefrontal regions in primates and (iii) counterfactual reactive and proactive inferences in human frontopolar regions. The theory clarifies the integration of model-free and model-based RL through the notion of strategy creation. The theory also shows that counterfactual inferences in humans give rise to the notion of hypothesis testing, a critical reasoning ability for approximating optimal adaptive processes and presumably endowing humans with a qualitative evolutionary advantage in adaptive behaviour.

  11. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mills) is the heart of a coal-fired power plant. The complex nature of a milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, a vertical spindle coal mill (bowl mill) of the type widely used in coal-fired power plants is considered for model development, and its pulverised fuel flow rate is computed using the model. For the steady-state coal mill model development, plant measurements such as air flow rate, differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using online plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady-state coal mill dynamics. This coal mill model is being implemented online in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust, and can therefore be used for online monitoring, fault detection, and control to improve the efficiency of combustion.
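
    The core idea, identifying unknown model parameters by evolutionary search against plant data, can be sketched as follows. The toy steady-state model, its coefficients, and the (mu + lambda) evolution strategy are illustrative assumptions, not the paper's mill equations or algorithm.

```python
# A minimal (mu + lambda) evolution strategy that fits unknown coefficients of
# a toy steady-state model to simulated "plant" measurements.
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(0.5, 2.0, size=(200, 2))     # inputs, e.g. air flow and dP
theta_true = np.array([1.8, 0.6, 0.3])       # hidden "true" parameters
y = theta_true[0] * u[:, 0] + theta_true[1] * u[:, 1] ** theta_true[2]
y += 0.01 * rng.standard_normal(len(y))      # measurement noise

def sse(theta):
    """Sum of squared errors between model prediction and measurements."""
    pred = theta[0] * u[:, 0] + theta[1] * u[:, 1] ** theta[2]
    return np.sum((y - pred) ** 2)

mu, lam, sigma = 5, 20, 0.3
pop = rng.uniform(0.0, 2.0, size=(mu, 3))
for gen in range(100):
    parents = pop[rng.integers(mu, size=lam)]
    children = parents + sigma * rng.standard_normal(parents.shape)
    pool = np.vstack([pop, children])
    pop = pool[np.argsort([sse(t) for t in pool])[:mu]]  # keep the mu best
    sigma *= 0.97                                        # slow step-size decay

print("best parameters:", pop[0], "SSE:", sse(pop[0]))
```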

  12. Computational Study of Electron Delocalization in Hexaarylbenzenes

    Directory of Open Access Journals (Sweden)

    Citlalli Rios

    2014-03-01

    Full Text Available A number of hexaarylbenzene compounds were studied theoretically, in order to compare energy changes as a result of the toroidal delocalization effect that is characteristic of all these species. The energy was studied taking advantage of locally designed isodesmic reactions. Results indicate that the amount of aromaticity manifested by each substituent is a factor that should be considered when assessing the quantity of energy dissipated from each aromatic center. The influence of different substituents on electronic delocalization is also analyzed, as well as the role played by their frontier molecular orbitals.

  13. Quantum Computing with an Electron Spin Ensemble

    DEFF Research Database (Denmark)

    Wesenberg, Janus; Ardavan, A.; Briggs, G.A.D.

    2009-01-01

    We propose to encode a register of quantum bits in different collective electron spin wave excitations in a solid medium. Coupling to spins is enabled by locating them in the vicinity of a superconducting transmission line cavity, and making use of their strong collective coupling to the quantized...... radiation field. The transformation between different spin waves is achieved by applying gradient magnetic fields across the sample, while a Cooper pair box, resonant with the cavity field, may be used to carry out one- and two-qubit gate operations....

  14. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launching of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of e{sup +}e{sup -} annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
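
    The "weight-1 event" idea mentioned above reduces, in its simplest form, to acceptance-rejection sampling: each accepted event carries unit weight. The one-dimensional target density below is an illustrative stand-in for a multi-dimensional matrix element, not the thesis algorithm itself.

```python
# Acceptance-rejection against a majorizing proposal: sample x from a
# proposal g, accept with probability f(x) / (C * g(x)), so accepted
# events are distributed as f and carry weight 1.
import random

def target(x):
    """Unnormalised peaked density on [0, 1] (toy stand-in)."""
    return 1.0 / ((x - 0.3) ** 2 + 0.01)

C = target(0.3)   # majorant of target/proposal for a uniform proposal

def weight_one_event():
    while True:
        x = random.random()                  # proposal: uniform on [0, 1]
        if random.random() < target(x) / C:  # accept with prob f(x)/C
            return x                         # accepted event has weight 1

events = [weight_one_event() for _ in range(1000)]
print("mean of accepted events:", sum(events) / len(events))
```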

  15. Computational study of evolutionary selection pressure on rainbow trout estrogen receptors.

    Directory of Open Access Journals (Sweden)

    Conrad Shyu

    2010-03-01

    Full Text Available Molecular dynamics simulations were used to determine the binding affinities between the hormone 17β-estradiol (E2) and different estrogen receptor (ER) isoforms in the rainbow trout, Oncorhynchus mykiss. Previous phylogenetic analysis indicates that a whole genome duplication prior to the divergence of ray-finned fish led to two distinct ER isoforms, ERα and ERβ, and the recent whole genome duplication in the ancestral salmonid created two ERα isoforms, ERα1 and ERα2. The objective of our computational studies is to provide insight into the underlying evolutionary pressures on these isoforms. For the ERα subtype our results show that E2 binds preferentially to ERα1 over ERα2. Tests of lineage-specific dN/dS ratios indicate that the ligand binding domain of the ERα2 gene is evolving under relaxed selection relative to all other ERα genes. Comparison with the highly conserved DNA binding domain suggests that ERα2 may be undergoing neofunctionalization, possibly by binding to another ligand. By contrast, both ERβ1 and ERβ2 bind similarly to E2, and the best-fitting model of selection indicates that the ligand binding domains of all ERβ genes are evolving under the same level of purifying selection, comparable to ERα.

  16. Computational study of evolutionary selection pressure on rainbow trout estrogen receptors.

    Science.gov (United States)

    Shyu, Conrad; Brown, Celeste J; Ytreberg, F Marty

    2010-03-09

    Molecular dynamics simulations were used to determine the binding affinities between the hormone 17β-estradiol (E2) and different estrogen receptor (ER) isoforms in the rainbow trout, Oncorhynchus mykiss. Previous phylogenetic analysis indicates that a whole genome duplication prior to the divergence of ray-finned fish led to two distinct ER isoforms, ERα and ERβ, and the recent whole genome duplication in the ancestral salmonid created two ERα isoforms, ERα1 and ERα2. The objective of our computational studies is to provide insight into the underlying evolutionary pressures on these isoforms. For the ERα subtype our results show that E2 binds preferentially to ERα1 over ERα2. Tests of lineage-specific dN/dS ratios indicate that the ligand binding domain of the ERα2 gene is evolving under relaxed selection relative to all other ERα genes. Comparison with the highly conserved DNA binding domain suggests that ERα2 may be undergoing neofunctionalization, possibly by binding to another ligand. By contrast, both ERβ1 and ERβ2 bind similarly to E2, and the best-fitting model of selection indicates that the ligand binding domains of all ERβ genes are evolving under the same level of purifying selection, comparable to ERα.

  17. Using evolutionary computation to optimize an SVM used in detecting buried objects in FLIR imagery

    Science.gov (United States)

    Paino, Alex; Popescu, Mihail; Keller, James M.; Stone, Kevin

    2013-06-01

    In this paper we describe an approach for optimizing the parameters of a Support Vector Machine (SVM) as part of an algorithm used to detect buried objects in forward looking infrared (FLIR) imagery captured by a camera installed on a moving vehicle. The overall algorithm consists of a spot-finding procedure (to look for potential targets) followed by the extraction of several features from the neighborhood of each spot. The features include local binary pattern (LBP) and histogram of oriented gradients (HOG) as these are good at detecting texture classes. Finally, we project and sum each hit into UTM space along with its confidence value (obtained from the SVM), producing a confidence map for ROC analysis. In this work, we use an Evolutionary Computation Algorithm (ECA) to optimize various parameters involved in the system, such as the combination of features used, parameters on the Canny edge detector, the SVM kernel, and various HOG and LBP parameters. To validate our approach, we compare results obtained from an SVM using parameters obtained through our ECA technique with those previously selected by hand through several iterations of "guess and check".
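
    The evolutionary tuning loop described above can be sketched in a few lines with scikit-learn standing in for the detection pipeline; the toy dataset replaces the FLIR features, and the simple GA over (log C, log gamma) is an illustrative assumption, not the authors' ECA.

```python
# A toy GA that tunes SVM hyperparameters by cross-validated accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def fitness(genome):
    C, gamma = 10.0 ** genome[0], 10.0 ** genome[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

pop = rng.uniform(-3, 3, size=(12, 2))           # genomes: (log10 C, log10 gamma)
for gen in range(15):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-4:]]         # keep the 4 best
    mates = elite[rng.integers(4, size=(8, 2))]  # random parent pairs
    children = mates.mean(axis=1) + 0.2 * rng.standard_normal((8, 2))
    pop = np.vstack([elite, children])           # arithmetic crossover + mutation

best = max(pop, key=fitness)
print("best C=%.3g gamma=%.3g acc=%.3f"
      % (10 ** best[0], 10 ** best[1], fitness(best)))
```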

  18. Exploiting genomic knowledge in optimising molecular breeding programmes: algorithms from evolutionary computing.

    Directory of Open Access Journals (Sweden)

    Steve O'Hagan

    Full Text Available Comparatively few studies have directly addressed the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned) ones that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any 'prior knowledge' of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era, when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information).
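
    The in silico breeding-programme idea can be illustrated by a tiny F-algorithm (no genotypic knowledge): bitstring genomes, truncation selection, and free recombination. The additive fitness landscape and all constants below are illustrative, not the authors' landscape or code.

```python
# Simulated breeding: truncation selection on an additive phenotype,
# uniform recombination between selected parents, and rare mutation.
import numpy as np

rng = np.random.default_rng(2)
POP, LOCI, GENS, MUT = 60, 40, 30, 0.005
effects = rng.uniform(0.5, 1.5, LOCI)       # per-locus additive effects
pop = rng.integers(0, 2, size=(POP, LOCI))  # 0/1 alleles

def fitness(genomes):
    return genomes @ effects                # additive phenotype

for gen in range(GENS):
    keep = pop[np.argsort(fitness(pop))[-POP // 5:]]    # top 20% selected
    dams = keep[rng.integers(len(keep), size=POP)]
    sires = keep[rng.integers(len(keep), size=POP)]
    mask = rng.integers(0, 2, size=(POP, LOCI)).astype(bool)
    pop = np.where(mask, dams, sires)       # uniform recombination
    pop = pop ^ (rng.random((POP, LOCI)) < MUT)         # rare mutations

print("mean phenotype after selection:", fitness(pop).mean())
```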

  19. High performance computing in structural determination by electron cryomicroscopy.

    Science.gov (United States)

    Fernández, J J

    2008-10-01

    Computational advances have significantly contributed to the current role of electron cryomicroscopy (cryoEM) in structural biology. The needs for computational power are constantly growing with the increasing complexity of algorithms and the amount of data needed to push the resolution limits. High performance computing (HPC) is becoming paramount in cryoEM to cope with those computational needs. Since the nineties, different HPC strategies have been proposed for some specific problems in cryoEM and, in fact, some of them are already available in common software packages. Nevertheless, the literature is scattered in the areas of computer science and structural biology. In this communication, the HPC approaches devised for the computation-intensive tasks in cryoEM (single particles and tomography) are retrospectively reviewed and the future trends are discussed. Moreover, the HPC capabilities available in the most common cryoEM packages are surveyed, as an evidence of the importance of HPC in addressing the future challenges.

  20. Computation of electron energy loss spectra by an iterative method

    Energy Technology Data Exchange (ETDEWEB)

    Koval, Peter [Donostia International Physics Center (DIPC), Paseo Manuel de Lardizabal 4, E-20018 San Sebastián (Spain); Centro de Física de Materiales CFM-MPC, Centro Mixto CSIC-UPV/EHU, Paseo Manuel de Lardizabal 5, E-20018 San Sebastián (Spain); Ljungberg, Mathias Per [Donostia International Physics Center (DIPC), Paseo Manuel de Lardizabal 4, E-20018 San Sebastián (Spain); Foerster, Dietrich [LOMA, Université de Bordeaux 1, 351 Cours de la Liberation, 33405 Talence (France); Sánchez-Portal, Daniel [Donostia International Physics Center (DIPC), Paseo Manuel de Lardizabal 4, E-20018 San Sebastián (Spain); Centro de Física de Materiales CFM-MPC, Centro Mixto CSIC-UPV/EHU, Paseo Manuel de Lardizabal 5, E-20018 San Sebastián (Spain)

    2015-07-01

    A method is presented to compute the dielectric function for extended systems using linear response time-dependent density functional theory. Localized basis functions with finite support are used to expand both eigenstates and response functions. The electron-energy loss function is directly obtained by an iterative Krylov-subspace method. We apply our method to graphene and silicon and compare it to plane-wave based approaches. Finally, we compute the electron-energy loss spectrum of a C{sub 60} crystal to demonstrate the merits of the method for molecular crystals, where it will be most competitive.

  1. Using Electrons on Liquid Helium for Quantum Computing

    OpenAIRE

    Dahm, A. J.; Goodkind, J. M.; Karakurt, I.; Pilla, S.

    2001-01-01

    We describe a quantum computer based on electrons supported by a helium film and localized laterally by small electrodes just under the helium surface. Each qubit is made of combinations of the ground and first excited state of an electron trapped in the image potential well at the surface. Mechanisms for preparing the initial state of the qubit, operations with the qubits, and a proposed readout are described. This system is, in principle, capable of 100,000 operations in a decoherence time.

  2. Simulation of scanning transmission electron microscope images on desktop computers

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, C., E-mail: christian.dwyer@mcem.monash.edu.au [Monash Centre for Electron Microscopy, Department of Materials Engineering, Monash University, Victoria 3800 (Australia)

    2010-02-15

    Two independent strategies are presented for reducing the computation time of multislice simulations of scanning transmission electron microscope (STEM) images: (1) optimal probe sampling, and (2) the use of desktop graphics processing units. The first strategy is applicable to STEM images generated by elastic and/or inelastic scattering, and requires minimal effort for its implementation. Used together, these two strategies can reduce typical computation times from days to hours, allowing practical simulation of STEM images of general atomic structures on a desktop computer.

  3. Brain-Computer Evolutionary Multi-Objective Optimization (BC-EMO): a genetic algorithm adapting to the decision maker

    OpenAIRE

    Battiti, Roberto; Passerini, Andrea

    2009-01-01

    The centrality of the decision maker (DM) is widely recognized in the Multiple Criteria Decision Making community. This translates into emphasis on seamless human-computer interaction, and adaptation of the solution technique to the knowledge which is progressively acquired from the DM. This paper adopts the methodology of Reactive Optimization (RO) for evolutionary interactive multi-objective optimization. RO follows the paradigm of "learning while optimizing", through the use of online ma...

  4. Regional Platform on Personal Computer Electronic Waste in Latin ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Donation of personal computers - whether from Northern to Southern countries or from government or the private sector to civil society organizations - has resulted in large quantities of electronic waste (e-waste) in developing and transition countries. The quantity and toxicity of e-waste is posing increasing occupational and ...

  5. Electron beam computed tomography for the diagnosis of cardiac ...

    African Journals Online (AJOL)

    Arad, Yadon (Department of Preventive Cardiology, St Francis Hospital, Roslyn, NY, USA). Electron beam computed tomography (EBCT) of the heart is a new modality which will alter the way cardiology is practised. ... cardiovascular disease is either an acute myocardial infarction (MI) or ...

  6. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... COMMISSION Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media... importation of certain electronic devices, including wireless communication devices, tablet computers, media... United States after importation of certain electronic devices, including wireless communication devices...

  7. Computer modeling of electron and proton transport in chloroplasts.

    Science.gov (United States)

    Tikhonov, Alexander N; Vershubskii, Alexey V

    2014-07-01

    Photosynthesis is one of the most important biological processes in the biosphere, providing the production of organic substances from atmospheric CO2 and water at the expense of solar energy. In this review, we contemplate computer models of oxygenic photosynthesis in the context of the feedback regulation of photosynthetic electron transport in chloroplasts, the energy-transducing organelles of the plant cell. We start with a brief overview of electron and proton transport processes in chloroplasts coupled to ATP synthesis and consider basic regulatory mechanisms of oxygenic photosynthesis. General approaches to computer simulation of photosynthetic processes are considered, including the random walk models of plastoquinone diffusion in thylakoid membranes and the deterministic approach to modeling electron transport in chloroplasts based on the mass action law. Then we focus on a kinetic model of oxygenic photosynthesis that includes key stages of the linear electron transport, alternative pathways of electron transfer around photosystem I (PSI), transmembrane proton transport and ATP synthesis in chloroplasts. This model includes different regulatory processes: pH-dependent control of the intersystem electron transport, down-regulation of photosystem II (PSII) activity (non-photochemical quenching), and the light-induced activation of the Bassham-Benson-Calvin (BBC) cycle. The model correctly describes pH-dependent feedback control of electron transport in chloroplasts and adequately reproduces a variety of experimental data on induction events observed under different experimental conditions in intact chloroplasts (variations of CO2 and O2 concentrations in the atmosphere), including the complex kinetics of P700 (the primary electron donor in PSI) photooxidation, CO2 consumption in the BBC cycle, and photorespiration. Finally, we describe diffusion-controlled photosynthetic processes in chloroplasts within the framework of the model that takes into account the complex architecture of
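
    In the spirit of the deterministic, mass-action models discussed above, a toy electron transport chain can be integrated as a small ODE system. The pools, rate constants, and mechanistic shortcuts below are illustrative stand-ins, not taken from the review.

```python
# A toy mass-action chain (PSII -> plastoquinone pool -> PSI donor)
# integrated with SciPy; state variables are reduced pool fractions.
import numpy as np
from scipy.integrate import solve_ivp

k_light, k1, k2 = 50.0, 200.0, 80.0   # excitation and transfer rates (1/s)
PQ_TOT, P700_TOT = 10.0, 1.0          # relative pool sizes

def rhs(t, s):
    pq_red, p700_red = s              # reduced amounts in each pool
    v_psii = k_light * (PQ_TOT - pq_red) / PQ_TOT          # PSII reduces PQ
    v_transfer = k1 * (pq_red / PQ_TOT) * (P700_TOT - p700_red)
    v_psi = k2 * p700_red             # PSI re-oxidises its donor in the light
    return [v_psii - v_transfer, v_transfer - v_psi]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0])
print("steady-state reduced PQ fraction:", sol.y[0, -1] / PQ_TOT)
```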

  8. Ecoupling server: A tool to compute and analyze electronic couplings.

    Science.gov (United States)

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-05

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional user-friendly tools to compute and analyze electronic coupling from external wave functions are of high value. The first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.

  9. Computer simulation of cluster impact induced electronic excitation of solids

    Energy Technology Data Exchange (ETDEWEB)

    Weidtmann, B.; Hanke, S.; Duvenbeck, A. [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany); Wucher, A., E-mail: andreas.wucher@uni-due.de [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany)

    2013-05-15

    We present a computational study of electronic excitation upon bombardment of a metal surface with cluster projectiles. Our model employs a molecular dynamics (MD) simulation to calculate the particle dynamics following the projectile impact. Kinetic excitation is implemented via two mechanisms describing the electronic energy loss of moving particles: autoionization in close binary collisions and a velocity proportional friction force resulting from direct atom–electron collisions. Two different friction models are compared with respect to the predicted sputter yields after single atom and cluster bombardment. We find that a density dependent friction coefficient leads to a significant reduction of the total energy transferred to the electronic sub-system as compared to the Lindhard friction model, thereby strongly enhancing the predicted sputter yield under cluster bombardment conditions. In contrast, the yield predicted for monoatomic projectile bombardment remains practically unchanged.

  10. Merging molecular mechanism and evolution: theory and computation at the interface of biophysics and evolutionary population genetics.

    Science.gov (United States)

    Serohijos, Adrian W R; Shakhnovich, Eugene I

    2014-06-01

    The variation among sequences and structures in nature is determined both by physical laws and by evolutionary history. However, these two factors are traditionally investigated by disciplines with different emphases and philosophies: molecular biophysics on the one hand and evolutionary population genetics on the other. Here, we review recent theoretical and computational approaches that address the crucial need to integrate these two disciplines. We first articulate the elements of these approaches. Then, we survey their contribution to our mechanistic understanding of molecular evolution, polymorphisms in coding regions, the distribution of fitness effects (DFE) of mutations, the observed folding stability of proteins in nature, and the distribution of protein folds in genomes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. A quantum computer based on electrons floating on liquid helium

    OpenAIRE

    Dykman, M. I.; Platzman, P. M.

    2001-01-01

    Electrons on a helium surface form a quasi two-dimensional system which displays the highest mobility reached in condensed matter physics. We propose to use this system as a set of interacting quantum bits. We will briefly describe the system and discuss how the qubits can be addressed and manipulated, including interqubit excitation transfer. The working frequency of the proposed quantum computer is ~1GHz. The relaxation rate can be at least 5 orders of magnitude smaller, for low temperatures.

  12. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
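
    For the stationary (algebraic) case mentioned above, the response covariance of a white-noise-driven linear system dx = A x dt + B dw solves the Lyapunov equation A P + P A^T + B B^T = 0. The sketch below uses SciPy's standard solver in place of the paper's precise integration method; the oscillator parameters are illustrative.

```python
# Stationary response variance of a damped oscillator under white noise,
# via the algebraic Lyapunov equation A P + P A^T = -B B^T.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

omega, zeta = 2.0 * np.pi, 0.05          # natural frequency and damping ratio
A = np.array([[0.0, 1.0],
              [-omega ** 2, -2.0 * zeta * omega]])
B = np.array([[0.0], [1.0]])             # unit-intensity white-noise input

P = solve_continuous_lyapunov(A, -B @ B.T)   # stationary covariance matrix
print("displacement variance:", P[0, 0])
print("velocity variance:   ", P[1, 1])
```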

  13. Equilibrium selection in alternating-offers bargaining models: the evolutionary computing approach

    NARCIS (Netherlands)

    D.D.B. van Bragt; E.H. Gerding (Enrico); J.A. La Poutré (Han)

    2000-01-01

    A systematic validation of evolutionary techniques in the field of bargaining is presented. For this purpose, the dynamic and equilibrium-selecting behavior of a multi-agent system consisting of adaptive bargaining agents is investigated. The agents' bargaining strategies are updated by

  14. EvoluZion: A Computer Simulator for Teaching Genetic and Evolutionary Concepts

    Science.gov (United States)

    Zurita, Adolfo R.

    2017-01-01

    EvoluZion is a forward-in-time genetic simulator developed in Java and designed to perform real time simulations on the evolutionary history of virtual organisms. These model organisms harbour a set of 13 genes that codify an equal number of phenotypic features. These genes change randomly during replication, and mutant genes can have null,…

  15. Computational Models of Financial Price Prediction: A Survey of Neural Networks, Kernel Machines and Evolutionary Computation Approaches

    Directory of Open Access Journals (Sweden)

    Javier Sandoval

    2011-12-01

    Full Text Available A review of the representative models of machine learning research applied to the foreign exchange rate and stock price prediction problem is conducted.  The article is organized as follows: The first section provides a context on the definitions and importance of foreign exchange rate and stock markets.  The second section reviews machine learning models for financial prediction focusing on neural networks, SVM and evolutionary methods. Lastly, the third section draws some conclusions.

  16. Combined bio-inspired/evolutionary computational methods in cross-layer protocol optimization for wireless ad hoc sensor networks

    Science.gov (United States)

    Hortos, William S.

    2011-06-01

    Published studies have focused on the application of one bio-inspired or evolutionary computational method to the functions of a single protocol layer in a wireless ad hoc sensor network (WSN). For example, swarm intelligence in the form of ant colony optimization (ACO), has been repeatedly considered for the routing of data/information among nodes, a network-layer function, while genetic algorithms (GAs) have been used to select transmission frequencies and power levels, physical-layer functions. Similarly, artificial immune systems (AISs) as well as trust models of quantized data reputation have been invoked for detection of network intrusions that cause anomalies in data and information; these act on the application and presentation layers. Most recently, a self-organizing scheduling scheme inspired by frog-calling behavior for reliable data transmission in wireless sensor networks, termed anti-phase synchronization, has been applied to realize collision-free transmissions between neighboring nodes, a function of the MAC layer. In a novel departure from previous work, the cross-layer approach to WSN protocol design suggests applying more than one evolutionary computational method to the functions of the appropriate layers to improve the QoS performance of the cross-layer design beyond that of one method applied to a single layer's functions. A baseline WSN protocol design, embedding GAs, anti-phase synchronization, ACO, and a trust model based on quantized data reputation at the physical, MAC, network, and application layers, respectively, is constructed. Simulation results demonstrate the synergies among the bio-inspired/evolutionary methods of the proposed baseline design improve the overall QoS performance of networks over that of a single computational method.
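
    Of the methods listed above, the network-layer ACO component is the easiest to sketch. The toy graph, pheromone update, and parameters below are illustrative assumptions, not the paper's baseline design.

```python
# Minimal ant colony optimization on a toy routing graph: ants walk from
# source 's' to sink 'd', depositing pheromone in proportion to path quality.
import random

links = {('s', 'a'): 1.0, ('a', 'd'): 1.0,   # cheap two-hop route
         ('s', 'b'): 2.0, ('b', 'd'): 2.0}   # expensive two-hop route
nodes = {'s': ['a', 'b'], 'a': ['d'], 'b': ['d']}
tau = {e: 1.0 for e in links}                # pheromone per link
EVAP, ANTS, ROUNDS = 0.1, 20, 50

def walk():
    node, path = 's', []
    while node != 'd':
        nxt = random.choices(nodes[node],
                             weights=[tau[(node, v)] for v in nodes[node]])[0]
        path.append((node, nxt))
        node = nxt
    return path

for _ in range(ROUNDS):
    for _ in range(ANTS):
        path = walk()
        cost = sum(links[e] for e in path)
        for e in path:
            tau[e] += 1.0 / cost             # reinforce cheap paths
    for e in tau:
        tau[e] *= (1.0 - EVAP)               # pheromone evaporation

print("pheromone levels:", {e: round(t, 2) for e, t in tau.items()})
```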

  17. Computational aspects of electronic transport in nanoscale devices

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg

    2008-01-01

    This thesis is concerned with the modeling of electronic properties of nano-scale devices. In particular the computational aspects of calculating the transmission and current-voltage characteristics of Landauer-Büttiker two-probe systems are in focus. To begin with, the main existing methods are described in detail and benchmarked. These are the Green's function method and the wave function matching method. The methods are subsequently combined in a hybrid scheme in order to benefit from a common formalism. The most time demanding stages of common electronic transport calculations are identified; one of these is the calculation of the block tridiagonal part of the inverse of a block tridiagonal matrix, for which an O(N) algorithm is presented. This algorithm also leads to an optimal evaluation of the frequently used Caroli transmission formula. A modified wave function matching scheme is then developed which allows for a significant reduction...

  18. Computational electronics semiclassical and quantum device modeling and simulation

    CERN Document Server

    Vasileska, Dragica; Klimeck, Gerhard

    2010-01-01

    Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of

  19. Computational aspects of electronic transport in nanoscale devices

    OpenAIRE

    Sørensen, Hans Henrik Brandenborg; Hansen, Per Christian; Stokbro, Kurt

    2008-01-01

    This thesis is concerned with the modeling of electronic properties of nano-scale devices. In particular the computational aspects of calculating the transmission and current-voltage characteristics of Landauer-Büttiker two-probe systems are in focus. To begin with, the main existing methods are described in detail and benchmarked. These are the Green’s function method and the wave function matching method. The methods are subsequently combined in a hybrid scheme in order to benefit from a co...

  20. Search of computers for discovery of electronic evidence

    Directory of Open Access Journals (Sweden)

    Pisarić Milana M.

    2015-01-01

    Full Text Available In order to address the specific nature of criminal activities committed using computer networks and systems, the efforts of states to adapt or complement existing criminal law with purposeful provisions are understandable. To create an appropriate legal framework for suppressing cybercrime, it is not enough for substantive criminal law to define certain behaviour as criminal offences against the confidentiality, integrity and availability of computer data, computer systems and networks; it is also essential that criminal procedure law grant the competent authorities adequate powers for detecting the sources of illegal activities and for collecting data on the committed criminal offence and the offender that can be used as evidence in criminal proceedings, taking into account the specificities of cybercrime and the environment within which the illegal activity is undertaken. Accordingly, the provisions of criminal procedure law should be designed to overcome certain challenges in discovering and proving high-technology crime, and the provisions governing the search of computers for the discovery of electronic evidence are of special importance.

  1. Collaborative Computational Project for Electron cryo-Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Chris; Burnley, Tom [Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom); Patwardhan, Ardan [European Molecular Biology Laboratory, Wellcome Trust Genome Campus, Hinxton, Cambridge CB10 1SD (United Kingdom); Scheres, Sjors [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH (United Kingdom); Topf, Maya [University of London, Malet Street, London WC1E 7HX (United Kingdom); Roseman, Alan [University of Manchester, Oxford Road, Manchester M13 9PT (United Kingdom); Winn, Martyn, E-mail: martyn.winn@stfc.ac.uk [Science and Technology Facilities Council, Daresbury Laboratory, Warrington WA4 4AD (United Kingdom); Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom)

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established as a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software, and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions, and a distinct project is required. In this article, progress so far is reported and future plans are discussed.

  2. The Molecular Electronic Device and the Biochip Computer: Present Status

    Science.gov (United States)

    Haddon, R. C.; Lamola, A. A.

    1985-04-01

    The idea that a single molecule might function as a self-contained electronic device has been of interest for some time. However, a fully integrated version--the biochip or the biocomputer, in which both production and assembly of molecular electronic components are achieved through biotechnology--is a relatively new concept that is currently attracting attention both within the scientific community and among the general public. In the present article we draw together some of the approaches being considered for the construction of such devices and delineate the revolutionary nature of the current proposals for molecular electronic devices (MEDs) and biochip computers (BCCs). With the silicon semiconductor industry already in place, and in view of the continuing successes of the lithographic process, it seems appropriate to ask why the highly speculative MED or BCC has engendered such interest. In some respects the answer is paradigmatic as much as it is real. It is perhaps best stated as the promise of the realm of the molecular. Thus it is envisioned that devices will be constructed by assembly of individual molecular electronic components into arrays, thereby engineering from small upward rather than large downward as do current lithographic techniques. An important corollary of the construction technique is that the functional elements of such an array would be individual molecules rather than macroscopic ensembles. These two aspects of the MED/BCC--assembly of molecular arrays and individually accessible functional molecular units--are truly revolutionary. Both require scientific breakthroughs, and the necessary principles, quite apart from the technology, remain essentially unknown. It is concluded that the advent of the MED/BCC still lies well before us. The twin criteria of utilization of individual molecules as functional elements and the assembly of such elements remain as elusive as ever. Biology engineers structures on the molecular scale but biomolecules

  3. Evolutionary Information Theory

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2013-04-01

    Full Text Available Evolutionary information theory is a constructive approach that studies information in the context of evolutionary processes, which are ubiquitous in nature and society. In this paper, we develop foundations of evolutionary information theory, building several measures of evolutionary information and obtaining their properties. These measures are based on mathematical models of evolutionary computations, machines and automata. To measure evolutionary information in an invariant form, we construct and study universal evolutionary machines and automata, which form the base for evolutionary information theory. The first class of measures introduced and studied in this paper is the evolutionary information size of symbolic objects relative to classes of automata or machines. In particular, it is proved that there is an invariant and optimal evolutionary information size relative to different classes of evolutionary machines. As a rule, different classes of algorithms or automata determine different information sizes for the same object. More powerful classes of algorithms or automata decrease the information size of an object in comparison with its information size relative to weaker classes of algorithms or machines. The second class of measures for evolutionary information in symbolic objects is studied by introducing the quantity of evolutionary information about symbolic objects relative to a class of automata or machines. To give an example of applications, we briefly describe a possibility of modeling physical evolution with evolutionary machines to demonstrate the applicability of evolutionary information theory to all material processes. At the end of the paper, directions for future research are suggested.

  4. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interactions among their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines a Weighed-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs from protein sequences. The major improvement of our method lies in an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), together with a reliable and robust WELM classifier to carry out classification. In addition, Principal Component Analysis (PCA) is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicate that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/.
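
    The Local Average Group feature idea can be sketched in a few lines: a PSSM (sequence length x 20 matrix) is split into a fixed number of consecutive segments, and each segment is averaged column-wise to give a fixed-length vector regardless of protein length. The group count and the random stand-in PSSM below are illustrative; the WELM classifier itself is not shown.

```python
# LAG-style feature extraction from a PSSM, in the spirit of the paper.
import numpy as np

def lag_features(pssm, n_groups=4):
    """Average consecutive row blocks of the PSSM column-wise."""
    segments = np.array_split(pssm, n_groups, axis=0)
    return np.concatenate([seg.mean(axis=0) for seg in segments])

pssm = np.random.randn(137, 20)     # stand-in for a PSI-BLAST PSSM
feats = lag_features(pssm)
print(feats.shape)                  # (80,) = n_groups x 20, fixed length
```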

  5. Insights into Proton-Coupled Electron Transfer from Computation

    Science.gov (United States)

    Provorse, Makenzie R.

    Proton-coupled electron transfer (PCET) is utilized throughout Nature to facilitate essential biological processes, such as photosynthesis, cellular respiration, and DNA replication and repair. The general approach to studying PCET processes is based on a two-dimensional More O'Ferrall-Jencks diagram in which electron transfer (ET) and proton transfer (PT) occur in a sequential or concerted fashion. Experimentally, it is difficult to discern the contributing factors of concerted PCET mechanisms. Several theoretical approaches have arisen to qualitatively and quantitatively investigate these reactions. Here, we present a multistate density functional theory (MSDFT) method to efficiently and accurately model PCET mechanisms. The MSDFT method is validated against experimental and computational data previously reported on an isoelectronic series of small molecule self-exchange hydrogen atom transfer reactions and a model complex specifically designed to study long-range ET through a hydrogen-bonded salt-bridge interface. Further application of this method to the hydrogen atom abstraction of ascorbate by a nitroxyl radical demonstrates the sensitivity of the thermodynamic and kinetic properties to solvent effects. In particular, the origin of the unusual kinetic isotope effect is investigated. Lastly, the MSDFT is employed in a combined quantum mechanical/molecular mechanical (QM/MM) approach to explicitly model PCET in condensed phases.

  6. Cloud glaciation temperature estimation from passive remote sensing data with evolutionary computing

    Science.gov (United States)

    Carro-Calvo, L.; Hoose, C.; Stengel, M.; Salcedo-Sanz, S.

    2016-11-01

    The phase partitioning between supercooled liquid water and ice in clouds in the temperature range between 0 and -37°C influences their optical properties and the efficiency of precipitation formation. Passive remote sensing observations provide long-term records of the cloud top phase at a high spatial resolution. Based on the assumption of a cumulative Gaussian distribution of the ice cloud fraction as a function of temperature, we quantify the cloud glaciation temperature (CGT) as the 50th percentile of the fitted distribution function and its variance for different cloud top pressure intervals, obtained by applying an evolutionary algorithm (EA). EAs are metaheuristics approaches for optimization, used in difficult problems where standard approaches are either not applicable or show poor performance. In this case, the proposed EA is applied to 4 years of Pathfinder Atmospheres-Extended (PATMOS-x) data, aggregated into boxes of 1° × 1° and vertical layers of 5.5 hPa. The resulting vertical profile of CGT shows a characteristic sickle shape, indicating low CGTs close to homogeneous freezing in the upper troposphere and significantly higher values in the midtroposphere. In winter, a pronounced land-sea contrast is found at midlatitudes, with lower CGTs over land. Among this and previous studies, there is disagreement on the sign of the land-sea difference in CGT, suggesting that it is strongly sensitive to the detected and analyzed cloud types, the time of the day, and the phase retrieval method.
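
    The CGT estimate described above amounts to fitting a cumulative Gaussian to the ice-cloud fraction as a function of temperature, with the fitted mean giving the 50th percentile. In the sketch below, an ordinary least-squares fit stands in for the paper's evolutionary algorithm, and the data are synthetic.

```python
# Fit a cumulative Gaussian to synthetic ice-fraction data; the fitted
# mean is the cloud glaciation temperature (CGT).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

temps = np.linspace(-40.0, 0.0, 41)                  # cloud-top temperature (degC)
true_mu, true_sigma = -22.0, 6.0
ice_frac = norm.cdf((true_mu - temps) / true_sigma)  # colder -> more ice
ice_frac += 0.03 * np.random.randn(temps.size)       # observational scatter

def model(t, mu, sigma):
    return norm.cdf((mu - t) / sigma)

(mu_fit, sigma_fit), _ = curve_fit(model, temps, ice_frac, p0=(-20.0, 5.0))
print("CGT = %.1f degC, spread = %.1f K" % (mu_fit, sigma_fit))
```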

  7. Connecting game theory and evolutionary network control for the computational control of soccer matches

    Directory of Open Access Journals (Sweden)

    Alessandro Ferrarini

    2015-03-01

    Full Text Available Game theory, also known as interactive decision theory, is an umbrella term for the logical side of decision science, covering both human and non-human events. In this paper a new game-theory model is introduced in order to tame complex human events such as soccer matches. Soccer-Decoder is a mathematical algorithm recently introduced to simulate soccer matches by merging three scientific methods: game theory, differential calculus and stochastic simulation. The philosophy behind Soccer-Decoder is that even very complex real-world events, when reduced to their irreducible essence, can be understood and predicted. In this work, Soccer-Decoder is combined with Evolutionary Network Control to provide a proficient tool for choosing the most appropriate winning strategies in soccer events. An illustrative example is given. The rationale behind this work is that even very complex real-world events can be simulated and then controlled when appropriate scientific tools are used.

  8. A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles

    Science.gov (United States)

    Kafai, Yasmin B.; Lee, Eunkyoung; Searle, Kristin; Fields, Deborah; Kaplan, Eliot; Lui, Debora

    2014-01-01

    In this article, we examine the use of electronic textiles (e-textiles) for introducing key computational concepts and practices while broadening perceptions about computing. The starting point of our work was the design and implementation of a curriculum module using the LilyPad Arduino in a pre-AP high school computer science class. To…

  9. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... COMMISSION Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... devices, including mobile phones and tablet computers, and components thereof by reason of infringement of... certain electronics devices, including mobile phones and tablet computers, and components thereof that...

  10. Electronic design automation of analog ICs combining gradient models with multi-objective evolutionary algorithms

    CERN Document Server

    Rocha, Frederico AE; Lourenço, Nuno CC; Horta, Nuno CG

    2013-01-01

    This book applies to the scientific area of electronic design automation (EDA) and addresses the automatic sizing of analog integrated circuits (ICs). In particular, this book presents an approach to enhance a state-of-the-art layout-aware circuit-level optimizer (GENOM-POF) by embedding statistical knowledge from an automatically generated gradient model into the multi-objective multi-constraint optimization kernel based on the NSGA-II algorithm. The results shown allow the designer to explore the different trade-offs of the solution space, both through the achieved device sizes, or the resp

  11. Genetic characterization and evolutionary inference of TNF-α through computational analysis

    Directory of Open Access Journals (Sweden)

    Gauri Awasthi

    Full Text Available TNF-α is an important human cytokine that imparts dualism in malaria pathogenicity. At high dosages, TNF-α is believed to provoke pathogenicity in cerebral malaria, while at lower dosages TNF-α is protective against severe human malaria. In order to understand the human TNF-α gene and to ascertain evolutionary aspects of its dualistic nature in malaria pathogenicity, we characterized this gene in detail in six different mammalian taxa. The avian taxon, Gallus gallus, was included in our study; as TNF-α is not present in birds, a tandemly placed duplicate of TNF-α (LT-α, or TNF-β) was included. A comparative study was made of nucleotide length variations, intron and exon sizes and number variations, differential compositions of coding to non-coding bases, etc., to look for similarities/dissimilarities in the TNF-α gene across all seven taxa. A phylogenetic analysis revealed the pattern found in other genes, as humans, chimpanzees and rhesus monkeys were placed in a single clade, and rats and mice in another; the chicken was in a clearly separate branch. We further focused on these three taxa and aligned the amino acid sequences; there were small differences between humans and chimpanzees; both were more different from the rhesus monkey. Further, comparison of coding and non-coding nucleotide length variations and the coding to non-coding nucleotide ratio between TNF-α and TNF-β among these three mammalian taxa provided a first-hand indication of the role of the TNF-α gene, but not of TNF-β, in the dualistic nature of TNF-α in malaria pathogenicity.

  12. Covariance Matrix Adaptation Evolutionary Strategy for Drift Correction of Electronic Nose Data

    Science.gov (United States)

    Di Carlo, S.; Falasconi, M.; Sanchez, E.; Sberveglieri, G.; Scionti, A.; Squillero, G.; Tonda, A.

    2011-09-01

    Electronic Noses (ENs) might represent a simple, fast, high-sample-throughput and economic alternative to conventional analytical instruments [1]. However, gas sensor drift still limits EN adoption in real industrial setups due to the high recalibration effort and cost [2]. In fact, pattern recognition (PaRC) models built in the training phase become useless after a period of time, in some cases a few weeks. Although algorithms to mitigate the drift date back to the early 1990s, this is still a challenging issue for the chemical sensor community [3]. Among other approaches, adaptive drift correction methods adjust the PaRC model in parallel with data acquisition, without the need for periodic calibration. Self-Organizing Maps (SOMs) [4] and Adaptive Resonance Theory (ART) networks [5] have already been tested in the past with fair success. This paper presents and discusses an original methodology based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) [6], suited for stochastic optimization of complex problems.
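
    A CMA-ES-based drift correction can be sketched as follows, assuming the third-party `cma` Python package (pip install cma). The optimizer searches for an additive correction vector that pulls drifted sensor responses back onto the stored calibration centroids; the data and objective are toy stand-ins for a real EN setup, not the paper's method.

```python
# CMA-ES search for a drift-correction vector in sensor-feature space.
import numpy as np
import cma

rng = np.random.default_rng(3)
centroids = rng.normal(size=(3, 8))             # calibrated class centres
drift = rng.normal(scale=0.5, size=8)           # unknown sensor drift
samples = (centroids[rng.integers(3, size=60)] + drift
           + 0.05 * rng.standard_normal((60, 8)))   # drifted measurements

def misfit(correction):
    """Mean distance of corrected samples to their nearest centroid."""
    corrected = samples + correction
    d = np.linalg.norm(corrected[:, None, :] - centroids[None], axis=2)
    return d.min(axis=1).mean()

best, es = cma.fmin2(misfit, np.zeros(8), 0.5, {'verbose': -9})
print("recovered drift (negated correction):", -best.round(2))
```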

  13. An Evolutionary Computing Enriched RS Attack Resilient Medical Image Steganography Model for Telemedicine Applications

    OpenAIRE

    Mansour, Romany F.; Abdelrahim, Elsaid MD.

    2017-01-01

    The recent advancement in computing technologies and the resulting vision-based applications have given rise to a novel practice called telemedicine, which requires patient diagnostic images or allied information in order to recommend, or even perform, diagnostic practices from a remote location. However, to ensure accurate and optimal telemedicine, seamless and flawless biomedical information about the patient is required. On the contrary, medical data transmitted over an insecure channel often remains ...

  14. Estimation of the elastic parameters of human liver biomechanical models by means of medical images and evolutionary computation.

    Science.gov (United States)

    Martínez-Martínez, F; Rupérez, M J; Martín-Guerrero, J D; Monserrat, C; Lago, M A; Pareja, E; Brugger, S; López-Andújar, R

    2013-09-01

    This paper presents a method to computationally estimate the elastic parameters of two biomechanical models proposed for the human liver. The method is aimed at avoiding the invasive measurement of its mechanical response. The chosen models are a second-order Mooney-Rivlin model and an Ogden model. A novel error function, the geometric similarity function (GSF), is formulated using similarity coefficients widely applied in the field of medical imaging (the Jaccard coefficient and the Hausdorff coefficient). This function is used to compare two 3D images. One of them corresponds to a reference deformation carried out over a finite element (FE) mesh of a human liver obtained from a computed tomography image, whilst the other corresponds to the FE simulation of that deformation in which variations in the values of the model parameters are introduced. Several search strategies, based on GSF as cost function, are developed to accurately find the elastic parameters of the models, namely: two evolutionary algorithms (scatter search and a genetic algorithm) and an iterative local optimization. The results show that GSF is a very appropriate function for estimating the elastic parameters of the biomechanical models, since the mean of the relative mean absolute errors committed by the three algorithms is lower than 4%. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
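
    The Jaccard part of a GSF-like cost can be sketched in a few lines; the snippet below uses SciPy's differential evolution as a stand-in for the paper's scatter search and genetic algorithm, and a toy thresholding function in place of the actual FE simulation, so everything except the Jaccard-based cost structure is an assumption.

```python
# Hedged sketch of a GSF-style cost: 1 - Jaccard overlap of two binary volumes,
# minimized by an evolutionary optimizer (SciPy differential evolution here).
import numpy as np
from scipy.optimize import differential_evolution

def simulate_deformation(params, shape=(32, 32, 32)):
    """Toy stand-in for the FE simulation: returns a binary 3D volume."""
    z = np.linspace(-1, 1, shape[0]).reshape(-1, 1, 1) * np.ones(shape)
    return z < (params[0] - params[1])

reference = simulate_deformation([0.5, 0.1])   # the "observed" deformation

def gsf_cost(params):
    candidate = simulate_deformation(params)
    intersection = np.logical_and(reference, candidate).sum()
    union = np.logical_or(reference, candidate).sum()
    jaccard = intersection / union if union else 0.0
    return 1.0 - jaccard                        # perfect overlap -> cost 0

result = differential_evolution(gsf_cost, bounds=[(0.0, 1.0), (0.0, 0.5)], seed=1)
print(result.x, result.fun)
```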

  15. Computational model for analyzing the evolutionary patterns of the neuraminidase gene of influenza A/H1N1.

    Science.gov (United States)

    Ahn, Insung; Son, Hyeon Seok

    2012-02-01

    In this study, we performed computer simulations to evaluate the changes in the selection potentials of codons in influenza A/H1N1 from 1999 to 2009. We artificially generated sequences by using the transition matrices of positively selected codons over time, and their similarities against the database of the genus Influenzavirus A were determined by BLAST search. This is the first approach to predict the evolutionary direction of the influenza A virus (H1N1) by simulating codon substitutions over time. We observed that the BLAST results showed high similarities with pandemic influenza A/H1N1 in 2009, suggesting that the classical human-origin influenza A/H1N1 isolated before 2009 might contain some selection potentials of swine-origin viruses. Computer simulations using the time-series codon substitution patterns resulted in dramatic changes in the BLAST results for influenza A/H1N1, pointing to the possibility of developing a method for predicting viral evolution in silico. Copyright © 2011 Elsevier Ltd. All rights reserved.
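
    The core simulation idea, evolving codons forward in time with empirical transition matrices, reduces to sampling a Markov chain. The sketch below does this for a single codon site; the three-codon state space and the matrix entries are made up for brevity, whereas the study estimated per-codon matrices from dated H1N1 sequences.

```python
# Hedged sketch: forward simulation of codon substitutions as a Markov chain.
import numpy as np

codons = ["AAA", "AAG", "AGG"]          # toy state space for one positively selected site
T = np.array([[0.90, 0.08, 0.02],       # row-stochastic transition matrix (invented)
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])

rng = np.random.default_rng(42)
state = 0                               # start at AAA
trajectory = [codons[state]]
for _year in range(10):                 # ten yearly substitution steps
    state = rng.choice(len(codons), p=T[state])
    trajectory.append(codons[state])
print(" -> ".join(trajectory))
```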

  16. Computer simulation of electronic excitation in atomic collision cascades

    Energy Technology Data Exchange (ETDEWEB)

    Duvenbeck, A.

    2007-04-05

    The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface - a process usually called sputtering. In modern surface analysis, the so-called SIMS technique uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Because the ionized fraction of the total flux of sputtered atoms often amounts to only a few percent or even less, detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elementary species but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample. In the literature, this phenomenon is known as the Matrix Effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation
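
    The diffusive spread of a point-like excitation can be illustrated with the free-space Green's function of the two-dimensional diffusion equation, E(r, t) = E0/(4πDt)·exp(−r²/4Dt); the diffusivity value and the reduction to a single instantaneous point source are assumptions for illustration, since the thesis feeds many moving sources from molecular dynamics into the transport model.

```python
# Hedged sketch: 2D diffusion Green's function for one instantaneous point source.
import numpy as np

def excitation_density(r, t, e0=1.0, d=1.0e-4):
    """Excitation energy density at radius r (m) and time t (s); d in m^2/s."""
    return e0 / (4.0 * np.pi * d * t) * np.exp(-r**2 / (4.0 * d * t))

# density 1 nm away from a unit excitation deposited 10 fs and 100 fs earlier
for t in (10e-15, 100e-15):
    print(f"t = {t:.0e} s:", excitation_density(r=1e-9, t=t))
```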

  17. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    From the Federal Register Online via the Government Publishing Office. INTERNATIONAL TRADE COMMISSION. Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... ... Trade Commission has received a complaint entitled Certain Electronic Devices, Including Mobile Phones...

  18. An exploration of computer-simulated evolution and small group discussion on pre-service science teachers' perceptions of evolutionary concepts

    Science.gov (United States)

    MacDonald, Ronald Douglas

    The primary goal of this study was to explore how the use of a computer simulation of basic evolutionary processes, in combination with small-group discussions, affected Intermediate/Senior pre-service science teachers' perspectives of basic evolutionary concepts. Qualitative and quantitative methods were used in a case study approach with 19 pre-service Intermediate/Senior science teachers at an Ontario university. Several sub-goals were explored. The first sub-goal was to assess Intermediate/Senior pre-service science teachers' current conceptions of evolution. The results indicated that approximately two-thirds of the participants had a poor understanding of basic evolutionary concepts, with only 2 of the 19 participants demonstrating a strong comprehension. These results were found to be very similar to comparable samples of subjects from other research. The second sub-goal was to explore the relationships among Intermediate/Senior pre-service science teachers' understanding of contemporary evolutionary concepts, their perspectives of the nature of science, and their intentions to teach evolutionary concepts in the classroom. Participants' knowledge of evolutionary concepts was found to be associated strongly with their intentions to teach evolution by natural selection (r = .42). However, knowledge of evolutionary concepts was not found to be associated with any particular science epistemology perspective. The third sub-goal was to analyze and to interpret the small-group discussions as members interacted with the simulation. The simulation was found to be highly engaging and a very effective method of encouraging participants to speculate, question, discuss and learn about important evolutionary concepts. Analyses of the discussions revealed that the simulation evoked a wide array of correct conceptions as well as misconceptions. The fourth sub-goal was to assess the extent to which creating a lesson plan on the topic of natural selection could affect

  19. Phase II Final Report Computer Optimization of Electron Guns

    Energy Technology Data Exchange (ETDEWEB)

    R. Lawrence Ives; Thuc Bui; Hien Tran; Michael Read; Adam Attarian; William Tallis

    2011-04-15

    This program implemented advanced computer optimization in an adaptive-mesh, finite element, 3D, charged particle code. The routines can optimize electron gun performance to achieve a specified current, beam size, and perveance. They can also minimize beam ripple and electric field gradients. The magnetics optimization capability allows design of coil geometries and magnetic material configurations to achieve a specified axial magnetic field profile. The optimization control program, built into the charged particle code Beam Optics Analyzer (BOA), utilizes a 3D solid modeling package to modify geometry using design tables. Parameters within the graphical user interface (currents, voltages, etc.) can be directly modified within BOA. The program implemented advanced post-processing capability for the optimization routines as well as for the user. A graphical user interface allows the user to set up goal functions, select variables, establish ranges of variation, and define performance criteria. The optimization capability allowed development of a doubly convergent multiple-beam gun that could not be designed using previous techniques.
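
    The kind of scalar goal function such an optimizer minimizes can be sketched directly from the quantities named above, using the standard definition of perveance, K = I/V^(3/2); the targets, weights, and the idea of combining them as squared relative errors are illustrative assumptions, not BOA's actual internals.

```python
# Hedged sketch of a weighted electron-gun goal function (targets are invented).
def perveance(current_a, voltage_v):
    return current_a / voltage_v**1.5          # A/V^1.5; multiply by 1e6 for microperveance

def goal(current_a, radius_mm, voltage_v,
         target_i=2.0, target_r=1.5, target_k=1.0e-6, w=(1.0, 1.0, 1.0)):
    """Weighted squared relative errors against the design targets."""
    k = perveance(current_a, voltage_v)
    return (w[0] * ((current_a - target_i) / target_i) ** 2
            + w[1] * ((radius_mm - target_r) / target_r) ** 2
            + w[2] * ((k - target_k) / target_k) ** 2)

print(goal(current_a=2.1, radius_mm=1.4, voltage_v=20e3))
```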

  20. On the Computation of Secondary Electron Emission Models

    OpenAIRE

    Clerc, Sebastien; Dennison, JR; Hoffmann, Ryan; Abbott, Jonathon

    2006-01-01

    Secondary electron emission is a critical contributor to the charged-particle current balance in spacecraft charging. Spacecraft charging simulation codes use a parameterized expression for the secondary electron (SE) yield delta(Eo) as a function of the incident electron energy Eo. Simple three-step physics models of electron penetration, transport, and emission from a solid are typically expressed in terms of the incident electron penetration depth at normal incidence R(Eo) and the mean ...
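
    As an example of what such a parameterized yield expression looks like, the sketch below evaluates a Vaughan-style empirical curve, δ(E0) = δmax·(v·e^(1−v))^k with v = E0/Emax; this is a different (and simpler) parameterization than the record's three-step model, a single smoothing exponent is used on both sides of the maximum, and the material constants are invented.

```python
# Hedged sketch: Vaughan-style secondary-electron yield curve (illustrative constants).
import math

def se_yield(e0_ev, delta_max=2.2, e_max_ev=300.0, k=0.56):
    """delta = delta_max * (v * exp(1 - v))**k, with v = E0 / Emax."""
    v = e0_ev / e_max_ev
    return delta_max * (v * math.exp(1.0 - v)) ** k

for e0 in (50, 300, 1000, 5000):
    print(e0, "eV ->", round(se_yield(e0), 3))
```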

  1. Computer optimization techniques for NASA Langley's CSI evolutionary model's real-time control system

    Science.gov (United States)

    Elliott, Kenny B.; Ugoletti, Roberto; Sulla, Jeff

    1992-01-01

    The evolution and optimization of a real-time digital control system is presented. The control system is part of a testbed used to perform focused technology research on the interactions of spacecraft platform and instrument controllers with the flexible-body dynamics of the platform and platform appendages. The control system consists of Computer Automated Measurement and Control (CAMAC) standard data acquisition equipment interfaced to a workstation computer. The goal of this work is to optimize the control system's performance to support controls research using controllers with up to 50 states and frame rates above 200 Hz. The original system could support a 16-state controller operating at a rate of 150 Hz. By using simple yet effective software improvements, Input/Output (I/O) latencies and contention problems are reduced or eliminated in the control system. The final configuration can support a 16-state controller operating at 475 Hz. Effectively the control system's performance was increased by a factor of 3.

  2. Evolutionary and Disruptive Approaches for Designing Next-Generation Ultra Energy-Efficient Electronics

    Science.gov (United States)

    Dadgour, Hamed F.

    analyzed. It is shown that such devices can be employed to implement highly energy-efficient and ultra compact XOR gates, which are the key building blocks for more complex computational units. The lateral NEMS device also creates new opportunities in Boolean logic minimization and seems promising for implementing high-performance arithmetic modules (such as Adders). A comprehensive scaling analysis of the NEMS devices is also conducted to identify the key challenges that must be overcome before such transistors can be incorporated in the mainstream IC technologies.

  3. Electron wave collimation by conical horns : computer simulation

    NARCIS (Netherlands)

    Michielsen, K.; de Raedt, H.

    1991-01-01

    Results are presented of extensive numerical simulations of electron wave packets transmitted by horns. A detailed quantitative analysis is given of the collimation of the electron wave by horn-like devices. It is demonstrated that the electron wave collimation effect cannot be described in terms of

  4. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended-commerce or ubiquitous-commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use the NFC capabilities of their smartphones in order to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans which minimize the overall shopping cost by looking for the best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions to the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer depends on his or her previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimental results showing the overall performance of the genetic algorithm.
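
    A bare-bones genetic algorithm for a shopping-plan-like assignment problem fits in a few lines: each chromosome assigns one shop to each item and fitness is the total price. The price table, operators and parameters below are illustrative; the paper's fitness additionally models coupons and purchase-history-dependent discounts.

```python
# Hedged sketch: GA picking one shop per item to minimize total price.
import random

random.seed(7)
PRICES = [[random.uniform(1, 10) for _ in range(4)] for _ in range(6)]  # 6 items x 4 shops

def cost(plan):                          # plan[i] = shop chosen for item i
    return sum(PRICES[i][s] for i, s in enumerate(plan))

def mutate(plan):
    i = random.randrange(len(plan))
    return plan[:i] + [random.randrange(4)] + plan[i + 1:]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randrange(4) for _ in range(6)] for _ in range(30)]
for _generation in range(100):
    population.sort(key=cost)
    elite = population[:10]              # truncation selection
    population = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                          for _ in range(20)]
best = min(population, key=cost)
print(best, round(cost(best), 2))
```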

  5. Hybrid pattern recognition method using evolutionary computing techniques applied to the exploitation of hyperspectral imagery and medical spectral data

    Science.gov (United States)

    Burman, Jerry A.

    1999-12-01

    Hyperspectral image sets are three-dimensional data volumes that are difficult to exploit by manual means because they are comprised of multiple bands of image data that are not easily visualized or assessed. GTE Government Systems Corporation has developed a system that utilizes Evolutionary Computing techniques to automatically identify materials in terrain hyperspectral imagery. The system employs sophisticated signature preprocessing and a unique combination of non-parametric search algorithms guided by a model-based cost function to achieve rapid convergence and pattern recognition. The system is scalable and is capable of discriminating and identifying pertinent materials that comprise a specific object of interest in the terrain and estimating the percentage of materials present within a pixel of interest (spectral unmixing). The method has been applied and evaluated against real hyperspectral imagery data from the AVIRIS sensor. In addition, the process has been applied to remotely sensed infrared spectra collected at the microscopic level to assess the amounts of DNA, RNA and protein present in human tissue samples as an aid to the early detection of cancer.
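
    The spectral-unmixing step mentioned above is commonly posed as constrained linear regression: a pixel spectrum is modeled as a non-negative mixture of known endmember signatures. The sketch below solves that sub-problem with non-negative least squares on random stand-in spectra; it illustrates the unmixing idea only, not GTE's evolutionary search.

```python
# Hedged sketch: per-pixel abundance estimation by non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
endmembers = rng.random((50, 3))               # 50 bands x 3 known material signatures
true_fractions = np.array([0.6, 0.3, 0.1])
pixel = endmembers @ true_fractions + rng.normal(0, 0.005, 50)  # noisy mixed pixel

abundances, _residual = nnls(endmembers, pixel)
abundances /= abundances.sum()                 # impose sum-to-one after the fact
print(abundances.round(3))                     # close to the true fractions
```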

  6. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    Science.gov (United States)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

    Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as the Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper, a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantage of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population-based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image and video analysis. The proposed tool, denoted the PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm to calculating the sidereal rotational angular velocity of the solar corona. To validate the results, we compare them with published manual results produced by an expert.
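
    For readers unfamiliar with PSO, the optimizer underlying the PSO-Snake model, a bare-bones version is shown below on a toy objective; the inertia and acceleration constants are common textbook values, not the paper's settings, and the real application optimizes snake control points rather than a 2D sphere function.

```python
# Hedged sketch: minimal particle swarm optimization on a toy objective.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # sphere function, minimum at the origin
    return np.sum(x**2, axis=-1)

n, dim, w, c1, c2 = 20, 2, 0.7, 1.5, 1.5
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()                      # personal bests
gbest = pos[np.argmin(objective(pos))].copy()   # global best

for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    improved = objective(pos) < objective(pbest)
    pbest[improved] = pos[improved]
    gbest = pbest[np.argmin(objective(pbest))].copy()

print("best found:", gbest.round(4))    # close to (0, 0)
```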

  7. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... COMMISSION. Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in... issuing for public comment draft regulatory guide (DG) DG-1209, "Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants." The DG...

  8. Community Needs Assessment for an Electronics and Computer Engineering Technology Program at Maui, Molokai, and Lanai.

    Science.gov (United States)

    Pezzoli, Jean A.

    In June 1992, Maui Community College (MCC), in Hawaii, conducted a survey of the communities of Maui, Molokai, Lanai, and Hana to determine perceived needs for an associate degree and certificate program in electronics and computer engineering. Questionnaires were mailed to 500 firms utilizing electronic or computer services, seeking information…

  9. Application of advanced electronics to a future spacecraft computer design

    Science.gov (United States)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  10. Evolutionary humanoid robotics

    CERN Document Server

    Eaton, Malachy

    2015-01-01

    This book examines how two distinct strands of research on autonomous robots, evolutionary robotics and humanoid robot research, are converging. The book will be valuable for researchers and postgraduate students working in the areas of evolutionary robotics and bio-inspired computing.

  11. Computer Aided Design Tools for Extreme Environment Electronics Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  12. Computational Nanotechnology of Molecular Materials, Electronics and Machines

    Science.gov (United States)

    Srivastava, D.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    This viewgraph presentation covers carbon nanotubes, their characteristics, and their potential future applications. The presentation includes predictions on the development of nanostructures and their applications, the thermal characteristics of carbon nanotubes, mechano-chemical effects upon carbon nanotubes, molecular electronics, and models for possible future nanostructure devices. The presentation also proposes a neural model for signal processing.

  13. 76 FR 22918 - In the Matter of Certain Handheld Electronic Computing Devices, Related Software, and Components...

    Science.gov (United States)

    2011-04-25

    ... From the Federal Register Online via the Government Publishing Office. INTERNATIONAL TRADE COMMISSION. In the Matter of Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof; Notice of Investigation. AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY...

  14. International Conference on Emerging Research in Electronics, Computer Science and Technology

    CERN Document Server

    Sheshadri, Holalu; Padma, M

    2014-01-01

    PES College of Engineering is organizing the International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT-12) in Mandya, merging the event with the Golden Jubilee of the institute. The proceedings of the conference present high-quality, peer-reviewed articles from the fields of electronics, computer science and technology. The book is a compilation of research papers on cutting-edge technologies, targeted towards the scientific community actively involved in research.

  15. Statistical analysis and definition of blockages-prediction formulae for the wastewater network of Oslo by evolutionary computing.

    Science.gov (United States)

    Ugarelli, Rita; Kristensen, Stig Morten; Røstum, Jon; Saegrov, Sveinung; Di Federico, Vittorio

    2009-01-01

    Oslo Vann og Avløpsetaten (Oslo VAV), the water/wastewater utility of the Norwegian capital city of Oslo, is assessing future strategies for the selection of the most reliable materials for wastewater networks, taking into account not only the technical performance of the materials but also their performance under the operational conditions of the system. The research project undertaken by the SINTEF Group, the largest research organisation in Scandinavia, NTNU (Norges Teknisk-Naturvitenskapelige Universitet) and Oslo VAV adopts several approaches to understanding the reasons for failures that may impact flow capacity, by analysing historical data for blockages in Oslo. The aim of the study was to understand whether there is a relationship between the performance of a pipeline and a number of specific attributes, such as age, material and diameter, to name a few. This paper presents the characteristics of the available data set and discusses the results obtained with two different approaches: a traditional statistical analysis, segregating the pipes into classes sharing the same explanatory variables, and an Evolutionary Polynomial Regression (EPR) model, developed by the Technical University of Bari and the University of Exeter, to identify the possible influence of pipe attributes on the total number of predicted blockages over a period of time. Starting from a detailed analysis of the available data on blockage events, the most important variables are identified and a classification scheme is adopted. From the statistical analysis, it can be stated that age, size and function do seem to have a marked influence on the proneness of a pipeline to blockages, but, for the reduced sample available, it is difficult to say which variable is the most influential. Looking at the total number of blockages, the oldest class seems to be the most prone to blockages; looking at blockage rates (number of blockages per km per year), however, it is the youngest class that shows the highest blockage rate
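
    The EPR idea, searching discrete exponents for each attribute while fitting the linear coefficients by least squares, can be sketched compactly. Below, exhaustive search over a small exponent set replaces the genetic search of real EPR, and the blockage data are synthetic, so everything except the monomial-plus-least-squares structure is an assumption.

```python
# Hedged sketch of EPR-style regression: pick exponents, fit coefficients by lstsq.
import itertools
import numpy as np

rng = np.random.default_rng(5)
age = rng.uniform(5, 80, 200)                    # pipe age (years), synthetic
diam = rng.uniform(100, 600, 200)                # pipe diameter (mm), synthetic
blockages = 0.02 * age**1.5 / diam**0.5 + rng.normal(0, 0.05, 200)

EXPONENTS = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
best = (np.inf, None, None)
for e_age, e_diam in itertools.product(EXPONENTS, repeat=2):
    term = age**e_age * diam**e_diam             # candidate monomial term
    A = np.column_stack([term, np.ones_like(term)])   # coefficient + intercept
    coef, *_ = np.linalg.lstsq(A, blockages, rcond=None)
    sse = float(np.sum((A @ coef - blockages) ** 2))
    if sse < best[0]:
        best = (sse, (e_age, e_diam), coef)
print("exponents:", best[1], "coefficients:", best[2].round(4))
```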

  16. Computational estimation of the gain image of Direct Electron Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez Gimenez, E.; Peredo Robinson, V.; Sorzano, C.O.S.; Vargas, J.; Oton, J.; Vilas, J.L.; Rosa-Trevin, J.L. de la; Melero, R.; Gomez-Blanco, J.; Cuenca, J.; Cano, L. del; Conesa, P.; Marabini, R.; Carazo, J.M.

    2016-07-01

    The introduction of Direct Electron Detectors (DEDs) in the electron microscopy field has boosted Single Particle Analysis to the point where it is currently considered a key technique in Structural Biology. In this article we address the issues of increasing the quality of current DED images as well as their ease of use. To this end, we introduce an algorithm to estimate the camera gain at each pixel from the movies themselves, so that the recorded movies can be compensated for differences among the detection capabilities of the camera sensors. This compensation is needed to set the recorded frames in a coherent gray-level range, homogeneous over the whole image. The algorithm does not need any input other than the DED movie itself, and it is able to estimate the camera gain image and to identify dead pixels and incorrectly calibrated cameras. We show the results for the three current DED camera models (DE, Falcon and K2). (Author)
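
    In its crudest form, a per-pixel gain image can be estimated by averaging many frames and normalizing by the global mean, which is enough to show the flat-fielding idea and dead-pixel flagging; the simulated movie below and the 0.1 dead-pixel threshold are assumptions, and the paper's algorithm is considerably more careful.

```python
# Hedged sketch: naive per-pixel gain estimation from a (simulated) DED movie.
import numpy as np

rng = np.random.default_rng(1)
true_gain = rng.normal(1.0, 0.05, (64, 64))
true_gain[10, 20] = 0.0                               # plant one dead pixel
frames = rng.poisson(20, (500, 64, 64)) * true_gain   # simulated movie frames

pixel_mean = frames.mean(axis=0)
gain = pixel_mean / pixel_mean.mean()                 # relative gain image
dead = gain < 0.1                                     # crude dead-pixel mask
corrected = frames[0] / np.where(dead, 1.0, gain)     # gain-compensated frame
corrected[dead] = 0.0                                 # zero-out unusable pixels
print("dead pixels found:", int(dead.sum()))
```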

  17. Computer Mediated Communication and the Emergence of "Electronic Opportunism"

    OpenAIRE

    Rocco, Elena; Warglien, Massimo

    1996-01-01

    An experiment on how communication affects cooperation in a social dilemma shows that computer mediated communication (CMC) and face to face communication have markedly different effects on patterns of collective behavior. While face to face communication sustains stable cooperation, CMC makes cooperative agreements in groups extremely fragile, giving rise to waves of opportunistic behavior. Further analysis of communication protocols highlights that the breakdown of ordinary communication ru...

  18. Information Technology in project-organized electronic and computer technology engineering education

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1999-01-01

    This paper describes the integration of IT in the education of electronic and computer technology engineers at the Institute of Electronic Systems, Aalborg University, Denmark. At the Institute, Information Technology is an important tool in all aspects of the education as well as for communication...

  19. Effectiveness of an Electronic Performance Support System on Computer Ethics and Ethical Decision-Making Education

    Science.gov (United States)

    Kert, Serhat Bahadir; Uz, Cigdem; Gecu, Zeynep

    2014-01-01

    This study examined the effectiveness of an electronic performance support system (EPSS) on computer ethics education and the ethical decision-making processes. There were five different phases to this ten month study: (1) Writing computer ethics scenarios, (2) Designing a decision-making framework (3) Developing EPSS software (4) Using EPSS in a…

  20. Computing the Effects of Strain on Electronic States: A Survey of Methods and Issues

    Science.gov (United States)

    2012-12-01

    A cursory survey of existing computational methods and known challenges in estimating accurate electronic band structures of materials suited for... electronic device concepts. Both ab initio and empirical approaches are described, but greater emphasis is on atomistic and continuum-based empirical

  1. Computed bound and continuum electronic states of the nitrogen molecule

    Directory of Open Access Journals (Sweden)

    Tennyson Jonathan

    2015-01-01

    Full Text Available The dissociative recombination (DR) of N2+ is important for processes occurring in our atmosphere. However, it is not particularly well characterised, experimentally for the vibrational ground state and theoretically for v ≥ 4. We use the R-matrix method to compute potential energy curves both for the bound Rydberg states of nitrogen and for quasi-bound states lying in the continuum. Use of a fine mesh of internuclear separations allows the details of avoided crossings to be determined. The prospects for using the curves as input for DR calculations are discussed.

  2. Curious matrix effects: a computational, electron diffraction, and vibrational spectroscopic study of dysprosium triiodide.

    Science.gov (United States)

    Varga, Zoltán; Groen, Cornelis Petrus; Kolonits, Mária; Hargittai, Magdolna

    2010-07-21

    The molecular and electronic structure of dysprosium triiodide, DyI(3), and its dimer, Dy(2)I(6), was determined by high-level computations, gas-phase electron diffraction, and gas-phase infrared and matrix-isolation infrared and Raman spectroscopy. The free monomeric molecule is planar according to all methods, with an equilibrium bond length of 2.808(9) Å; the thermal average bond length from electron diffraction is 2.828(6) Å. The molecule forms complexes in the matrix-isolation experiments, causing pyramidalisation of the planar monomeric molecules. The likelihood of having both pyramidal and planar DyI(3) molecules in the matrix is discussed in order to explain certain features of the spectra. Our computations suggest that the dimer geometry depends on the occupation of the partially filled 4f orbitals. As this is the third molecule in the dysprosium trihalide series to be studied, trends in their electronic and molecular structures are presented and discussed.

  3. Evaluating Electronic Customer Relationship Management Performance: Case Studies from Persian Automotive and Computer Industry

    OpenAIRE

    Safari, Narges; Safari, Fariba; Olesen, Karin; Shahmehr, Fatemeh

    2016-01-01

    This research paper investigates the influence of industry on electronic customer relationship management (e-CRM) performance. A case study approach with two cases was applied to evaluate the influence of e-CRM on customer behavioral and attitudinal loyalty, along with the customer pyramid. The cases covered two industries: the computer and automotive industries. For investigating customer behavioral loyalty and the customer pyramid, company databases were analyzed, while for examining custome...

  4. Electron recombination in ionized liquid argon: a computational approach based on realistic models of electron transport and reactions.

    Science.gov (United States)

    Jaskolski, Michal; Wojcik, Mariusz

    2011-05-05

    In this work, we propose a new theoretical approach to modeling the electron-ion recombination processes in ionization tracks in liquid argon at 87 K. We developed a computer simulation method using realistic models of charge transport and electron-ion reactions. By introducing the concept of one-dimensional periodicity in the track, we are able to model very large cylindrical structures of charged particles. We apply our simulation method to calculate the electron escape probability as a function of the initial ionization density in the track. The results are in quantitative agreement with experiment for radiation tracks of relatively high ionization density. At low ionization densities, the simulation results slightly overestimate the experimental data. We discuss possible reasons for this disagreement and conclude that it can be explained by the role of δ tracks (short tracks of secondary electrons) in electron-ion recombination processes. We introduce an approximate model that takes into account the presence of δ tracks and allows the experimental data obtained from a liquid-argon ionization detector to be reproduced over a wide range of ionization density.

  5. Quantum computers based on electron spins controlled by ultrafast off-resonant single optical pulses.

    Science.gov (United States)

    Clark, Susan M; Fu, Kai-Mei C; Ladd, Thaddeus D; Yamamoto, Yoshihisa

    2007-07-27

    We describe a fast quantum computer based on optically controlled electron spins in charged quantum dots that are coupled to microcavities. This scheme uses broadband optical pulses to rotate electron spins and provide the clock signal to the system. Nonlocal two-qubit gates are performed by phase shifts induced by electron spins on laser pulses propagating along a shared waveguide. Numerical simulations of this scheme demonstrate high-fidelity single-qubit and two-qubit gates with operation times comparable to the inverse Zeeman frequency.

  6. Computation and analysis of the electron transport properties for nitrogen and air inductively-coupled plasmas

    Science.gov (United States)

    Yu, Minghao; Kihara, Hisashi; Abe, Ken-ichi; Takahashi, Yusuke

    2015-06-01

    A relatively simple method for accurately calculating the third-order electron transport properties of nitrogen and air thermal plasmas is presented. The electron transport properties, such as the electrical conductivity and the electron thermal conductivity, were computed with the best and latest available collision cross-section data in the temperature and pressure ranges of T = 300 - 15000 K and p = 0.01 - 1.0 atm, respectively. The results obtained under the atmospheric-pressure condition showed good agreement with the experimental and high-accuracy theoretical results. The presently introduced method has good application potential in numerical simulations of nitrogen and air inductively-coupled plasmas.

  7. REEFER: a digital computer program for the simulation of high energy electron tubes

    Energy Technology Data Exchange (ETDEWEB)

    Boers, J.E.

    1976-11-01

    A digital computer program for the simulation of very high-energy electron and ion beams is described. The program includes space-charge effects through the solution of Poisson's equation and magnetic effects (both induced and applied) through the relativistic trajectory equations. Relaxation techniques are employed while alternately computing electric fields and trajectories. Execution time is generally less than 15 minutes on a CDC 6600 digital computer. Either space-charge-limited or field-emission sources may be simulated. The input data is described in detail and an example data set is included.

  8. A computationally assisted spectroscopic technique to measure secondary electron emission coefficients in radio frequency plasmas

    CERN Document Server

    Daksha, M; Schuengel, E; Korolov, I; Derzsi, A; Koepke, M; Donko, Z; Schulze, J

    2016-01-01

    A Computationally Assisted Spectroscopic Technique to measure secondary electron emission coefficients (γ-CAST) in capacitively-coupled radio-frequency plasmas is proposed. This non-intrusive, sensitive diagnostic is based on a combination of Phase Resolved Optical Emission Spectroscopy and particle-based kinetic simulations. In such plasmas (under most conditions in electropositive gases) the spatio-temporally resolved electron-impact excitation/ionization rate features two distinct maxima adjacent to each electrode at different times within each RF period. While one maximum is the consequence of the energy gain of electrons due to sheath expansion, the second maximum is produced by secondary electrons accelerated towards the plasma bulk by the sheath electric field at the time of maximum voltage drop across the adjacent sheath. Due to these different excitation/ionization mechanisms, the ratio of the intensities of these maxima is very sensitive to the secondary electron emission coefficient γ...

  9. The Pharmaco-, Population- and Evolutionary Dynamics of Multi-drug Therapy: Experiments with S. aureus and E. coli and Computer Simulations

    Science.gov (United States)

    Ankomah, Peter; Johnson, Paul J. T.; Levin, Bruce R.

    2013-01-01

    There are both pharmacodynamic and evolutionary reasons to use multiple rather than single antibiotics to treat bacterial infections; in combination, antibiotics can be more effective in killing target bacteria as well as in preventing the emergence of resistance. Nevertheless, with few exceptions like tuberculosis, combination therapy is rarely used for bacterial infections. One reason for this is a relative dearth of the pharmaco-, population- and evolutionary dynamic information needed for the rational design of multi-drug treatment protocols. Here, we use in vitro pharmacodynamic experiments, mathematical models and computer simulations to explore the relative efficacies of different two-drug regimens in clearing bacterial infections and the conditions under which multi-drug therapy will prevent the ascent of resistance. We estimate the parameters and explore the fit of Hill functions to compare the pharmacodynamics of antibiotics of four different classes, individually and in pairs, during cidal experiments with pathogenic strains of Staphylococcus aureus and Escherichia coli. We also consider the relative efficacy of these antibiotics and antibiotic pairs in reducing the level of phenotypically resistant but genetically susceptible, persister, subpopulations. Our results provide compelling support for the proposition that the nature and form of the interactions between drugs of different classes (synergy, antagonism, suppression and additivity) have to be determined empirically and cannot be inferred from what is known about the pharmacodynamics or mode of action of these drugs individually. Monte Carlo simulations of within-host treatment incorporating these pharmacodynamic results and clinically relevant refuge subpopulations of bacteria indicate that: (i) the form of drug-drug interactions can profoundly affect the rate at which infections are cleared, and (ii) two-drug therapy can prevent treatment failure even when bacteria resistant to single drugs are present
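
    A Hill-function pharmacodynamic fit of the kind estimated in this record can be sketched with a standard curve fit: the net growth rate falls from psi_max toward psi_min as the drug concentration passes a half-effect point. The functional form chosen here and the synthetic kill-curve data are illustrative assumptions, not the study's measurements.

```python
# Hedged sketch: fitting a Hill-type net-growth pharmacodynamic function.
import numpy as np
from scipy.optimize import curve_fit

def net_growth(conc, psi_max, psi_min, ec50, kappa):
    """Net bacterial growth rate vs drug concentration (Hill-shaped decline)."""
    hill = conc**kappa / (ec50**kappa + conc**kappa)
    return psi_max - (psi_max - psi_min) * hill

conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
rate = np.array([0.9, 0.8, 0.5, -0.5, -2.5, -3.8, -4.1])   # made-up kill-curve data

params, _ = curve_fit(net_growth, conc, rate, p0=[1.0, -4.5, 1.0, 2.0],
                      bounds=([0.0, -10.0, 0.01, 0.1], [5.0, 0.0, 10.0, 10.0]))
print(dict(zip(["psi_max", "psi_min", "ec50", "kappa"], params.round(2))))
```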

  10. Solving Human Performance Problems with Computers. A Case Study: Building an Electronic Performance Support System.

    Science.gov (United States)

    Raybould, Barry

    1990-01-01

    Describes the design of an electronic performance support system (PSS) that was developed to help sales and support personnel access relevant information needed for good job performance. Highlights include expert systems, databases, interactive video discs, formatting information online, information retrieval techniques, HyperCard, computer-based…

  11. COMPUTATIONAL ELECTROCHEMISTRY: AQUEOUS ONE-ELECTRON OXIDATION POTENTIALS FOR SUBSTITUTED ANILINES

    Science.gov (United States)

    Semiempirical molecular orbital theory and density functional theory are used to compute one-electron oxidation potentials for aniline and a set of 21 mono- and di-substituted anilines in aqueous solution. Linear relationships between theoretical predictions and experiment are co...

  12. Psychiatrists’ Comfort Using Computers and Other Electronic Devices in Clinical Practice

    Science.gov (United States)

    Fochtmann, Laura J.; Clarke, Diana E.; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K.; Plovnick, Robert M.

    2015-01-01

    This report highlights findings from the Study of Psychiatrists’ Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists’ comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted. PMID:26667248

  13. An approach to first principles electronic structure calculation by symbolic-numeric computation

    Directory of Open Access Journals (Sweden)

    Akihito Kikuchi

    2013-04-01

    Full Text Available There is a wide variety of electronic structure calculation cooperating with symbolic computation, where the main purpose of the latter is to play an auxiliary (but not unimportant) role for the former. In the field of quantum physics [1-9], researchers sometimes have to handle complicated mathematical expressions whose derivation seems almost beyond human power, and thus resort to the intensive use of computers, namely symbolic computation [10-16]. Examples can be seen in various topics: atomic energy levels, molecular dynamics, molecular energy and spectra, collision and scattering, lattice spin models and so on [16]. How to obtain molecular integrals analytically, or how to manipulate complex formulas in many-body interactions, is one such problem. In the former case, when one uses a special atomic basis for a specific purpose, expressing the integrals as a combination of already known analytic functions may sometimes be very difficult. In the latter, one must rearrange a number of creation and annihilation operators into a suitable order and calculate the analytical expectation value. It is usual that a massive quantitative computation follows a symbolic one; for the convenience of the numerical computation, it is necessary to reduce a complicated analytic expression into a tractable and computable form. This is the main motive for introducing symbolic computation as a forerunner of the numerical one, and their collaboration has achieved considerable success. The present work should be classified as one such trial. Meanwhile, the use of symbolic computation in the present work is not limited to an indirect and auxiliary role for the numerical computation: the present approach is applicable to a direct and quantitative estimation of the electronic structure, skipping conventional computational methods.
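
    A small instance of the symbolic-numeric workflow described above: derive a closed form symbolically, then compile it for fast numerical evaluation. The Gaussian overlap integral is a generic textbook example standing in for the paper's molecular integrals.

```python
# Hedged sketch: symbolic derivation with SymPy, then numeric evaluation.
import numpy as np
import sympy as sp

x, a, b = sp.symbols("x a b", positive=True)
integral = sp.integrate(sp.exp(-a * x**2) * sp.exp(-b * x**2), (x, -sp.oo, sp.oo))
print(integral)                                   # sqrt(pi)/sqrt(a + b)

overlap = sp.lambdify((a, b), integral, "numpy")  # compile the closed form
print(overlap(np.array([0.5, 1.0]), 2.0))         # vectorized numeric use
```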

  14. Evolutionary Nephrology.

    Science.gov (United States)

    Chevalier, Robert L

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as "maladaptive." In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ~40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (that provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  15. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing a multi-objective evolutionary algorithm in parallel, together with a Monte Carlo simulation code. The evolutionary algorithm is used to search the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, for instance Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
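
    The multi-objective step can be illustrated in isolation: each candidate geometry receives one error per reference sample geometry, and only the non-dominated (Pareto-optimal) candidates are kept. Random sampling below replaces the evolutionary algorithm, and fepe_error() is a stub for the Monte Carlo efficiency comparison, so the sketch shows the selection logic only.

```python
# Hedged sketch: Pareto filtering of candidate detector models on two objectives.
import numpy as np

rng = np.random.default_rng(9)

def fepe_error(params, geometry):
    """Stub for the simulated-vs-reference FEPE discrepancy of one geometry."""
    target = {"small_beaker": 0.3, "large_beaker": 0.7}[geometry]
    return float(np.sum((params - target) ** 2))

candidates = rng.random((200, 3))               # 200 random 3-parameter models
errors = np.array([[fepe_error(c, "small_beaker"),
                    fepe_error(c, "large_beaker")] for c in candidates])

pareto = [i for i in range(len(errors))
          if not any(np.all(errors[j] <= errors[i]) and np.any(errors[j] < errors[i])
                     for j in range(len(errors)))]
print(f"{len(pareto)} non-dominated candidates out of {len(candidates)}")
```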

  16. Complex of codes for computer-aided design of electron guns with grid control

    Science.gov (United States)

    Petrosyan, A. I.; Juravleva, V. D.; Penzyakov, V. V.; Rogovin, V. I.

    2003-03-01

    The computer-aided design of gridded electron guns is carried out on the basis of 2D, 2.5D and 3D mathematical models. The parameters of the cathode-grid assembly (CGA) are calculated, and the synthesis and trajectory analysis of the ungridded gun, in which a control grid is to be installed, are performed. The trajectory analysis of the beam in the CGA cells is carried out, and the coordinates, velocities and charges of large particles at the exit of the CGA cells are used to define the initial data for solving the beam motion equations in the gridded electron gun and, further, in the magnetic focusing system. Electron-optical systems with gridded electron guns designed with the help of this complex of codes do not require experimental correction and provide good current transmission in microwave tubes.

  17. Reconciliation of the cloud computing model with US federal electronic health record regulations.

    Science.gov (United States)

    Schweitzer, Eugene J

    2012-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.

  18. Reconciliation of the cloud computing model with US federal electronic health record regulations

    Science.gov (United States)

    2011-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204

  19. Evaluating 99mTc Auger electrons for targeted tumor radiotherapy by computational methods.

    Science.gov (United States)

    Tavares, Adriana Alexandre S; Tavares, João Manuel R S

    2010-07-01

    Technetium-99m (99mTc) has been widely used as an imaging agent, but only recently has it been considered for therapeutic applications. This study aims to analyze the potential use of 99mTc Auger electrons for targeted tumor radiotherapy by evaluating the DNA damage and its probability of correct repair, and by studying the cellular kinetics following 99mTc Auger electron irradiation, in comparison to iodine-131 (131I) beta-minus particle and astatine-211 (211At) alpha particle irradiation. Computational models were used to estimate the yield of DNA damage (fast Monte Carlo damage algorithm), the probability of correct repair (Monte Carlo excision repair algorithm), and cell kinetic effects (virtual cell radiobiology algorithm) after irradiation with the selected particles. The results obtained with these algorithms suggest that 99mTc CKMMX (all M-shell Coster-Kronig, CK, and super-CK transitions) electrons and Auger MXY (all M-shell Auger transitions) electrons have a therapeutic potential comparable to high linear energy transfer 211At alpha particles and higher than 131I beta-minus particles. All the other 99mTc electrons had a therapeutic potential similar to 131I beta-minus particles. 99mTc CKMMX and Auger MXY electrons presented a higher probability of inducing apoptosis than 131I beta-minus particles and a probability similar to 211At alpha particles. Based on these results, 99mTc CKMMX and Auger MXY electrons are useful electrons for targeted tumor radiotherapy.

  20. From Order to Disorder: The Role of Computer-Based Electronics Projects on Fostering of Higher-Order Cognitive Skills

    Science.gov (United States)

    Barak, Moshe

    2005-01-01

    This research explored learning and thinking processes enhanced by integrating computers into secondary school electronics projects. Electronics studies provide a sophisticated learning environment, where computers are simultaneously part of the subject matter learned (Technology Education), and a means for enhancing teaching and learning…

  1. Optimal Mixing Evolutionary Algorithms

    NARCIS (Netherlands)

    D. Thierens (Dirk); P.A.N. Bosman (Peter); N. Krasnogor

    2011-01-01

    A key search mechanism in Evolutionary Algorithms is the mixing or juxtaposing of partial solutions present in the parent solutions. In this paper we look at the efficiency of mixing in genetic algorithms (GAs) and estimation-of-distribution algorithms (EDAs). We compute the mixing

  2. Handheld vs. laptop computers for electronic data collection in clinical research: a crossover randomized trial.

    Science.gov (United States)

    Haller, Guy; Haller, Dagmar M; Courvoisier, Delphine S; Lovis, Christian

    2009-01-01

    To compare users' speed, number of entry errors and satisfaction in using two current devices for electronic data collection in clinical research: handheld and laptop computers. The authors performed a randomized cross-over trial using 160 different paper-based questionnaires, representing altogether 45,440 variables. Four data coders were instructed to record, according to a random, predefined and equally balanced sequence, the content of these questionnaires either on a laptop or on a handheld computer. Instructions on the kind of device to be used were provided to data coders in individual sealed and opaque envelopes. Study conditions were controlled and the data entry process was performed in a quiet environment. The authors compared the duration of the data recording process, the number of errors and users' satisfaction with the two devices. The authors divided errors into two separate categories, typing and missing-data errors. The original paper-based questionnaire was used as a gold standard. The overall duration of the recording process was significantly reduced (2.0 versus 3.3 min) when data were recorded on the laptop computer (p < …); typing errors were fewer with the laptop, compared to 8.4 per 1,000 with the handheld computer (p < …), and missing data were less frequent when the laptop was used (p < …). Users found the laptop easier, faster and more satisfying to use than the handheld computer. Despite the increasing use of handheld computers for electronic data collection in clinical research, these devices should be used with caution. They double the duration of the data entry process and significantly increase the risk of typing errors and missing data. This may become a particularly crucial issue in studies where these devices are provided to patients or healthcare workers, unfamiliar with computer technologies, for self-reporting or research data collection processes.

  3. Electron transport parameters in CO2: scanning drift tube measurements and kinetic computations

    CERN Document Server

    Vass, M; Loffhagen, D; Pinhao, N; Donko, Z

    2016-01-01

    This work presents transport coefficients of electrons (bulk drift velocity, longitudinal diffusion coefficient, and effective ionization frequency) in CO2 measured under time-of-flight conditions over a wide range of the reduced electric field, 15 Td ≤ E/N ≤ 2660 Td, in a scanning drift tube apparatus. The data obtained in the experiments are also applied to determine the effective steady-state Townsend ionization coefficient. These parameters are compared to the results of previous experimental studies, as well as to results of various kinetic computations: solutions of the electron Boltzmann equation under different approximations (multiterm and density gradient expansions) and Monte Carlo simulations. The experimental data extend the range of E/N compared with previous measurements and are consistent with most of the transport parameters obtained in these earlier studies. The computational results point out the range of applicability of the respective approaches to determine the different measured tr...

  4. Computational analyses of an evolutionary arms race between mammalian immunity mediated by immunoglobulin A and its subversion by bacterial pathogens.

    Directory of Open Access Journals (Sweden)

    Ana Pinheiro

    IgA is the predominant immunoglobulin isotype in mucosal tissues and external secretions, playing important roles both in defense against pathogens and in maintenance of commensal microbiota. Considering the complexity of its interactions with the surrounding environment, IgA is a likely target for diversifying or positive selection. To investigate this possibility, the action of natural selection on IgA was examined in depth with six different methods: CODEML from the PAML package and the SLAC, FEL, REL, MEME and FUBAR methods implemented in the Datamonkey webserver. In considering just primate IgA, these analyses show that diversifying selection targeted five positions of the Cα1 and Cα2 domains of IgA. Extending the analysis to include other mammals identified 18 positively selected sites: ten in Cα1, five in Cα2 and three in Cα3. All but one of these positions display variation in polarity and charge. Their structural locations suggest they indirectly influence the conformation of sites on IgA that are critical for interaction with host IgA receptors and also with proteins produced by mucosal pathogens that prevent their elimination by IgA-mediated effector mechanisms. Demonstrating the plasticity of IgA in the evolution of different groups of mammals, only two of the eighteen selected positions in all mammals are included in the five selected positions in primates. That IgA residues subject to positive selection impact sites targeted both by host receptors and subversive pathogen ligands highlights the evolutionary arms race playing out between mammals and pathogens, and further emphasizes the importance of IgA in protection against mucosal pathogens.

  5. Evolutionary Nephrology

    Directory of Open Access Journals (Sweden)

    Robert L. Chevalier

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as “maladaptive.” In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or from evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors a successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ∼40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (a life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (which provide resistance to trypanosome infection; a tradeoff), and modern life experience (a Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout the life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming, and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  6. Interaction of 3d transition metal atoms with charged ion projectiles from Electron Nuclear Dynamics computation

    Science.gov (United States)

    Hagelberg, Frank

    2003-03-01

    Computational results on atomic scattering between charged projectiles and transition metal target atoms are presented. This work aims at obtaining detailed information about charge, spin and energy transfer processes that occur between the interacting particles. An in-depth understanding of these phenomena is expected to provide a theoretical basis for the interpretation of various types of ion beam experiments, ranging from gas phase chromatography to spectroscopic observations of fast ions in ferromagnetic media. This contribution focuses on the scattering of light projectiles ranging from He to O, that are prepared in various initial charge states, by 3d transition metal atoms. The presented computations are performed in the framework of Electron Nuclear Dynamics (END)^1 theory which incorporates the coupling between electronic and nuclear degrees of freedom without reliance on the computationally cumbersome and frequently intractable determination of potential energy surfaces. In the present application of END theory to ion - transition metal atom scattering, a supermolecule approach is utilized in conjunction with a spin-unrestricted single determinantal wave function describing the electronic system. Integral scattering, charge and spin exchange cross sections are discussed as functions of the elementary parameters of the problem, such as projectile and target atomic numbers as well as projectile charge and initial kinetic energy. ^1 E.Deumens, A.Diz, R.Longo, Y.Oehrn, Rev.Mod.Phys. 66, 917 (1994)

  7. Maximal thickness of the normal human pericardium assessed by electron-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Delille, J.P.; Hernigou, A.; Sene, V.; Chatellier, G.; Boudeville, J.C.; Challande, P.; Plainfosse, M.C. [Service de Radiologie Centrale, Hopital Broussais, Paris (France)

    1999-08-01

    The purpose of this study was to determine the maximal value of normal pericardial thickness with an electron-beam computed tomography unit allowing fast scan times of 100 ms to reduce cardiac motion artifacts. Electron-beam computed tomography was performed in 260 patients with hypercholesterolemia and/or hypertension, as these pathologies have no effect on pericardial thickness. The pixel size was 0.5 mm. Measurements could be performed in front of the right ventricle, the right atrioventricular groove, the right atrium, the left ventricle, and the interventricular groove. Maximal thickness of normal pericardium was defined at the 95th percentile. Inter-observer and intra-observer reproducibility was assessed from additional CT scans by the Bland and Altman method [24]. The maximal thickness of the normal pericardium was 2 mm in 95% of cases. For the reproducibility studies, there was no significant difference between the inter-observer and intra-observer measurements, and all pericardial thickness measurements were ≤ 1.6 mm. Using electron-beam computed tomography, which substantially decreases cardiac motion artifacts, the threshold of detection of thickened pericardium is statistically established as being 2 mm for 95% of patients with hypercholesterolemia and/or hypertension. However, the spatial resolution available prevents a reproducible measure of the real thickness of thin pericardium. (orig.) With 6 figs., 1 tab., 31 refs.
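
    For readers unfamiliar with the two statistics used above, the following minimal sketch (synthetic numbers, not the study's measurements) computes a 95th-percentile upper limit of normal and Bland-Altman limits of agreement between two observers:

      # Sketch with synthetic data: percentile-based normal limit + Bland-Altman.
      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical pericardial thickness values in mm (assumed distribution).
      thickness = rng.normal(1.2, 0.4, 260).clip(min=0.5)
      upper_limit = np.percentile(thickness, 95)  # "maximal normal" threshold
      print(f"95th percentile: {upper_limit:.2f} mm")

      # Bland-Altman agreement: mean difference (bias) +/- 1.96 SD.
      obs1 = thickness + rng.normal(0, 0.1, thickness.size)
      obs2 = thickness + rng.normal(0, 0.1, thickness.size)
      diff = obs1 - obs2
      bias, sd = diff.mean(), diff.std(ddof=1)
      print(f"bias {bias:.3f} mm, limits of agreement "
            f"[{bias - 1.96 * sd:.3f}, {bias + 1.96 * sd:.3f}] mm")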

  8. Correlation between provider computer experience and accuracy of electronic anesthesia charting – A pilot study and performance improvement project

    Science.gov (United States)

    2017-03-20

  9. Paper- and computer-based workarounds to electronic health record use at three benchmark institutions

    Science.gov (United States)

    Flanagan, Mindy E; Saleem, Jason J; Militello, Laura G; Russ, Alissa L; Doebbeling, Bradley N

    2013-01-01

    Background Healthcare professionals develop workarounds rather than using electronic health record (EHR) systems. Understanding the reasons for workarounds is important to facilitate user-centered design and alignment between work context and available health information technology tools. Objective To examine both paper- and computer-based workarounds to the use of EHR systems in three benchmark institutions. Methods Qualitative data were collected in 11 primary care outpatient clinics across three healthcare institutions. Data collection methods included direct observation and opportunistic questions. In total, 120 clinic staff and providers and 118 patients were observed. All data were analyzed using previously developed workaround categories and examined for potential new categories. Additionally, workarounds were coded as either paper- or computer-based. Results Findings corresponded to 10 of 11 workaround categories identified in previous research. All 10 of these categories applied to paper-based workarounds; five categories also applied to computer-based workarounds. One new category, no correct path (eg, a desired option did not exist in the computer interface, precipitating a workaround), was identified for computer-based workarounds. The most consistent reasons for workarounds across the three institutions were efficiency, memory, and awareness. Conclusions Consistent workarounds across institutions suggest common challenges in outpatient clinical settings and failures to accommodate these challenges in EHR design. An examination of workarounds provides insight into how providers adapt to limiting EHR systems. Part of the design process for computer interfaces should include user-centered methods particular to providers and healthcare settings to ensure uptake and usability. PMID:23492593

  10. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  11. Computer-aided microtomography with true 3-D display in electron microscopy.

    Science.gov (United States)

    Nelson, A C

    1986-01-01

    A novel research system has been designed to permit three-dimensional (3-D) viewing of high resolution image data from transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The system consists of front-end primary data acquisition devices, such as TEM and SEM machines, which are equipped with computer-controlled specimen tilt stages. The output from these machines is in analogue form: a video camera attached to the TEM provides the sequential analogue image output, while the SEM direct video output is utilized. A 10 MHz digitizer transforms the video image to a digital array of 512 × 512 pixel units, 8 bits deep, stored in a frame buffer. Digital images from multiple projections are reconstructed into 3-D image boxes in a dedicated computer. Attached to the computer is a powerful true 3-D display device which has hardware for graphic manipulations, including tilt and rotate on any axis, and for probing the image with a 3-D cursor. Data editing and automatic contouring functions are used to enhance areas of interest, and specialized software is available for measurement of numbers, distances, areas, and volumes. With proper archiving of reconstructed image sequences, a dynamic 3-D presentation is possible. The microtomography system is highly versatile and can process image data on-line or from remote sites, from which data records would typically be transported on computer tape, video tape, or floppy disk.

  12. Computational method for the correction of proximity effect in electron-beam lithography (Poster Paper)

    Science.gov (United States)

    Chang, Chih-Yuan; Owen, Gerry; Pease, Roger Fabian W.; Kailath, Thomas

    1992-07-01

    Dose correction is commonly used to compensate for the proximity effect in electron lithography. The computation of the required dose modulation is usually carried out using 'self-consistent' algorithms that work by solving a large number of simultaneous linear equations. However, there are two major drawbacks: the resulting correction is not exact, and the computation time is excessively long. A computational scheme, as shown in Figure 1, has been devised to eliminate these problems by deconvolution of the point spread function in the pattern domain. The method is iterative, based on a steepest descent algorithm. The scheme has been successfully tested on a simple pattern with a minimum feature size of 0.5 micrometers, exposed on a MEBES tool at 10 keV in 0.2 micrometers of PMMA resist on a silicon substrate.
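
    The iterative idea can be sketched in one dimension as follows (a hedged toy example: a Gaussian point spread function and circular convolution are assumptions, not the paper's exact kernel or geometry); each steepest-descent step moves the dose toward matching the target pattern:

      # Sketch: steepest-descent deconvolution for dose correction,
      # minimizing 0.5 * ||K d - p||^2 with update d <- d - lr * K^T (K d - p).
      import numpy as np

      def gaussian_psf(x, sigma):
          k = np.exp(-0.5 * (x / sigma) ** 2)
          return k / k.sum()

      n = 200
      pattern = np.zeros(n); pattern[80:120] = 1.0          # desired exposure
      psf = gaussian_psf(np.arange(n) - n // 2, sigma=6.0)  # proximity blur

      def blur(d):  # K d as a circular convolution (for simplicity)
          return np.real(np.fft.ifft(np.fft.fft(d) *
                                     np.fft.fft(np.fft.ifftshift(psf))))

      dose, lr = pattern.copy(), 1.0
      for _ in range(500):                       # steepest-descent iterations
          residual = blur(dose) - pattern
          dose -= lr * blur(residual)            # K is symmetric, so K^T = K
          np.clip(dose, 0.0, None, out=dose)     # doses cannot be negative

      print("max residual:", np.abs(blur(dose) - pattern).max())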

  13. SYSTEM DYNAMICS MODEL FOR EVALUATION OF REUSE OF ELECTRONIC WASTE ORIGINATED FROM PERSONAL COMPUTERS

    Directory of Open Access Journals (Sweden)

    Eugênio Simonetto

    2016-11-01

    Information and Communication Technologies (ICT) are part of the day-to-day activities of a large part of the world population; however, their use involves a growing generation of electronic waste (e-waste). Owing to rapid technological innovation, products become obsolete in a short time and have their life cycle reduced. The article aims to present the development, verification, and validation of computational simulation models for assessment of the environmental and financial impacts caused by the extension of the life cycle of personal computers (PCs) through their remanufacturing. For the system modeling, System Dynamics theory was used. Results generated by the simulation model show that remanufacturing is a viable alternative for the reutilization of discarded computers, and that it is possible, in advance, to discuss, assess, and decide on the measures necessary for better financial and environmental performance in the acquisition and use of ICT.
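
    A minimal stock-and-flow sketch of the approach (illustrative rates, not the paper's calibrated model) shows how such a simulation can be set up; all parameter values below are assumptions:

      # Sketch: system-dynamics stocks (PCs in use, e-waste, reused units)
      # integrated with simple Euler steps; all rates are assumed examples.
      discard_rate   = 0.25   # fraction of installed PCs discarded per year
      remanuf_share  = 0.40   # fraction of discarded PCs remanufactured
      sales_per_year = 100.0  # thousand new units entering use each year

      in_use, ewaste, reused_total = 500.0, 0.0, 0.0
      for year in range(1, 11):                  # 10-year horizon, dt = 1 year
          discarded = discard_rate * in_use      # outflow from "in use" stock
          reused = remanuf_share * discarded     # flow back into use
          in_use += sales_per_year + reused - discarded
          reused_total += reused
          ewaste += discarded - reused           # what actually becomes waste
          print(f"year {year:2d}: in_use={in_use:7.1f}  "
                f"e-waste={ewaste:7.1f}  reused={reused_total:7.1f}")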

  14. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  15. Evolutionary and Neural Computing Based Decision Support System for Disease Diagnosis from Clinical Data Sets in Medical Practice.

    Science.gov (United States)

    Sudha, M

    2017-09-27

    As a recent trend, various computational intelligence and machine learning approaches have been used for mining inferences hidden in large clinical databases to assist the clinician in strategic decision making. In any target data, irrelevant information may be detrimental, causing confusion for the mining algorithm and degrading the prediction outcome. To address this issue, this study attempts to identify an intelligent approach to assist the disease diagnostic procedure using an optimal set of attributes instead of all attributes present in the clinical data set. In this proposed Application Specific Intelligent Computing (ASIC) decision support system, a rough set based genetic algorithm is employed in the pre-processing phase and a back propagation neural network is applied in the training and testing phase. ASIC has two phases: the first phase handles outliers, noisy data, and missing values to obtain qualitative target data and generates appropriate attribute reduct sets from the input data using a rough computing based genetic algorithm centred on a relative fitness function measure. The succeeding phase of this system involves both training and testing of a back propagation neural network classifier on the selected reducts. The model performance is evaluated against widely adopted existing classifiers. The proposed ASIC system for clinical decision support has been tested with breast cancer, fertility diagnosis, and heart disease data sets from the University of California at Irvine (UCI) machine learning repository. The proposed system outperformed the existing approaches, attaining accuracy rates of 95.33%, 97.61%, and 93.04% for breast cancer, fertility issue, and heart disease diagnosis, respectively.
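
    A minimal sketch of the two-phase idea follows (all of it hedged: synthetic data, a single-layer logistic model standing in for the back-propagation network, and a plain GA instead of the rough-set-based variant):

      # Sketch: GA selects an attribute subset; a simple gradient-trained
      # classifier scores each subset (stand-in for the backprop network).
      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 10))
      y = (X[:, 2] - X[:, 7] > 0).astype(float)   # only features 2 and 7 matter

      def fitness(mask):
          if not mask.any():
              return 0.0
          Xs, w = X[:, mask], np.zeros(mask.sum())
          for _ in range(300):                    # gradient-descent training
              p = 1 / (1 + np.exp(-Xs @ w))
              w -= 0.1 * Xs.T @ (p - y) / len(y)
          acc = (((1 / (1 + np.exp(-Xs @ w))) > 0.5) == y).mean()
          return acc - 0.01 * mask.sum()          # reward small attribute reducts

      pop = rng.random((20, 10)) < 0.5            # population of feature masks
      for gen in range(15):                       # GA: select, crossover, mutate
          scores = np.array([fitness(m) for m in pop])
          parents = pop[np.argsort(scores)[-10:]]
          cross = np.where(rng.random((20, 10)) < 0.5,
                           parents[rng.integers(10, size=20)],
                           parents[rng.integers(10, size=20)])
          pop = cross ^ (rng.random((20, 10)) < 0.02)  # bit-flip mutation

      best = pop[np.argmax([fitness(m) for m in pop])]
      print("selected attributes:", np.flatnonzero(best))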

  16. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  17. Computer optimization techniques for NASA Langley's CSI evolutionary model's real-time control system. [Controls/Structure Interaction

    Science.gov (United States)

    Elliott, Kenny B.; Ugoletti, Roberto; Sulla, Jeff

    1992-01-01

    The evolution and optimization of a real-time digital control system is presented. The control system is part of a testbed used to perform focused technology research on the interactions of spacecraft platform and instrument controllers with the flexible-body dynamics of the platform and platform appendages. The control system consists of Computer Automated Measurement and Control (CAMAC) standard data acquisition equipment interfaced to a workstation computer. The goal of this work is to optimize the control system's performance to support controls research using controllers with up to 50 states and frame rates above 200 Hz. The original system could support a 16-state controller operating at a rate of 150 Hz. By using simple yet effective software improvements, Input/Output (I/O) latencies and contention problems are reduced or eliminated in the control system. The final configuration can support a 16-state controller operating at 475 Hz. Effectively the control system's performance was increased by a factor of 3.

  18. Strategies to use tablet computers for collection of electronic patient-reported outcomes.

    Science.gov (United States)

    Schick-Makaroff, Kara; Molzahn, Anita

    2015-01-22

    Mobile devices are increasingly being used for data collection in research. However, many researchers do not have experience in collecting data electronically. Hence, the purpose of this short report was to identify issues that emerged in a study that incorporated electronic capture of patient-reported outcomes in clinical settings, and strategies used to address the issues. The issues pertaining to electronic patient-reported outcome data collection were captured qualitatively during a study on use of electronic patient-reported outcomes in two home dialysis units. Fifty-six patients completed three surveys on tablet computers, including the Kidney Disease Quality of Life-36, the Edmonton Symptom Assessment Scale, and a satisfaction measure. Issues that arose throughout the research process were recorded during ethics reviews, implementation process, and data collection. Four core issues emerged including logistics of technology, security, institutional and financial support, and electronic design. Although use of mobile devices for data collection has many benefits, it also poses new challenges for researchers. Advance consideration of possible issues that emerge in the process, and strategies that can help address these issues, may prevent disruption and enhance validity of findings.

  19. Optimization of Bioethanol In Silico Production Process in a Fed-Batch Bioreactor Using Non-Linear Model Predictive Control and Evolutionary Computation Techniques

    Directory of Open Access Journals (Sweden)

    Hanniel Ferreira Sarmento de Freitas

    2017-11-01

    Due to growing worldwide energy demand, the search for diversification of the energy matrix stands out as an important research topic. Bioethanol represents a notable alternative among renewable and environmentally friendly energy sources extracted from biomass (bioenergy). Thus, ensuring optimal growth conditions in the fermenter through manipulation of operational variables is cardinal for maximizing the yield of the ethanol production process. The current work focuses on the determination of an optimal control scheme for the fermenter feed rate and batch end-time, evaluating different parametrization profiles and comparing evolutionary computation techniques, the genetic algorithm (GA) and differential evolution (DE), using a dynamic real-time optimization (DRTO) approach for in silico ethanol production optimization. The DRTO was able to optimize the reactor feed rate considering disturbances in the process input. Open-loop test results obtained for the algorithms were superior to several works presented in the literature. The results indicate that the interaction between the intervals of DRTO cycles and the parametrization profile is more significant for the GA, both in terms of ethanol productivity and batch time. In general terms, the present work presents a methodology for control and optimization studies applicable to other bioenergy generation systems.
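
    The DE side of the comparison can be sketched as follows (toy fermentation kinetics and all parameter values are assumptions, not the paper's model): DE/rand/1/bin searches a piecewise-constant feed-rate profile that maximizes the final product of a simple fed-batch simulation:

      # Sketch: differential evolution over a 10-segment feed-rate profile.
      import numpy as np

      rng = np.random.default_rng(7)

      def simulate(feed):                  # feed: 10 constant segments, L/h
          V, S, P = 1.0, 5.0, 0.0          # volume, substrate, product (toy)
          for F in feed:
              for _ in range(10):          # Euler steps per segment, dt = 0.1
                  rate = 0.5 * S / (1.0 + S)        # Monod-like uptake
                  S += 0.1 * (F * (20.0 - S) / V - rate)
                  P += 0.1 * 0.8 * rate
                  V += 0.1 * F
          return P

      dim, NP, Fw, CR = 10, 30, 0.7, 0.9
      pop = rng.uniform(0.0, 0.5, (NP, dim))
      cost = np.array([-simulate(x) for x in pop])
      for _ in range(100):                 # DE/rand/1/bin generations
          for i in range(NP):
              a, b, c = pop[rng.choice(NP, 3, replace=False)]
              mutant = np.clip(a + Fw * (b - c), 0.0, 0.5)
              trial = np.where(rng.random(dim) < CR, mutant, pop[i])
              if (t := -simulate(trial)) < cost[i]:
                  pop[i], cost[i] = trial, t

      print("best final product:", -cost.min())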

  20. An Improved Electron Pre-Sheath Model for TSS-1R Current Enhancement Computations

    Directory of Open Access Journals (Sweden)

    Chunpei Cai

    2017-03-01

    This report presents improvements to investigations of the Tethered Satellite System (TSS-1R) electron current enhancement due to magnetically limited collection. New analytical expressions are obtained for the potential and temperature changes across the pre-sheath. The mathematical treatments in this work are more rigorous than one past approach. More experimental measurements collected in the ionosphere during the TSS-1R mission are adopted for validation. The relations developed in this work offer two bounding curves for these data points quite successfully; the average of these two curves is close to the curve-fitting results for the measurements, and an average enhancement 2.95 times larger than the Parker-Murphy theory is revealed. The results indicate that including the pre-sheath analysis is important for computing the electron current enhancement due to magnetic limitations.

  1. An electron beam linear scanning mode for industrial limited-angle nano-computed tomography

    Science.gov (United States)

    Wang, Chengxiang; Zeng, Li; Yu, Wei; Zhang, Lingli; Guo, Yumeng; Gong, Changcheng

    2018-01-01

    Nano-computed tomography (nano-CT), a high-spatial-resolution, non-destructive research technique that utilizes X-rays to examine the inner structure of small objects, has been widely utilized in biomedical research, electronic technology, geology, material sciences, etc. High resolution imaging with a traditional nano-CT scanning mode requires very high mechanical precision and stability of the object manipulator, which are difficult to achieve when the scanned object is continuously rotated. To reduce the scanning time and attain stable, high resolution imaging in industrial non-destructive testing, we study an electron beam linear scanning mode for a nano-CT system that can avoid the mechanical vibration and object movement caused by continuous rotation of the object. Furthermore, to further save scanning time and to study how small the scanning range can be while maintaining acceptable spatial resolution, an alternating iterative algorithm based on ℓ0 minimization is applied to the limited-angle nano-CT reconstruction problem with the electron beam linear scanning mode. The experimental results confirm the feasibility of the electron beam linear scanning mode of the nano-CT system.

  2. Electron-spin-resonance transistors for quantum computing in silicon-germanium heterostructures

    Science.gov (United States)

    Vrijen, Rutger; Yablonovitch, Eli; Wang, Kang; Jiang, Hong Wen; Balandin, Alex; Roychowdhury, Vwani; Mor, Tal; Divincenzo, David

    2000-07-01

    We apply the full power of modern electronic band-structure engineering and epitaxial heterostructures to design a transistor that can sense and control a single-donor electron spin. Spin-resonance transistors may form the technological basis for quantum information processing. One- and two-qubit operations are performed by applying a gate bias. The bias electric field pulls the electron wave function away from the dopant ion into layers of different alloy composition. Owing to the variation of the g factor (Si: g = 1.998, Ge: g = 1.563), this displacement changes the spin Zeeman energy, allowing single-qubit operations. By displacing the electron even further, the overlap with neighboring qubits is affected, which allows two-qubit operations. Certain silicon-germanium alloys allow a qubit spacing as large as 200 nm, which is well within the capabilities of current lithographic techniques. We discuss manufacturing limitations and issues regarding scaling up to a large size computer.
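
    A back-of-the-envelope sketch of the mechanism (the g factors are taken from the abstract; the field strength is an assumed example) computes how far the spin splitting is detuned when the gate pulls the electron from Si-like to Ge-like surroundings:

      # Sketch: Zeeman splitting E = g * mu_B * B for the two alloy regions.
      MU_B = 5.7883818060e-5   # Bohr magneton, eV/T
      H_EV = 4.135667696e-15   # Planck constant, eV*s
      B = 2.0                  # applied magnetic field, T (assumed example)

      g_si, g_ge = 1.998, 1.563
      E_si = g_si * MU_B * B   # splitting with the wave function on the donor (Si-like)
      E_ge = g_ge * MU_B * B   # ... pulled into the Ge-rich layer

      delta = E_si - E_ge      # gate-tunable detuning used for qubit addressing
      print(f"Zeeman splitting: Si {E_si*1e6:.1f} ueV, Ge {E_ge*1e6:.1f} ueV")
      print(f"detuning: {delta*1e6:.1f} ueV ({delta / H_EV / 1e9:.1f} GHz)")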

  3. Evolutionary Awareness

    Directory of Open Access Journals (Sweden)

    Gregory Gorelik

    2014-10-01

    In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.

  4. Evolutionary macroecology

    Directory of Open Access Journals (Sweden)

    José Alexandre F. Diniz-Filho

    2013-10-01

    Macroecology focuses on ecological questions at broad spatial and temporal scales, providing a statistical description of patterns in species abundance, distribution and diversity. More recently, historical components of these patterns have begun to be investigated more deeply. We tentatively refer to the practice of explicitly taking species history into account, both analytically and conceptually, as ‘evolutionary macroecology’. We discuss how the evolutionary dimension can be incorporated into macroecology through two orthogonal and complementary data types: fossils and phylogenies. Research traditions dealing with these data have developed more‐or‐less independently over the last 20–30 years, but merging them will help elucidate the historical components of diversity gradients and the evolutionary dynamics of species’ traits. Here we highlight conceptual and methodological advances in merging these two research traditions and review the viewpoints and toolboxes that can, in combination, help address patterns and unveil processes at temporal and spatial macro‐scales.

  5. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments, because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained.

  6. [Evolutionary medicine].

    Science.gov (United States)

    Wjst, M

    2013-12-01

    Evolutionary medicine allows new insights into long-standing medical problems. Are we really "Stone Agers in the fast lane"? This insight might have enormous consequences and will allow new answers that could never be provided by traditional anthropology. Only now is this made possible using data from molecular medicine and systems biology. Thereby evolutionary medicine takes a leap from a merely theoretical discipline to practical fields - reproductive, nutritional and preventive medicine, as well as microbiology, immunology and psychiatry. Evolutionary medicine is not another "just so story" but a serious candidate for the medical curriculum, providing a universal understanding of health and disease based on our biological origin. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Identify and rank key factors influencing the adoption of cloud computing for e-health

    Directory of Open Access Journals (Sweden)

    Javad Shukuhy

    2015-02-01

    Cloud computing, as a new technology with Internet infrastructure and new approaches, can bring significant benefits to providing medical services electronically. Applying this technology in e-health requires consideration of various factors. The main objective of this study is to identify and rank the factors influencing the adoption of the e-health cloud. Based on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology fit (HOT-fit) model, 16 sub-factors were identified under four major factors. Through a survey of 60 experts, academics, and specialists in health information technology, and with the help of the fuzzy analytic hierarchy process, these factors and sub-factors were ranked. Given the newness of this study, no prior domestic or international study has considered this number of criteria. The results show that when deciding to adopt cloud computing in e-health, technological, human, organizational, and environmental factors must be considered, in that order.
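
    The ranking step can be sketched with a classical (crisp) AHP priority vector; the pairwise judgments below are illustrative, not the survey's data, and the fuzzy variant replaces these crisp entries with triangular fuzzy numbers before this step:

      # Sketch: AHP priority weights from a reciprocal comparison matrix.
      import numpy as np

      factors = ["technological", "human", "organizational", "environmental"]
      # A[i, j] = preference of factor i over factor j (Saaty's 1-9 scale).
      A = np.array([[1.0, 2.0, 3.0, 5.0],
                    [1/2, 1.0, 2.0, 3.0],
                    [1/3, 1/2, 1.0, 2.0],
                    [1/5, 1/3, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, eigvals.real.argmax()])
      w = w / w.sum()                    # principal eigenvector = priorities

      for name, weight in sorted(zip(factors, w), key=lambda t: -t[1]):
          print(f"{name:15s} {weight:.3f}")

      # Consistency check (random index RI = 0.9 for n = 4).
      ci = (eigvals.real.max() - 4) / 3
      print("consistency ratio:", ci / 0.9)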

  8. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

    Because of the large impact of computer networks today, their study in related fields such as Telecommunications and Electronics Engineering holds great appeal for students. However, digging into content that lacks a strong practical component can make this interest decrease considerably. This paper proposes the use of teaching strategies, analogies, media, and interactive applications that enhance the teaching of the networks discipline and encourage its study. It starts from an analysis of how the discipline is taught, followed by a description of each of these strategies with their respective contributions to student learning.

  9. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    Energy Technology Data Exchange (ETDEWEB)

    Baart, T. A.; Vandersypen, L. M. K. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Kavli Institute of Nanoscience, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Eendebak, P. T. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Netherlands Organisation for Applied Scientific Research (TNO), P.O. Box 155, 2600 AD Delft (Netherlands); Reichl, C.; Wegscheider, W. [Solid State Physics Laboratory, ETH Zürich, 8093 Zürich (Switzerland)

    2016-05-23

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the double quantum dots into the single-electron regime. The algorithm only requires (1) prior knowledge of the gate design and (2) the pinch-off value of the single gate T that is shared by all the quantum dots. This work significantly alleviates the user effort required to tune multiple quantum dot devices.

  10. Development of Computer-Based Training to Supplement Lessons in Fundamentals of Electronics

    Directory of Open Access Journals (Sweden)

    Ian P. Benitez

    2016-05-01

    Teaching Fundamentals of Electronics allows students to familiarize themselves with basic electronics concepts, acquire skills in the use of the multi-meter test instrument, and develop mastery in testing basic electronic components. Actual teaching and observations made during practical activities on component pin identification and testing showed that the lack of skills of new students in testing components can lead to incorrect fault diagnosis and wrong pin connection during in-circuit replacement of defective parts. With the aim of reinforcing students with a concrete understanding of the concepts of components applied in actual test and measurement, a Computer-Based Training was developed. The proponent developed the learning modules (courseware) utilizing concept mapping and storyboarding instructional design. Developing a courseware as simulated, activity-based, and interactive as possible was the primary goal, to resemble the real-world process. A local area network (LAN)-based learning management system was also developed for administering the learning modules. A paired-sample t-test based on the pretest and post-test results was used to determine whether the students achieved learning after taking the courseware. The result revealed that there is a significant achievement of the students after studying the learning module. The e-learning content was validated by the instructors in terms of contents, activities, assessment, and format, with a grand weighted mean of 4.35, interpreted as Sufficient. Based on the evaluation result, supplementing with the proposed computer-based training can enhance the teaching-learning process in electronics fundamentals.
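
    The statistical step is a standard paired comparison; a minimal sketch with made-up scores (not the study's data) looks like this:

      # Sketch: paired-sample t-test on pretest vs. post-test scores
      # for the same students (synthetic example values).
      import numpy as np
      from scipy import stats

      pretest  = np.array([55, 60, 48, 70, 65, 52, 58, 63, 49, 61], float)
      posttest = np.array([72, 75, 60, 82, 78, 66, 70, 74, 59, 73], float)

      t, p = stats.ttest_rel(posttest, pretest)  # paired: same student twice
      print(f"mean gain = {np.mean(posttest - pretest):.1f} points, "
            f"t = {t:.2f}, p = {p:.4f}")         # p < 0.05 -> significant gain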

  11. Investigating the need for clinicians to use tablet computers with a newly envisioned electronic health record.

    Science.gov (United States)

    Saleem, Jason J; Savoy, April; Etherton, Gale; Herout, Jennifer

    2018-02-01

    The Veterans Health Administration (VHA) has deployed a large number of tablet computers in the last several years. However, little is known about how clinicians may use these devices with a newly planned Web-based electronic health record (EHR), as well as other clinical tools. The objective of this study was to understand the types of use that can be expected of tablet computers versus desktops. Semi-structured interviews were conducted with 24 clinicians at a Veterans Health Administration (VHA) Medical Center. An inductive qualitative analysis resulted in findings organized around recurrent themes of: (1) Barriers, (2) Facilitators, (3) Current Use, (4) Anticipated Use, (5) Patient Interaction, and (6) Connection. Our study generated several recommendations for the use of tablet computers with new health information technology tools being developed. Continuous connectivity for the mobile device is essential to avoid interruptions and clinician frustration. Also, making a physical keyboard available as an option for the tablet was a clear desire from the clinicians. Larger tablets (e.g., regular size iPad as compared to an iPad mini) were preferred. Being able to use secure messaging tools with the tablet computer was another consistent finding. Finally, more simplicity is needed for accessing patient data on mobile devices, while balancing the important need for adequate security. Published by Elsevier B.V.

  12. Meeting and Working on an Electronic Social Space: Behavioural Considerations and Implications for Cross-Cultural End User Computing.

    Science.gov (United States)

    Qureshi, Sajda

    1995-01-01

    Analysis of behavior on an electronic social space (electronic bulletin boards) revealed complex linkages in which a few types of behaviors occurred most frequently. Interpretation indicates that cross-cultural implications for end-user computing involve practical considerations relating to three types of adaptation: technological, work, and…

  13. E-commerce, paper and energy use: a case study concerning a Dutch electronic computer retailer

    Energy Technology Data Exchange (ETDEWEB)

    Hoogeveen, M.J.; Reijnders, L. [Open University Netherlands, Heerlen (Netherlands)

    2002-07-01

    Impacts of the application of e-commerce on paper and energy use are analysed in a case study concerning a Dutch electronic retailer (e-tailer) of computers. The estimated use of paper associated with the e-tailer concerned was substantially reduced compared with physical retailing or traditional mail-order retailing. However, the overall effect of e-tailing on paper use strongly depends on customer behaviour. Some characteristics of e-commerce, as practised by the e-tailer concerned, such as diminished floor space requirements, reduced need for personal transport and simplified logistics, improve energy efficiency compared with physical retailing. Substitution of paper information by online information has an energetic effect that is dependent on the time of online information perusal and the extent to which downloaded information is printed. Increasing distances from producers to consumers, outsourcing, and increased use of computers, associated equipment and electronic networks are characteristics of e-commerce that may have an upward effect on energy use. In this case study, the upward effects thereof on energy use were less than the direct energy efficiency gains. However, the indirect effects associated with increased buying power and the rebound effect on transport following from free-falling travel time greatly exceeded the direct energy efficiency gains. (author)

  14. Data mining technique for a secure electronic payment transaction using MJk-RSA in mobile computing

    Science.gov (United States)

    G. V., Ramesh Babu; Narayana, G.; Sulaiman, A.; Padmavathamma, M.

    2012-04-01

    Due to the evolution of Electronic Learning (E-Learning), one can easily get desired information on a computer or mobile system connected through the Internet. Currently, E-Learning materials are easily accessible on desktop computer systems, but in the future most of the information shall also be available on small digital devices like mobiles, PDAs, etc. Most E-Learning materials are paid, and the customer has to pay the entire amount through a credit/debit card system. Therefore, it is very important to study the security of credit/debit card numbers. The present paper is an attempt in this direction, and a security technique is presented to secure the credit/debit card numbers supplied over the Internet to access E-Learning materials or for any kind of purchase through the Internet. A well-known method, the data cube technique, is used to design the security model of the credit/debit card system. The major objective of this paper is to design a practical electronic payment protocol which is the safest and most secure mode of transaction. This technique may reduce fake transactions, which are above 20% at the global level.
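
    As a hedged illustration of the encryption layer only (textbook RSA with a toy key; the paper's MJk-RSA variant and its data cube step are not reproduced here), a card number can be enciphered block by block before transmission:

      # Sketch: textbook RSA on two-digit blocks of a card number.
      # Toy primes only; real deployments use ~2048-bit moduli and padding.
      p, q = 61, 53
      n, phi = p * q, (p - 1) * (q - 1)
      e = 17                       # public exponent, coprime with phi
      d = pow(e, -1, phi)          # private exponent (modular inverse)

      def encrypt_digits(card, block=2):
          # Each two-digit block m < n is enciphered as m^e mod n.
          blocks = [int(card[i:i + block]) for i in range(0, len(card), block)]
          return [pow(m, e, n) for m in blocks]

      def decrypt_digits(cipher, block=2):
          return "".join(f"{pow(c, d, n):0{block}d}" for c in cipher)

      cipher = encrypt_digits("4532015112830366")   # example (test) card number
      print("ciphertext blocks:", cipher)
      print("recovered:", decrypt_digits(cipher))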

  15. In vitro evaluation of three electronic apex locators using conventional methods and cone beam computed tomography.

    Science.gov (United States)

    Mrasori, S; Budina, R; Dragidella, F

    2014-01-01

    The aim of this paper was to evaluate the measurement accuracy of three electronic apex locators against digital radiography, stereomicroscopy, and cone beam computed tomography (CBCT). This in vitro experimental analytic-descriptive study included 90 extracted permanent teeth with mature apices from the inter-canine region, divided into three groups. In vitro electronic root canal measurement was performed using three different apex locators: ProPex (Dentsply), Apex NRG Blue, and Romi Apex 15A. After digital radiographic imaging and measurements, cone beam computed tomography (CBCT) imaging with a voxel edge size of 0.125 mm was utilized, and finally the apical portion of the root was ground (5 mm) along its axis to prepare it for stereomicroscopic measurements. The performed test of significance shows that there is no difference between the apex locators and the control length as measured by computed digital radiography (CDR); the P-values of the t-tests are all >0.05. The t-tests also showed no significant differences between the measurements conducted by stereomicroscope and the measurement results obtained using CBCT; measurements performed by the three apex locators (ProPex, Apex NRG Blue, and Romi Apex 15A) were accurate to within 0.5 mm 87%, 93%, and 87% of the time, respectively. The statistical analysis showed no significant differences between the three tested apex locators (P>0.05). Based on the conditions of the present study, it can be concluded that all three apex locators have demonstrated accurate and dependable measurements under in vitro conditions.

  16. Internal photon and electron dosimetry of the newborn patient—a hybrid computational phantom study

    Science.gov (United States)

    Wayson, Michael; Lee, Choonsik; Sgouros, George; Treves, S. Ted; Frey, Eric; Bolch, Wesley E.

    2012-03-01

    Estimates of radiation absorbed dose to organs of the nuclear medicine patient are a requirement for administered activity optimization and for stochastic risk assessment. Pediatric patients, and in particular the newborn child, represent that portion of the patient population where such optimization studies are most crucial, owing to the enhanced tissue radiosensitivities and longer life expectancies of this patient subpopulation. In cases where whole-body CT imaging is not available, phantom-based calculations of radionuclide S values—absorbed dose to a target tissue per nuclear transformation in a source tissue—are required for dose and risk evaluation. In this study, a comprehensive model of electron and photon dosimetry of the reference newborn child is presented, based on a high-resolution hybrid-voxel phantom from the University of Florida (UF) patient model series. Values of photon specific absorbed fraction (SAF) were assembled for both the reference male and female newborn using the radiation transport code MCNPX v2.6. Values of electron SAF were assembled in a unique and time-efficient manner whereby the collisional and radiative components of organ dose—for both self- and cross-dose terms—were computed separately. Doses to the newborn skeletal tissues were assessed via fluence-to-dose response functions reported for the first time in this study. Values of photon and electron SAFs were used to assemble a complete set of S values for some 16 radionuclides commonly associated with molecular imaging of the newborn. These values were then compared to those available in the OLINDA/EXM software. S value ratios for organ self-dose ranged from 0.46 to 1.42, while similar ratios for organ cross-dose varied from a low of 0.04 to a high of 3.49. These large discrepancies are due in large part to the simplistic organ modeling in the stylized newborn model used in the OLINDA/EXM software. A comprehensive model of internal dosimetry is presented in this study for
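
    The S-value assembly itself is a short sum; the sketch below uses illustrative emission and SAF numbers (assumptions, not the phantom's data) to show the MIRD-style computation S = sum_i y_i * E_i * SAF_i:

      # Sketch: assembling an S value (Gy per decay) from emission yields,
      # energies, and SAFs; all numeric values here are assumed examples.
      MEV_TO_J = 1.602176634e-13

      # (yield per decay, energy in MeV, SAF in 1/kg for target <- source)
      emissions = [
          (0.89, 0.1405, 0.012),   # gamma line, mostly escaping the source
          (0.10, 0.0184, 0.045),   # low-energy photon, absorbed more locally
      ]

      S = sum(y * E * MEV_TO_J * saf for y, E, saf in emissions)
      print(f"S = {S:.3e} Gy per nuclear transformation")

      # Organ dose then follows from the time-integrated activity (Bq*s)
      # in the source region: D(target) = A_tilde * S(target <- source).
      A_tilde = 3.7e6 * 3600       # example: 1 MBq residing ~1 h (assumed)
      print(f"D = {A_tilde * S:.3e} Gy")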

  17. Electron Beam Melting and Refining of Metals: Computational Modeling and Optimization

    Directory of Open Access Journals (Sweden)

    Veliko Donchev

    2013-10-01

    Full Text Available Computational modeling offers an opportunity for a better understanding and investigation of thermal transfer mechanisms. It can be used for the optimization of the electron beam melting process and for obtaining new materials with improved characteristics that have many applications in the power industry, medicine, instrument engineering, electronics, etc. A time-dependent 3D axis-symmetrical heat model for simulation of thermal transfer in metal ingots solidified in a water-cooled crucible at electron beam melting and refining (EBMR is developed. The model predicts the change in the temperature field in the casting ingot during the interaction of the beam with the material. A modified Pismen-Rekford numerical scheme to discretize the analytical model is developed. These equation systems, describing the thermal processes and main characteristics of the developed numerical method, are presented. In order to optimize the technological regimes, different criteria for better refinement and obtaining dendrite crystal structures are proposed. Analytical problems of mathematical optimization are formulated, discretized and heuristically solved by cluster methods. Using important for the practice simulation results, suggestions can be made for EBMR technology optimization. The proposed tool is important and useful for studying, control, optimization of EBMR process parameters and improving of the quality of the newly produced materials.

  18. Update on Electronic Dental Record and Clinical Computing Adoption Among Dental Practices in the United States.

    Science.gov (United States)

    Acharya, Amit; Schroeder, Dixie; Schwei, Kelsey; Chyou, Po-Huang

    2017-12-11

    The study sought to re-characterize trends and factors affecting electronic dental record (EDR) and technologies adoption by dental practices and the impact of the Health Information Technology for Economic and Clinical Health (HITECH) act on adoption rates through 2012. A 39-question survey was disseminated nationally over 3 months using a novel, statistically-modeled approach informed by early response rates to achieve a predetermined sample. EDR adoption rate for clinical support was 52%. Adoption rates were higher among: 1) younger dentists; 2) dentists ≤ 15 years in practice; 3) females; and 4) group practices. Top barriers to adoption were EDR cost/expense, cost-benefit ratio, electronic format conversion, and poor EDR usability. Awareness of the Federal HITECH incentive program was low. The rate of chairside computer implementation was 72%. Adoption of EDR in dental offices in the United States was higher in 2012 than electronic health record adoption rates in medical offices and was not driven by the HITECH program. Patient portal adoption among dental practices in the United States remained low. © 2017 Marshfield Clinic.

  19. Optimal Control of Evolutionary Dynamics

    CERN Document Server

    Chakrabarti, Raj; McLendon, George

    2008-01-01

    Elucidating the fitness measures optimized during the evolution of complex biological systems is a major challenge in evolutionary theory. We present experimental evidence and an analytical framework demonstrating how biochemical networks exploit optimal control strategies in their evolutionary dynamics. Optimal control theory explains a striking pattern of extremization in the redox potentials of electron transport proteins, assuming only that their fitness measure is a control objective functional with bounded controls.

  20. Image Processor Electronics (IPE): The High-Performance Computing System for NASA SWIFT Mission

    Science.gov (United States)

    Nguyen, Quang H.; Settles, Beverly A.

    2003-01-01

    Gamma Ray Bursts (GRBs) are believed to be the most powerful explosions that have occurred in the Universe since the Big Bang and are a mystery to the scientific community. Swift, a NASA mission that includes international participation, was designed and built in preparation for a 2003 launch to help determine the origin of Gamma Ray Bursts. Locating the position in the sky where a burst originates requires intensive computing, because the duration of a GRB can range between a few milliseconds and approximately a minute. The instrument data system must constantly accept multiple images representing large regions of the sky that are generated by sixteen gamma ray detectors operating in parallel. It then must process the received images very quickly in order to determine the existence of possible gamma ray bursts and their locations. The high-performance instrument data computing system that accomplishes this is called the Image Processor Electronics (IPE). The IPE was designed, built, and tested by NASA Goddard Space Flight Center (GSFC) in order to meet these challenging requirements. The IPE is a small-size, low-power, high-performance computing system for space applications. This paper addresses the system implementation and the system hardware architecture of the IPE. The paper concludes with the IPE system performance that was measured during end-to-end system testing.

  1. Open Issues in Evolutionary Robotics.

    Science.gov (United States)

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  2. Desiderata for computable representations of electronic health records-driven phenotype algorithms.

    Science.gov (United States)

    Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Denny, Joshua C; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A

    2015-11-01

    Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. © The Author 2015. Published by Oxford University Press on behalf of the American Medical
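
    Desiderata (4) and (5) in particular lend themselves to a small sketch: phenotype criteria as structured rules over queryable patient events, combined with set operations (field names and codes below are illustrative, not from any specific EHR model):

      # Sketch: structured phenotype rules + set algebra over patient events.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Event:
          patient: str
          kind: str      # "dx", "rx", "lab" (illustrative categories)
          code: str
          value: float = 0.0

      events = [
          Event("p1", "dx", "E11.9"), Event("p1", "lab", "HbA1c", 8.1),
          Event("p2", "rx", "metformin"), Event("p2", "lab", "HbA1c", 6.9),
          Event("p3", "dx", "E11.9"), Event("p3", "rx", "metformin"),
      ]

      def patients(pred):
          # A "rule" maps a predicate over events to a set of patient IDs.
          return {e.patient for e in events if pred(e)}

      has_dx   = patients(lambda e: e.kind == "dx" and e.code == "E11.9")
      on_med   = patients(lambda e: e.kind == "rx" and e.code == "metformin")
      high_a1c = patients(lambda e: e.kind == "lab" and e.code == "HbA1c"
                          and e.value >= 7.0)

      # Phenotype algebra: case = (diagnosis OR medication) AND elevated lab.
      cases = (has_dx | on_med) & high_a1c
      print("phenotype cases:", sorted(cases))    # -> ['p1']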

  3. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Introduction: A Small Sample of Ideas; The Philosophy of Natural Computing; The Three Branches: A Brief Overview; When to Use Natural Computing Approaches; Conceptualization; General Concepts. PART I - COMPUTING INSPIRED BY NATURE: Evolutionary Computing; Problem Solving as a Search Task; Hill Climbing and Simulated Annealing; Evolutionary Biology; Evolutionary Computing; The Other Main Evolutionary Algorithms; From Evolutionary Biology to Computing; Scope of Evolutionary Computing; Neurocomputing; The Nervous System; Artif...

  4. Evolutionary institutionalism.

    Science.gov (United States)

    Fürstenberg, Dr Kai

    Institutions are hard to define and hard to study. Long prominent in political science have been two theories: Rational Choice Institutionalism (RCI) and Historical Institutionalism (HI). Arising from the life sciences is now a third: Evolutionary Institutionalism (EI). Comparative strengths and weaknesses of these three theories warrant review, and the value-to-be-added by expanding the third beyond Darwinian evolutionary theory deserves consideration. Should evolutionary institutionalism expand to accommodate new understanding in ecology, such as might apply to the emergence of stability, and in genetics, such as might apply to political behavior? Core arguments are reviewed for each theory with more detailed exposition of the third, EI. Particular attention is paid to EI's gene-institution analogy; to variation, selection, and retention of institutional traits; to endogeneity and exogeneity; to agency and structure; and to ecosystem effects, institutional stability, and empirical limitations in behavioral genetics. RCI, HI, and EI are distinct but complementary. Institutional change, while amenable to rational-choice analysis and, retrospectively, to critical-juncture and path-dependency analysis, is also, and importantly, ecological. Stability, like change, is an emergent property of institutions, which tend to stabilize after change in a manner analogous to allopatric speciation. EI is more than metaphorically biological in that institutional behaviors are driven by human behaviors whose evolution long preceded the appearance of institutions themselves.

  5. Fast Band-Structure Computation for Phononic and Electronic Waves in Crystals

    Science.gov (United States)

    Krattiger, Dimitri

    The band structure is a frequency/energy versus wave vector/momentum relationship that fundamentally describes the nature of wave motion in a periodic medium. It is immensely valuable for predicting and understanding the properties of electronic, photonic, and phononic materials, and is typically computed numerically. For materials with large unit cells, such as nanostructured supercells, band-structure computation is very costly. This inhibits the ability to feasibly analyze new material systems with potentially extraordinary properties. This thesis describes a novel unit-cell model-reduction technique for band-structure calculations that is capable of lowering computational costs by one or two orders of magnitude with practically insignificant loss of accuracy. This new methodology, termed Bloch mode synthesis, is based on unit-cell modal analysis. It begins from a free-boundary unit-cell model. Before periodic boundary conditions are applied, this free unit cell behaves as though it has been cut out from its periodic surroundings. A truncated set of normal mode shapes is then used to compactly represent the interior portion of the unit cell while retaining nearly all of the dynamically important information. A Ritz basis for the unit cell is formed by combining the interior modes with a second set of modes that preserves the flexibility needed to enforce a Bloch wave solution in the unit cell. Residual mode enhancement and interface modal reduction improve performance further. With this highly reduced model, Bloch boundary conditions corresponding to waves of any direction and wavelength can be applied to very quickly obtain the band structure. Bloch mode synthesis is derived in the context of elastic wave propagation in phononic crystals and metamaterials, but the framework is also well suited for other types of waves. It shows particular promise in speeding up electronic structure calculations - a central problem in computational materials science
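
    Before any model reduction, the full Bloch eigenproblem already fixes the structure of the computation. A minimal sketch of that baseline (not Bloch mode synthesis itself), for a 1D diatomic spring-mass chain with illustrative parameter values:

```python
# Direct band-structure computation for a 1D diatomic spring-mass chain:
# sweep the Bloch wavenumber and solve the unit-cell eigenproblem at each step.
# Masses and spring constant are illustrative (arbitrary units, lattice a = 1).
import numpy as np

m1, m2, k = 1.0, 2.0, 1.0
qs = np.linspace(0.0, np.pi, 200)       # Bloch wavenumber over half the Brillouin zone

bands = []
for q in qs:
    c = -k * (1.0 + np.exp(-1j * q)) / np.sqrt(m1 * m2)
    # Mass-normalized (Hermitian) dynamical matrix of the two-atom unit cell.
    D = np.array([[2.0 * k / m1, c],
                  [np.conj(c),   2.0 * k / m2]])
    w2 = np.linalg.eigvalsh(D)          # eigenvalues are squared frequencies
    bands.append(np.sqrt(np.maximum(w2, 0.0)))

bands = np.array(bands)                 # shape (200, 2): acoustic and optical branches
```

    Model-reduction schemes such as the one described aim to shrink the matrix solved at each wavenumber while preserving the resulting branches.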

  6. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing tried-and-proven protection methods. In addition, a set of Guidelines regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  7. Computer simulation of electron-positron pair production by channeling radiation in amorphous converter

    Science.gov (United States)

    Abdrashitov, S. V.; Bogdanov, O. V.; Dabagov, S. B.; Pivovarov, Yu L.; Tukhfatullin, T. A.

    2016-07-01

    We consider the radiator-converter approach for 200 MeV channeled electrons (the SPARC_LAB LNF facility energies) for the case of using a W crystalline radiator and a W amorphous converter. A comparison of the positron production by the axial channeling radiation and the bremsstrahlung is performed. The positron stopping in the converter is studied by means of computer simulations. It is shown that for the maximum yield of positrons the thickness of the W amorphous converter should be taken as 0.35 cm in the case of using the axial channeling radiation, resulting in a total positron yield of 5×10⁻³ e⁺/e⁻, and 0.71 cm in the case of using the bremsstrahlung, resulting in a total positron yield of 3.3×10⁻³ e⁺/e⁻.

  8. Electronic Structure Calculations and Adaptation Scheme in Multi-core Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Seshagiri, Lakshminarasimhan; Sosonkina, Masha; Zhang, Zhao

    2009-05-20

    Multi-core processing environments have become the norm in the generic computing environment and are being considered for adding an extra dimension to the execution of any application. The T2 Niagara processor is a very unique environment where it consists of eight cores having a capability of running eight threads simultaneously in each of the cores. Applications like General Atomic and Molecular Electronic Structure (GAMESS), used for ab-initio molecular quantum chemistry calculations, can be good indicators of the performance of such machines and would be a guideline for both hardware designers and application programmers. In this paper we try to benchmark the GAMESS performance on a T2 Niagara processor for a couple of molecules. We also show the suitability of using a middleware based adaptation algorithm on GAMESS on such a multi-core environment.

  9. Three-dimensional computed tomography angiography of coronary artery bypass graft with electron beam tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hoshi, Toshiko; Yamauchi, Tatsuo; Kanauchi, Tetsu; Konno, Miyuki; Imai, Kamon; Suwa, Jiro; Onoguchi, Katsuhisa; Hashimoto, Kazuhiro; Horie, Toshinobu [Saitama Cardiovascular and Respiratory Center, Konan (Japan)

    2001-10-01

    Assessment of coronary artery bypass graft patency by three-dimensional reconstructed computed tomography angiography (3D-CTA) derived from electrocardiography-gated contrast-enhanced electron beam tomography (EBT) was evaluated. Thirty-nine patients with 99 grafts (45 arterial grafts and 54 venous grafts) underwent 3D-CTA and selective coronary angiography within a 3-week interval. 3D-CTA images of the coronary bypass grafts were compared with the coronary angiography images used as the control. 3D-CTA defined 42 of 44 arterial grafts as patent (sensitivity: 95%), all 47 venous grafts as patent (sensitivity: 100%) and all 7 venous grafts as occlusive (specificity: 100%). The overall sensitivity and specificity were 98% and 88%, respectively. 3D-CTA is a useful noninvasive technique with adequate sensitivity and specificity to assess coronary artery bypass graft patency. (author)
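
    The overall figures follow from the per-graft counts; as a consistency check (assuming the one remaining arterial graft was occluded at angiography and missed by 3D-CTA, which the abstract does not state explicitly):

```latex
\text{sensitivity} = \frac{42 + 47}{44 + 47} = \frac{89}{91} \approx 98\%,
\qquad
\text{specificity} = \frac{7}{7 + 1} \approx 88\%
```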

  10. Modeling of temperature profiles in an environmental transmission electron microscope using computational fluid dynamics

    DEFF Research Database (Denmark)

    Mortensen, Peter Mølgaard; Jensen, Anker Degn; Hansen, Thomas Willum

    2015-01-01

    The temperature and velocity field, pressure distribution, and the temperature variation across the sample region inside an environmental transmission electron microscope (ETEM) have been modeled by means of computational fluid dynamics (CFD). Heating the sample area by a furnace type TEM holder gives rise to temperature gradients over the sample area. Three major mechanisms have been identified with respect to heat transfer in the sample area: radiation from the grid, conduction in the grid, and conduction in the gas. A parameter sensitivity analysis showed that the sample temperature was affected by the conductivity of the gas, the emissivity of the sample grid, and the conductivity of the grid. With hydrogen gas, the temperature difference over the TEM grid is less than 5 °C under what must be considered typical conditions, and it is concluded that the conditions on the sample grid in the ETEM can be considered as isothermal during general use.

  11. COMPUTATIONAL EFFICIENCY OF A MODIFIED SCATTERING KERNEL FOR FULL-COUPLED PHOTON-ELECTRON TRANSPORT PARALLEL COMPUTING WITH UNSTRUCTURED TETRAHEDRAL MESHES

    Directory of Open Access Journals (Sweden)

    JONG WOON KIM

    2014-04-01

    In this paper, we introduce a modified scattering kernel approach to avoid the unnecessarily repeated calculations involved in the scattering source calculation, and use it with parallel computing to effectively reduce the computation time. Its computational efficiency was tested for three-dimensional full-coupled photon-electron transport problems using our computer program, which solves the multi-group discrete ordinates transport equation by the discontinuous finite element method with unstructured tetrahedral meshes for complicated geometrical problems. The numerical tests show that the modified scattering kernel yields speed-ups of 17-42 times in the elapsed time per iteration, not only in single-CPU calculations but also in parallel computing with several CPUs.
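
    The core idea, assembling the group-transfer kernel once instead of rebuilding it inside every source iteration, can be illustrated with a minimal sketch; the group structure, array shapes, and dense-matrix form below are ours for illustration, not the paper's discrete ordinates implementation:

```python
import numpy as np

G, N = 8, 1000                      # energy groups, spatial unknowns (illustrative)
rng = np.random.default_rng(0)
sigma_s = rng.random((G, G))        # group-to-group scattering cross sections
flux = rng.random((G, N))           # current flux iterate

def scattering_source_naive(flux):
    # Rebuilds the group-transfer sums term by term on every call (wasted work).
    src = np.zeros_like(flux)
    for g in range(G):
        for gp in range(G):
            src[g] += sigma_s[g, gp] * flux[gp]
    return src

# Modified-kernel idea: assemble the transfer operator once, reuse every iteration.
kernel = sigma_s.copy()             # precomputed outside the iteration loop
def scattering_source_cached(flux):
    return kernel @ flux            # single matrix product per iteration

assert np.allclose(scattering_source_naive(flux), scattering_source_cached(flux))
```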

  12. Modeling of temperature profiles in an environmental transmission electron microscope using computational fluid dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Mølgaard Mortensen, Peter [Department of Chemical and Biochemical Engineering, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark); Willum Hansen, Thomas [Center for Electron Nanoscopy, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark); Birkedal Wagner, Jakob, E-mail: jakob.wagner@cen.dtu.dk [Center for Electron Nanoscopy, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark); Degn Jensen, Anker, E-mail: aj@kt.dtu.dk [Department of Chemical and Biochemical Engineering, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2015-05-15

    The temperature and velocity field, pressure distribution, and the temperature variation across the sample region inside an environmental transmission electron microscope (ETEM) have been modeled by means of computational fluid dynamics (CFD). Heating the sample area by a furnace type TEM holder gives rise to temperature gradients over the sample area. Three major mechanisms have been identified with respect to heat transfer in the sample area: radiation from the grid, conduction in the grid, and conduction in the gas. A parameter sensitivity analysis showed that the sample temperature was affected by the conductivity of the gas, the emissivity of the sample grid, and the conductivity of the grid. Ideally the grid should be polished and made from a material with good conductivity, e.g. copper. With hydrogen gas, which has the highest conductivity of the gases studied, the temperature difference over the TEM grid is less than 5 °C under what must be considered typical conditions, and it is concluded that the conditions on the sample grid in the ETEM can be considered as isothermal during general use. - Highlights: • Computational fluid dynamics used for mapping flow and temperature in ETEM setup. • Temperature gradient across TEM grid in furnace based heating holder very small in ETEM. • Conduction from TEM grid and gas in addition to radiation from TEM grid most important. • Pressure drop in ETEM limited to the pressure limiting apertures.

  13. Oversized or Undersized? Defining the Right-sized Computer Center for Electronic Funds Transfer Processing

    Directory of Open Access Journals (Sweden)

    ANDRADE, A.

    2013-06-01

    Electronic Funds Transfer represents an upward trend, which fosters the proximity between consumers and suppliers. Each transaction is sent to a Computer Center, in charge of decoding, processing and returning the results as fast as possible. In particular, the present article covers the day-to-day operation of the GetNet Company, focusing on one of its subsystems. In the article, we model the incoming transaction volume and the corresponding processing to answer the following questions: (i) how idle is the company's current transaction system configuration, and what rates are involved? (ii) Given an annual growth of 20% in the transaction volume, which modifications should be made to the current Computer Center to meet transaction demand until 2020? The tests were based on transaction execution logs from one day, corresponding to the highest volume of 2011. As expected, the results show that the 10 machines composing the GetNet system are oversized for the current situation, whose operational load could be supported by only 4 machines. In addition, the current configuration could sustain the predicted growth until the middle of 2017 without loss of transactions.
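
    The capacity arithmetic behind the 2017 estimate can be reproduced with a back-of-the-envelope sketch, assuming capacity scales linearly with machine count and the load grows 20% per year from the 2011 peak (the capacity model here is ours, not the article's):

```python
import math

machines_total = 10       # installed
machines_needed = 4       # sufficient for the 2011 peak load (per the abstract)
growth = 1.20             # 20% annual transaction growth

headroom = machines_total / machines_needed           # 2.5x spare capacity
years = math.log(headroom) / math.log(growth)         # ~5.0 years
print(f"capacity exhausted after ~{years:.1f} years") # 2011 + ~5 years -> 2016/2017
```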

  14. Electron beam diagnostic system using computed tomography and an annular sensor

    Science.gov (United States)

    Elmer, John W.; Teruya, Alan T.

    2015-08-11

    A system for analyzing an electron beam including a circular electron beam diagnostic sensor adapted to receive the electron beam, the circular electron beam diagnostic sensor having a central axis; an annular sensor structure operatively connected to the circular electron beam diagnostic sensor, wherein the sensor structure receives the electron beam; a system for sweeping the electron beam radially outward from the central axis of the circular electron beam diagnostic sensor to the annular sensor structure wherein the electron beam is intercepted by the annular sensor structure; and a device for measuring the electron beam that is intercepted by the annular sensor structure.

  15. Computed tomography as a source of electron density information for radiation treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Skrzynski, Witold; Slusarczyk-Kacprzyk, Wioletta; Bulski, Wojciech [Medical Physics Dept., Center of Oncology, Warsaw (Poland); Zielinska-Dabrowska, Sylwia; Wachowicz, Marta; Kukolowicz, Pawel F. [Medical Physics Dept., Holycross Cancer Center, Kielce (Poland)

    2010-06-15

    Purpose: to evaluate the performance of computed tomography (CT) systems of various designs as a source of electron density (ρ_el) data for treatment planning of radiation therapy. Material and methods: the dependence of CT numbers on the relative electron density of tissue-equivalent materials (the HU–ρ_el relationship) was measured for several general-purpose CT systems (single-slice, multislice, wide-bore multislice), for radiotherapy simulators with single-slice CT and kV CBCT (cone-beam CT) options, as well as for linear accelerators with kV and MV CBCT systems. Electron density phantoms of four sizes were used. Measurement data were compared with the standard HU–ρ_el relationships predefined in two commercial treatment-planning systems (TPS). Results: the HU–ρ_el relationships obtained with all of the general-purpose CT scanners operating at voltages close to 120 kV were very similar to each other and close to those predefined in the TPS. Some dependence of HU values on tube voltage was observed for bone-equivalent materials. For a given tube voltage, differences in results obtained for different phantoms were larger than those obtained for different CT scanners. For radiotherapy simulators and for kV CBCT systems, the information on ρ_el was much less precise because of the poor uniformity of images. For MV CBCT, the results were significantly different than for kV systems due to the differing energy spectrum of the beam. Conclusion: the HU–ρ_el relationships predefined in TPS can be used for general-purpose CT systems operating at voltages close to 120 kV. For nontypical imaging systems (e.g., CBCT), the relationship can be significantly different and, therefore, it should always be measured and carefully analyzed before using CT data for treatment planning. (orig.)
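
    In practice, a measured HU–ρ_el relationship is applied as a piecewise-linear lookup between calibration points. A minimal sketch with illustrative placeholder calibration values, not the values from this study:

```python
import numpy as np

# Hypothetical calibration measured with a tissue-equivalent phantom:
# CT numbers (HU) vs. relative electron density rho_el (water = 1.0).
hu_cal  = np.array([-1000.0, -100.0, 0.0, 100.0, 1000.0])
rho_cal = np.array([   0.00,   0.90, 1.00, 1.07,   1.56])

def hu_to_rho_el(hu):
    """Piecewise-linear lookup from CT number to relative electron density."""
    return np.interp(hu, hu_cal, rho_cal)

print(hu_to_rho_el(50.0))   # interpolated between the 0 and 100 HU points
```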

  16. The accuracy of molecular bond lengths computed by multireference electronic structure methods

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, Ron [Chemical Sciences and Engineering Division, Argonne National Laboratory, Argonne, IL 60439 (United States)], E-mail: shepard@tcg.anl.gov; Kedziora, Gary S. [High Performance Technologies Inc., 2435 5th Street, WPAFB, OH 45433 (United States); Lischka, Hans [Institute for Theoretical Chemistry, University of Vienna, Waehringerstrasse 17, A-1090 Vienna (Austria); Shavitt, Isaiah [Department of Chemistry, University of Illinois, 600 S. Mathews Avenue, Urbana, IL 61801 (United States); Mueller, Thomas [Juelich Supercomputer Centre, Research Centre Juelich, D-52425 Juelich (Germany); Szalay, Peter G. [Laboratory for Theoretical Chemistry, Institute of Chemistry, Eoetvoes Lorand University, P.O. Box 32, H-1518 Budapest (Hungary); Kallay, Mihaly [Department of Physical Chemistry and Materials Science, Budapest University of Technology and Economics, P.O. Box 91, H-1521 Budapest (Hungary); Seth, Michael [Department of Chemistry, University of Calgary, 2500 University Drive, N.W., Calgary, Alberta, T2N 1N4 (Canada)

    2008-06-16

    We compare experimental R_e values with computed R_e values for 20 molecules using three multireference electronic structure methods, MCSCF, MR-SDCI, and MR-AQCC. Three correlation-consistent orbital basis sets are used, along with complete basis set extrapolations, for all of the molecules. These data complement those computed previously with single-reference methods. Several trends are observed. The SCF R_e values tend to be shorter than the experimental values, and the MCSCF values tend to be longer than the experimental values. We attribute these trends to the ionic contamination of the SCF wave function and to the corresponding systematic distortion of the potential energy curve. For the individual bonds, the MR-SDCI R_e values tend to be shorter than the MR-AQCC values, which in turn tend to be shorter than the MCSCF values. Compared to the previous single-reference results, the MCSCF values are roughly comparable to the MP4 and CCSD methods, which are more accurate than might be expected due to the fact that these MCSCF wave functions include no extra-valence electron correlation effects. This suggests that static valence correlation effects, such as near-degeneracies and the ability to dissociate correctly to neutral fragments, play an important role in determining the shape of the potential energy surface, even near equilibrium structures. The MR-SDCI and MR-AQCC methods predict R_e values with an accuracy comparable to, or better than, the best single-reference methods (MP4, CCSD, and CCSD(T)), despite the fact that triple and higher excitations into the extra-valence orbital space are included in the single-reference methods but are absent in the multireference wave functions. The computed R_e values using the multireference methods tend to be smooth and monotonic with basis set improvement. The molecular structures are optimized using analytic energy gradients, and the timings for these calculations show the practical

  17. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Introduction and Biological BackgroundBiological ComputationThe Influence of Biology on Mathematics-Historical ExamplesBiological IntroductionModels and Simulations Cellular Automata Biological BackgroundThe Game of Life General Definition of Cellular Automata One-Dimensional AutomataExamples of Cellular AutomataComparison with a Continuous Mathematical Model Computational UniversalitySelf-Replication Pseudo Code Evolutionary ComputationEvolutionary Biology and Evolutionary ComputationGenetic AlgorithmsExample ApplicationsAnalysis of the Behavior of Genetic AlgorithmsLamarckian Evolution Genet

  18. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records.

    Science.gov (United States)

    Duz, Marco; Marshall, John F; Parkin, Tim

    2017-06-29

    The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously classified by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded using Rv3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free
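
    The four-step logic, inclusion minus exclusion plus reinclusion, is set algebra over rows of data. A minimal sketch with invented row texts and dictionary terms standing in for the SS/WS dictionaries and the final R step:

```python
# Invented example rows; real data would be free-text EMR entries.
rows = {
    1: "3 day history of colic, treated with flunixin",
    2: "no evidence of colic on examination",
    3: "recurrent colic; today no evidence of colic",
    4: "routine vaccination, no clinical signs",
}

inclusion   = {"colic"}                 # terms identifying the condition
exclusion   = {"no evidence of colic"}  # phrases that negate a hit
reinclusion = {"recurrent colic"}       # phrases wrongly removed by exclusion

def matches(text, terms):
    return any(t in text for t in terms)

included   = {i for i, t in rows.items() if matches(t, inclusion)}    # {1, 2, 3}
excluded   = {i for i, t in rows.items() if matches(t, exclusion)}    # {2, 3}
reincluded = {i for i, t in rows.items() if matches(t, reinclusion)}  # {3}

# Step 4: remove the excluded rows, then restore the reincluded ones.
cases = (included - excluded) | reincluded
print(sorted(cases))  # [1, 3]
```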

  19. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest in these very promising new systems in the field of materials science, biomedical research, or energy sustainability. Ionization energy (IE) is one of the most important parameters for approaching the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been assembled for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the ΔSCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) ΔSCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient routes to good agreement with UV-PES values, (ii) the CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for this purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  20. Electronic Structure of the Perylene / Zinc Oxide Interface: A Computational Study of Photoinduced Electron Transfer and Impact of Surface Defects

    KAUST Repository

    Li, Jingrui

    2015-07-29

    The electronic properties of dye-sensitized semiconductor surfaces consisting of perylene chromophores chemisorbed on zinc oxide via different spacer-anchor groups have been studied at the density-functional-theory level. The energy distributions of the donor states and the rates of photoinduced electron transfer from dye to surface are predicted. We evaluate in particular the impact of saturated versus unsaturated aliphatic spacer groups inserted between the perylene chromophore and the semiconductor as well as the influence of surface defects on the electron-injection rates.

  1. The effect of the electron scattering phase shifts upon the computational outcomes of the Low-Energy Electron Diffraction technique

    Science.gov (United States)

    Adas, Sonya; Meyers, Lisa; Caragiu, Mellita

    2009-04-01

    In a typical Low-Energy Electron Diffraction (LEED) investigation of a crystal surface, the electrons probing the surface are scattered by the atoms in the sample. The scattering process introduces phase shifts in the waves associated with the incoming electrons. An investigation of how these phase shifts influence the results of a LEED calculation is presented for the fairly complicated Cu(511) stepped surface. The phase shifts have been calculated using the Barbieri/Van Hove Phase Shift Package. The phase shifts considered correspond to copper atoms arranged in various planes of the copper crystal: (100), (111), and a close approximation of the (511) plane.

  2. The right mix to support electronic medical record training: classroom computer-based training and blended learning.

    Science.gov (United States)

    McCain, Christina L

    2008-01-01

    Staff development plays a crucial role in supporting clinicians to adapt to the ever-changing technological advances in the healthcare setting. The quest to support staff in the implementation and ongoing optimization of an electronic medical record (EMR) led these staff development educators to computer-based training and a blended learning approach, building upon the traditional anchor of classroom learning and the advantages of computer-based training.

  3. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens.

    Science.gov (United States)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin

    2017-06-01

    We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. High-order epistasis shapes evolutionary trajectories.

    OpenAIRE

    Sailer, Zachary R.; Harms, Michael J.

    2017-01-01

    High-order epistasis, where the effect of a mutation is determined by interactions with two or more other mutations, makes small, but detectable, contributions to genotype-fitness maps. While epistasis between pairs of mutations is known to be an important determinant of evolutionary trajectories, the evolutionary consequences of high-order epistasis remain poorly understood. To determine the effect of high-order epistasis on evolutionary trajectories, we computationally removed high-order epis...

  5. High-order epistasis shapes evolutionary trajectories

    OpenAIRE

    Sailer, Zachary R.

    2017-01-01

    High-order epistasis, where the effect of a mutation is determined by interactions with two or more other mutations, makes small, but detectable, contributions to genotype-fitness maps. While epistasis between pairs of mutations is known to be an important determinant of evolutionary trajectories, the evolutionary consequences of high-order epistasis remain poorly understood. To determine the effect of high-order epistasis on evolutionary trajectories, we computationally removed high-order epis...

  6. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest groups are 19 commercial U.S. nuclear utilities and 11 of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.

  7. A structural lattice model for electronic textile: an experimental and computational study

    NARCIS (Netherlands)

    Verberne, C.W.; Van Os, K.; Luitjens, S.B.

    2011-01-01

    Electronic textiles combine textiles with the functionality of electronic applications. To understand the mechanical issues of reliability, mechanical failure and compatibility of these electronic textiles, research has to be performed that focuses on the interplay of the textile with the electronics.

  8. Building an electronic book on the Internet: ``CSEP -- an interdisciplinary syllabus for teaching computational science at the graduate level``

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C.E.; Strayer, M.R. [Oak Ridge National Lab., TN (United States); Umar, V.M. [Vanderbilt Univ., Nashville, TN (United States)

    1994-12-31

    The Computational Science Education Project was initiated in September 1991 by the Department of Energy to develop a syllabus for teaching interdisciplinary computational science. CSEP has two major activities: the writing and maintenance of an electronic book (e-book), and educational outreach to the computational science communities through presentations at professional society meetings, journal articles, and the training of educators. The interdisciplinary nature of the project is intended to contribute to national technological competitiveness by producing a body of graduates with the necessary skills to operate effectively in high performance computing environments. The educational outreach guides and supports instructors in developing computational science courses and curricula at their institutions. The CSEP e-book provides valuable teaching material around which educators have built. The outreach not only introduces new educators to CSEP, but also establishes a synergistic relationship between CSEP authors, reviewers and users.

  9. Computer Archiving and Image Enhancement of Diagnostic Electron Micrographs Using Scanning Transmission Electron Microscope as Real-Time Digitizer

    Science.gov (United States)

    Okagaki, T.; Jones, M.H.; Clark, B.A.; Pan, T.; Ferro, J.M.; Hsing, R.; Tzou, K.H.

    1984-01-01

    Diagnostic electron micrographs were digitized in real time using a scanning transmission electron microscope (STEM) controlled by a dedicated front-end processor at a resolution of 1K × 1K × 8. Various methods of image enhancement produced satisfactory results. From our experience, a faster front-end processor with a larger memory size and a 2K × 2K or 4K × 4K spatial image resolution are desirable. In order to facilitate storage and retrieval of an image archive, efficient data compression is necessary.

  10. Nanoscale RRAM-based synaptic electronics: toward a neuromorphic computing device

    Science.gov (United States)

    Park, Sangsu; Noh, Jinwoo; Choo, Myung-lae; Muqeem Sheri, Ahmad; Chang, Man; Kim, Young-Bae; Kim, Chang Jung; Jeon, Moongu; Lee, Byung-Geun; Lee, Byoung Hun; Hwang, Hyunsang

    2013-09-01

    Efforts to develop scalable learning algorithms for implementation of networks of spiking neurons in silicon have been hindered by the considerable footprints of learning circuits, which grow as the number of synapses increases. Recent developments in nanotechnologies provide an extremely compact device with low-power consumption. In particular, nanoscale resistive switching devices (resistive random-access memory (RRAM)) are regarded as a promising solution for implementation of biological synapses due to their nanoscale dimensions, capacity to store multiple bits and the low energy required to operate distinct states. In this paper, we report the fabrication, modeling and implementation of nanoscale RRAM with multi-level storage capability for an electronic synapse device. In addition, we first experimentally demonstrate the learning capabilities and predictable performance by a neuromorphic circuit composed of a nanoscale 1 kbit RRAM cross-point array of synapses and complementary metal-oxide-semiconductor neuron circuits. These developments open up possibilities for the development of ubiquitous ultra-dense, ultra-low-power cognitive computers.

  11. A computer control system for the PNC high power cw electron linac. Concept and hardware

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, T.; Hirano, K.; Takei, Hayanori; Nomura, Masahiro; Tani, S. [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kato, Y.; Ishikawa, Y.

    1998-06-01

    Design and construction of a high power cw (Continuous Wave) electron linac for studying the feasibility of nuclear waste transmutation was started in 1989 at PNC. The PNC accelerator (10 MeV, 20 mA average current, 4 ms pulse width, 50 Hz repetition) is a machine dedicated to the development of high current acceleration technology for future needs. The computer control system is responsible for accelerator control and for supporting experiments in high power operation. The features of the system are simultaneous measurement of the accelerator status and a modular software and hardware design that is easily modified or expanded. A high speed network (SCRAM Net, ≈15 MB/s), Ethernet, and front end processors (Digital Signal Processors) were employed for high speed data taking and control. The system was designed around standard modules, with the man-machine interface implemented in software. Thanks to the graphical user interface and object-oriented programming, programming and maintenance within the software development environment are straightforward. (author)

  12. Examination of Scanning Electron Microscope and Computed Tomography Images of PICA

    Science.gov (United States)

    Lawson, John W.; Stackpoole, Margaret M.; Shklover, Valery

    2010-01-01

    Micrographs of PICA (Phenolic Impregnated Carbon Ablator) taken using a Scanning Electron Microscope (SEM) and 3D images taken with a Computed Tomography (CT) system are examined. PICA is a carbon fiber based composite (Fiberform) with a phenolic polymer matrix. The micrographs are taken at different surface depths and at different magnifications in a sample after arc jet testing and show different levels of oxidative removal of the charred matrix (Figs. 1 through 13). CT scans, courtesy of Xradia, Inc. of Concord, CA, were captured for samples of virgin PICA, charred PICA and raw Fiberform (Fig. 14). We use these images to calculate the thermal conductivity (TC) of these materials using correlation function (CF) methods. CF methods give a mathematical description of how one material is embedded in another and are thus ideally suited for modeling composites like PICA. We will evaluate how the TC of the materials changes as a function of surface depth. This work is in collaboration with ETH-Zurich, which has expertise in high temperature materials and TC modeling (including CF methods).

  13. Nanoscale RRAM-based synaptic electronics: toward a neuromorphic computing device.

    Science.gov (United States)

    Park, Sangsu; Noh, Jinwoo; Choo, Myung-Lae; Sheri, Ahmad Muqeem; Chang, Man; Kim, Young-Bae; Kim, Chang Jung; Jeon, Moongu; Lee, Byung-Geun; Lee, Byoung Hun; Hwang, Hyunsang

    2013-09-27

    Efforts to develop scalable learning algorithms for implementation of networks of spiking neurons in silicon have been hindered by the considerable footprints of learning circuits, which grow as the number of synapses increases. Recent developments in nanotechnologies provide an extremely compact device with low-power consumption. In particular, nanoscale resistive switching devices (resistive random-access memory (RRAM)) are regarded as a promising solution for implementation of biological synapses due to their nanoscale dimensions, capacity to store multiple bits and the low energy required to operate distinct states. In this paper, we report the fabrication, modeling and implementation of nanoscale RRAM with multi-level storage capability for an electronic synapse device. In addition, we first experimentally demonstrate the learning capabilities and predictable performance by a neuromorphic circuit composed of a nanoscale 1 kbit RRAM cross-point array of synapses and complementary metal-oxide-semiconductor neuron circuits. These developments open up possibilities for the development of ubiquitous ultra-dense, ultra-low-power cognitive computers.

  14. Electronic cleansing for computed tomography (CT) colonography using a scale-invariant three-material model.

    Science.gov (United States)

    Serlie, Iwo W O; Vos, Frans M; Truyen, Roel; Post, Frits H; Stoker, Jaap; van Vliet, Lucas J

    2010-06-01

    A well-known reading pitfall in computed tomography (CT) colonography is posed by artifacts at T-junctions, i.e., locations where air-fluid levels interface with the colon wall. This paper presents a scale-invariant method to determine material fractions in voxels near such T-junctions. The proposed electronic cleansing method particularly improves the segmentation at those locations. The algorithm takes a vector of Gaussian derivatives as input features. The measured features are made invariant to the orientation-dependent apparent scale of the data and normalized in a way to obtain equal noise variance. A so-called parachute model is introduced that maps Gaussian derivatives onto material fractions near T-junctions. Projection of the noisy derivatives onto the model yields improved estimates of the true, underlying feature values. The method is shown to render an accurate representation of the object boundary without artifacts near junctions. Therefore, it enhances the reading of CT colonography in a 3-D display mode.
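
    A minimal sketch of the feature-extraction step only: a vector of Gaussian derivatives at one scale, computed with SciPy and scale-normalized. The orientation-dependent scale invariance, noise-variance equalization, and the parachute model itself are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

volume = np.random.rand(64, 64, 64).astype(np.float64)  # stand-in CT volume
sigma = 2.0                                              # aperture scale (voxels)

# Zeroth- and first-order Gaussian derivatives along each axis.
L  = gaussian_filter(volume, sigma)
Lx = gaussian_filter(volume, sigma, order=(0, 0, 1))
Ly = gaussian_filter(volume, sigma, order=(0, 1, 0))
Lz = gaussian_filter(volume, sigma, order=(1, 0, 0))

# Scale-normalized first derivatives (multiplied by sigma), a common way to
# make derivative responses comparable across scales.
features = np.stack([L, sigma * Lx, sigma * Ly, sigma * Lz], axis=-1)
print(features.shape)  # (64, 64, 64, 4): one feature vector per voxel
```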

  15. Distinguishing respirable quartz in coal fly ash using computer-controlled scanning electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Nick Cprek; Naresh Shah; Frank E. Huggins; Gerald P. Huffman [University of Kentucky, Lexington, KY (United States). Consortium for Fossil Fuel Science and Department of Chemical and Materials Engineering

    2007-05-15

    Determination and classification of quartz in coal fly ash (CFA) is a subject of interest because of the adverse health effects caused by inhalation of crystalline silica. Workers with prolonged exposure to this carcinogen can develop respiratory diseases over time. These may include utility plant workers involved in the handling, loading, and hauling of CFA. In this investigation, computer-controlled scanning electron microscopy (CCSEM) and X-ray diffraction (XRD) were used to investigate Si-rich phases in CFA to develop a better approach for the determination of respirable quartz. Three CFA samples from utility boilers and a NIST glass standard CFA sample were investigated. The XRD measurements indicated that the four samples contained from 7.0 to 16.0 wt.% of quartz. The CCSEM measurements utilized both particle size distributions and a particle shape parameter, circularity, to classify the Si-rich phases in these ashes as either crystalline or amorphous (glass). The results indicated that the amount of free, respirable quartz in these CFA samples ranged from only 0.1 to 1.0 vol% and showed little correlation with the XRD results for the bulk ash. These results are significant in view of the fact that XRD is the traditional method of measuring crystalline silica in dust collected from workplace atmospheres. The results provide a better understanding of studies that indicate very little evidence of a link between human exposure to CFA and silicosis and lung cancer. 24 refs., 8 figs., 4 tabs.
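
    A minimal sketch of a circularity-based shape test of the kind described, where circularity 4πA/P² is near 1 for spherical glass and lower for angular quartz; the threshold and particle measurements are invented, and the real CCSEM classification also uses particle size distributions:

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for angular outlines."""
    return 4.0 * math.pi * area / perimeter**2

# (area um^2, perimeter um) for two hypothetical Si-rich particles.
glass_sphere   = (78.5, 31.4)   # roughly a circle of radius 5
angular_quartz = (50.0, 40.0)   # elongated/angular outline

for name, (a, p) in [("glassy", glass_sphere), ("quartz-like", angular_quartz)]:
    c = circularity(a, p)
    label = "amorphous" if c > 0.85 else "crystalline"   # illustrative threshold
    print(f"{name}: circularity={c:.2f} -> {label}")
```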

  16. Evolutionary developmental psychology

    National Research Council Canada - National Science Library

    King, Ashley C; Bjorklund, David F

    2010-01-01

    The field of evolutionary developmental psychology can potentially broaden the horizons of mainstream evolutionary psychology by combining the principles of Darwinian evolution by natural selection...

  17. Modeling and computations of the intramolecular electron transfer process in the two-heme protein cytochrome c4

    DEFF Research Database (Denmark)

    Natzmutdinov, Renat R.; Bronshtein, Michael D.; Zinkicheva, Tamara T.

    2012-01-01

    The di-heme protein Pseudomonas stutzeri cytochrome c4 (cyt c4) has emerged as a useful model for studying long-range protein electron transfer (ET). Recent experimental observations have shown a dramatically different pattern of intramolecular ET between the two heme groups in different local … We performed computational modeling of the intramolecular ET process by a combination of density functional theory (DFT) and quantum mechanical charge transfer theory to disclose reasons for this difference. We first address the electronic structures of the model heme core with histidine and methionine axial … force were determined using dielectric continuum models. We then calculated the electronic transmission coefficient of the intramolecular ET rate using perturbation theory combined with the electronic wave functions determined by the DFT calculations for different heme group orientations and Fe…
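
    The quantities named here, the electronic transmission (coupling) factor, the reorganization free energy from dielectric continuum models, and the driving force, are the ingredients of the standard nonadiabatic (Marcus-type) rate expression; as background (our gloss, not a formula quoted from the paper):

```latex
k_{\mathrm{ET}} = \frac{2\pi}{\hbar}\,\lvert H_{DA}\rvert^{2}\,
\frac{1}{\sqrt{4\pi\lambda k_{B}T}}\,
\exp\!\left[-\frac{(\Delta G^{\circ}+\lambda)^{2}}{4\lambda k_{B}T}\right]
```

    where H_DA is the donor-acceptor electronic coupling, λ the reorganization energy, and ΔG° the driving force.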

  18. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  19. Quantum computation with two-electron spins in semi-conductor quantum dots

    OpenAIRE

    Hiltunen, Tuukka

    2015-01-01

    A quantum computer would exploit the phenomena of quantum superposition and entanglement in its functioning and with them offer pathways to solving problems that are too hard or complex for even the best classical computers built today. The implementation of a large-scale working quantum computer could bring about a change in our society rivaling the one started by the digital computer. However, the field is still in its infancy and there are many theoretical and practical issues needing to be...

  20. Evolutionary Dynamics of Biological Games

    Science.gov (United States)

    Nowak, Martin A.; Sigmund, Karl

    2004-02-01

    Darwinian dynamics based on mutation and selection form the core of mathematical models for adaptation and coevolution of biological populations. The evolutionary outcome is often not a fitness-maximizing equilibrium but can include oscillations and chaos. For studying frequency-dependent selection, game-theoretic arguments are more appropriate than optimization algorithms. Replicator and adaptive dynamics describe short- and long-term evolution in phenotype space and have found applications ranging from animal behavior and ecology to speciation, macroevolution, and human language. Evolutionary game theory is an essential component of a mathematical and computational approach to biology.
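
    A minimal sketch of the replicator dynamics mentioned above, integrated with a simple Euler scheme for a two-strategy game; the payoff matrix is an illustrative anti-coordination (Hawk-Dove-like) game, not drawn from the article:

```python
import numpy as np

# Payoff matrix A[i, j]: payoff to strategy i against strategy j.
A = np.array([[0.0, 3.0],
              [1.0, 2.0]])

x = np.array([0.1, 0.9])        # initial frequencies of the two strategies
dt, steps = 0.01, 5000

for _ in range(steps):
    f = A @ x                   # fitness of each strategy
    phi = x @ f                 # mean population fitness
    x = x + dt * x * (f - phi)  # replicator equation: dx_i/dt = x_i (f_i - phi)
    x = x / x.sum()             # guard against drift from discretization

print(x)  # converges to the mixed equilibrium (0.5, 0.5) for this payoff matrix
```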

  1. 99mTc Auger electrons - Analysis of the effects of low absorbed doses by computational methods

    Energy Technology Data Exchange (ETDEWEB)

    Tavares, Adriana Alexandre S., E-mail: adriana_tavares@msn.co [Faculdade de Engenharia da Universidade do Porto (FEUP), Rua Dr. Roberto Frias, S/N, 4200-465 Porto (Portugal); Tavares, Joao Manuel R.S., E-mail: tavares@fe.up.p [Faculdade de Engenharia da Universidade do Porto (FEUP), Rua Dr. Roberto Frias, S/N, 4200-465 Porto (Portugal)

    2011-03-15

    We describe here the use of computational methods for the evaluation of low-dose effects on human fibroblasts after irradiation with Technetium-99m (99mTc) Auger electrons. The results suggest a parabolic relationship between the biological effects on fibroblasts irradiated with 99mTc Auger electrons and the total absorbed dose. Additionally, the results at very low absorbed doses may be explained by the bystander effect, which has been implicated in cellular effects at low doses. Further in vitro evaluation will be worthwhile to clarify these findings.
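
    A parabolic dose-response is what the standard linear-quadratic description of radiobiology would produce; as a point of reference (our gloss of "parabolic", not the authors' stated model), the effect E at absorbed dose D takes the form:

```latex
E(D) = \alpha D + \beta D^{2}
```

    with α and β coefficients that depend on the cell type and radiation quality.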

  2. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications

    Directory of Open Access Journals (Sweden)

    Yu Zhao

    2016-03-01

    As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users' compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user's compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.

  3. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications.

    Science.gov (United States)

    Zhao, Yu; Liu, Yide; Lai, Ivan K W; Zhang, Hongfeng; Zhang, Yi

    2016-03-18

    As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users' compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user's compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.

  4. Importance of Accurate Computation of Secondary Electron Emission for ModelingSpacecraft Charging

    OpenAIRE

    Clerc, Sebastien; Dennison, JR

    2005-01-01

    The secondary electron yield is a critical process in establishing the charge balance in spacecraft charging and the subsequent determination of the equilibrium potential. Spacecraft charging codes use a parameterized expression for the secondary electron yield δ(Eo) as a function of incident electron energy, Eo. A critical step in accurately characterizing a particular spacecraft material is establishing the most efficient and accurate way to determine the fitting parameters in terms of the ...
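
    A minimal sketch of one widely used two-parameter fit of this kind, the Sternglass "universal" yield curve with fitting parameters δ_max and E_max; the material values below are placeholders, and actual charging codes may use other functional forms:

```python
import math

def sey_sternglass(E, delta_max, E_max):
    """Sternglass universal curve for the secondary electron yield delta(E)."""
    r = E / E_max
    return 7.4 * delta_max * r * math.exp(-2.0 * math.sqrt(r))

# Placeholder material parameters (delta_max ~ 2 at E_max ~ 300 eV).
for E in (50.0, 300.0, 2000.0):
    print(E, round(sey_sternglass(E, delta_max=2.0, E_max=300.0), 2))
```

    Fitting δ_max and E_max to measured yields is then an ordinary least-squares problem over this curve.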

  5. Ab initio computation of electron affinities of substituted benzalacetophenones (chalcones): a new approach to substituent effects in organic electrochemistry

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, L.D.; Fry, A.J.; Kurzweil, V.C. [Wesleyan University, Middletown, CT (United States). Chemistry Dept.

    2004-12-15

    The electron affinities (EAs) of a training set of 29 monosubstituted benzalacetophenones (chalcones) were computed at the ab initio density functional B3LYP/6-31G* level of theory. The EAs and experimental reduction potentials of the training set are highly linearly correlated (correlation coefficient of 0.969 and standard deviation of 10.8 mV). An additional 72 di-, tri-, and tetrasubstituted chalcones were then synthesized. Their reduction potentials were predicted from computed EAs using the linear correlation derived from the training set. Agreement between the experimental and computed reduction potentials is remarkably good, with a standard deviation of less than 22 mV for this very large set of substances whose potentials extend over a range of almost 700 mV. (author)

  6. Ab initio computation of electron affinities of substituted benzalacetophenones (chalcones): a new approach to substituent effects in organic electrochemistry

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, L.D.; Fry, A.J.; Kurzweil, V.C. [Wesleyan Univ., Middletown, CT (United States). Hall-Atwater Lab.

    2004-12-15

    The electron affinities (EAs) of a training set of 29 monosubstituted benzalacetophenones (chalcones) were computed at the ab initio density functional B3LYP/6-31G* level of theory. The EAs and experimental reduction potentials of the training set are highly linearly correlated (correlation coefficient of 0.969 and standard deviation of 10.8 mV). An additional 72 di-, tri-, and tetrasubstituted chalcones were then synthesized. Their reduction potentials were predicted from computed EAs using the linear correlation derived from the training set. Agreement between the experimental and computed reduction potentials is remarkably good, with a standard deviation of less than 22 mV for this very large set of substances whose potentials extend over a range of almost 700 mV. (Author)
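
    The prediction step amounts to a linear fit and its evaluation. A minimal sketch with fabricated placeholder numbers, purely to show the mechanics, not the paper's data:

```python
import numpy as np

# Hypothetical training data: computed electron affinities (eV) and
# measured reduction potentials (V).
ea_train = np.array([1.10, 1.25, 1.40, 1.55, 1.70])
e_red    = np.array([-1.62, -1.50, -1.37, -1.26, -1.13])

slope, intercept = np.polyfit(ea_train, e_red, 1)   # linear least-squares fit

def predict_potential(ea):
    """Predict a reduction potential from a computed EA via the fitted line."""
    return slope * ea + intercept

print(round(predict_potential(1.48), 3))  # predicted E_red for a new chalcone's EA
```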

  7. The Electronic Mirror: Human-Computer Interaction and Change in Self-Appraisals.

    Science.gov (United States)

    De Laere, Kevin H.; Lundgren, David C.; Howe, Steven R.

    1998-01-01

    Compares humanlike versus machinelike interactional styles of computer interfaces, testing hypotheses that evaluative feedback conveyed through a humanlike interface will have greater impact on individuals' self-appraisals. Reflected appraisals were more influenced by computer feedback than were self-appraisals. Humanlike and machinelike interface…

  8. EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining

    Directory of Open Access Journals (Sweden)

    Sergio Mauricio Martínez Monterrubio

    2015-01-01

    The proper functioning of a hospital computer system is an arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of the entire or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed for achieving a safer computer system.

  9. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    Science.gov (United States)

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  10. EMRlog method for computer security for electronic medical records with logic and data mining.

    Science.gov (United States)

    Martínez Monterrubio, Sergio Mauricio; Frausto Solis, Juan; Monroy Borja, Raúl

    2015-01-01

    The proper functioning of a hospital computer system is an arduous task for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of all or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information, such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed, achieving a safer computer system.

  11. Evolutionary status of Polaris

    Science.gov (United States)

    Fadeyev, Yu. A.

    2015-05-01

    Hydrodynamic models of short-period Cepheids were computed to determine the pulsation period as a function of evolutionary time during the first and third crossings of the instability strip. The equations of radiation hydrodynamics and turbulent convection for radial stellar pulsations were solved with the initial conditions obtained from the evolutionary models of Population I stars (X = 0.7, Z = 0.02) with masses from 5.2 to 6.5 M⊙ and the convective core overshooting parameter 0.1 ≤ αov ≤ 0.3. In Cepheids with period of 4 d the rate of pulsation period change during the first crossing of the instability strip is over 50 times larger than that during the third crossing. Polaris is shown to cross the instability strip for the first time and to be the fundamental mode pulsator. The best agreement between the predicted and observed rates of period change was obtained for the model with mass of 5.4 M⊙ and the overshooting parameter αov = 0.25. The bolometric luminosity and radius are L = 1.26 × 103 L⊙ and R = 37.5 R⊙, respectively. In the HR diagram, Polaris is located at the red edge of the instability strip.

  12. Quantum mechanical computation of structural, electronic, and thermoelectric properties of AgSbSe2

    Directory of Open Access Journals (Sweden)

    M Salimi

    2015-07-01

    Full Text Available In this work, density functional calculations and the Boltzmann semiclassical theory of transport are used to investigate the structural, electronic, and thermoelectric properties of the AgSbSe2 crystal. Following published experimental measurements, the five most likely structures of this compound are considered, and their structural and electronic properties are calculated and compared. Then, the thermoelectric properties (electrical conductivity, electronic contribution to the thermal conductivity, power factor, and Seebeck coefficient) of the three more stable structures are investigated in the constant relaxation time approximation. Finally, the calculated temperature dependence of the Seebeck coefficient is compared with the corresponding experimental measurements of others.
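
    For orientation, the quantities named above follow, within the constant relaxation time approximation, from standard Boltzmann-transport expressions of the kind implemented in BoltzTraP-style codes. This is a sketch for reference; notation and prefactors vary between implementations:

        \sigma_{\alpha\beta}(T;\mu) = \frac{1}{\Omega}\int \bar{\sigma}_{\alpha\beta}(\varepsilon)
            \left[-\frac{\partial f_{\mu}(T;\varepsilon)}{\partial \varepsilon}\right] d\varepsilon,
        \qquad
        \nu_{\alpha\beta}(T;\mu) = \frac{1}{eT\Omega}\int \bar{\sigma}_{\alpha\beta}(\varepsilon)\,
            (\varepsilon-\mu)\left[-\frac{\partial f_{\mu}(T;\varepsilon)}{\partial \varepsilon}\right] d\varepsilon,
        \qquad
        S = \sigma^{-1}\,\nu,

    where \bar{\sigma}_{\alpha\beta}(\varepsilon) = \frac{e^{2}\tau}{N}\sum_{i,\mathbf{k}} v_{\alpha}(i,\mathbf{k})\, v_{\beta}(i,\mathbf{k})\,\delta(\varepsilon-\varepsilon_{i,\mathbf{k}}) is the transport distribution. The constant relaxation time \tau cancels in the Seebeck coefficient S, which is why S is the quantity most directly comparable with experiment.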

  13. Electron-beam lithographic computer-generated holograms designed by direct search coding algorithm

    Science.gov (United States)

    Tamura, Hitoshi; Torii, Yasuhiro

    2009-08-01

    An optimized encoding algorithm is required to produce high-quality computer-generated holograms (CGHs). For this purpose, we have proposed that use of the direct search algorithm (DSA) is effective for encoding Lohmann-type binary amplitude and phase CGHs. However, the DSA requires a long computation time to reach an optimal solution. To solve this problem, we have found that a simultaneously selective direct search algorithm (SDSA) greatly shortens the computation time for encoding a Lohmann-type CGH.
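
    The direct-search encoding named above is closely related to direct binary search: flip one hologram cell at a time and keep the flip only if the reconstruction error decreases. The toy below applies that loop to a binary amplitude hologram with an FFT as the far-field propagator; the target pattern, array sizes and number of sweeps are illustrative assumptions, not the authors' settings.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 64
        target = np.zeros((n, n))
        target[24:40, 24:40] = 1.0                 # desired far-field amplitude (toy)

        holo = rng.integers(0, 2, size=(n, n)).astype(float)   # binary amplitude CGH

        def cost(h):
            # Far-field reconstruction via FFT, compared with the target.
            recon = np.abs(np.fft.fftshift(np.fft.fft2(h)))
            recon /= recon.max() + 1e-12
            return np.sum((recon - target) ** 2)

        best = cost(holo)
        for sweep in range(2):                     # a few sweeps over all cells
            for i in range(n):
                for j in range(n):
                    holo[i, j] = 1 - holo[i, j]    # trial flip
                    c = cost(holo)
                    if c < best:
                        best = c                   # accept an improving flip
                    else:
                        holo[i, j] = 1 - holo[i, j]  # otherwise revert
        print(best)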

  14. A Novel Method for the Discrimination of Semen Arecae and Its Processed Products by Using Computer Vision, Electronic Nose, and Electronic Tongue

    Science.gov (United States)

    Xu, Min; Yang, Shi-Long; Peng, Wei; Liu, Yu-Jie; Xie, Da-Shuai; Li, Xin-Yi; Wu, Chun-Jie

    2015-01-01

    Areca nut, commonly known locally as Semen Arecae (SA) in China, has been used as an important Chinese herbal medicine for thousands of years. The raw SA (RAW) is commonly processed by stir-baking to yellow (SBY), stir-baking to dark brown (SBD), and stir-baking to carbon dark (SBC) for different clinical uses. In our present investigation, intelligent sensory technologies consisting of computer vision (CV), electronic nose (E-nose), and electronic tongue (E-tongue) were employed in order to develop a novel and accurate method for discrimination of SA and its processed products. Firstly, the color parameters and the E-nose and E-tongue sensory responses of the samples were determined. Then, indicative components including 5-hydroxymethyl furfural (5-HMF) and arecoline (ARE) were determined by HPLC. Finally, principal component analysis (PCA) and discriminant factor analysis (DFA) were performed. The results demonstrated that these three instruments can effectively discriminate SA and its processed products. 5-HMF and ARE can reflect the stir-baking degree of SA. Interestingly, the two components showed close correlations to the color parameters and sensory responses of E-nose and E-tongue. In conclusion, this novel method based on CV, E-nose, and E-tongue can be successfully used to discriminate SA and its processed products. PMID:26366185
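
    As a sketch of the chemometric step (PCA on the pooled sensor responses), the snippet below mean-centers a sample-by-sensor matrix and projects it onto the first two principal components via an SVD. The matrix here is random placeholder data standing in for the CV/E-nose/E-tongue measurements.

        import numpy as np

        # Hypothetical sensor matrix: rows = samples (RAW/SBY/SBD/SBC replicates),
        # columns = sensor responses; values are placeholders, not measured data.
        X = np.random.default_rng(1).normal(size=(20, 12))

        Xc = X - X.mean(axis=0)                  # mean-center each sensor channel
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = Xc @ Vt[:2].T                   # projections onto PC1 and PC2
        explained = s**2 / np.sum(s**2)          # variance explained per component
        print(explained[:2], scores.shape)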

  15. Computational modeling of stabilizing the instability of a relativistic electron beam in a dense plasma

    Energy Technology Data Exchange (ETDEWEB)

    Bobylev, Yu. V.; Panin, V. A. [Tula Pedagogical University (Russian Federation); Kuzelev, M. V. [Moscow State University (Russian Federation); Rukhadze, A. A. [Russian Academy of Sciences, Prokhorov Institute of General Physics (Russian Federation)

    2011-12-15

    The nonlinear dynamics of the instability developed upon the interaction between a relativistic electron beam and a dense plasma as a function of the beam density is numerically modeled. The appropriate solutions are obtained and analyzed.

  16. SYSTEM DYNAMICS MODEL FOR EVALUATION OF REUSE OF ELECTRONIC WASTE ORIGINATED FROM PERSONAL COMPUTERS

    National Research Council Canada - National Science Library

    Eugênio Simonetto; Osvaldo Quelhas; Vesna Spasojević Brkić; Goran Putnik; Cátia Alves; Hélio Castro

    2016-01-01

    Information and Communication Technologies (ICT) are part of the day-to-day activities of a large part of the world population; however, their use involves a growing generation of electronic waste (ewaste...

  17. Transport of energetic electrons in solids computer simulation with applications to materials analysis and characterization

    CERN Document Server

    Dapor, Maurizio

    2017-01-01

    This new edition describes, as simply as possible, all the mechanisms of elastic and inelastic scattering of electrons by the atoms of the target. The use of techniques of quantum mechanics for the investigation of interaction processes of electrons with matter is described in detail. It presents the strategies of the Monte Carlo method, as well as numerous comparisons among the results of the simulations and the experimental data available in the literature. New in this edition is the description of the Mermin theory, a comparison between the Mermin theory and the Drude theory, a discussion about the dispersion laws, and details about the calculation of the phase shifts that are used in the relativistic partial wave expansion method. The role of secondary electrons in proton cancer therapy is discussed in the chapter devoted to applications. In this context, Monte Carlo results about the radial distribution of the energy deposited in PMMA by secondary electrons generated by energetic proton beams are presented.

  18. Towards Adaptive Evolutionary Architecture

    DEFF Research Database (Denmark)

    Bak, Sebastian HOlt; Rask, Nina; Risi, Sebastian

    2016-01-01

    This paper presents first results from an interdisciplinary project, in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how living in the future could occur if built spaces could evolve and adapt alongside inhabitants. As such, the present study explores the interdisciplinary possibilities in utilizing computational power to co-create with users and generate designs based on human input. We argue that this could lead to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. Architecture-as-it-could-be is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within...

  19. A comparative computational study of the electronic properties of planar and buckled silicene

    OpenAIRE

    Behera, Harihar; Mukhopadhyay, Gautam

    2012-01-01

    Using full-potential density functional calculations within the local density approximation (LDA), we report our investigation of the structural and electronic properties of silicene (the graphene analogue of silicon), strips of which have recently been synthesized on Ag(110) and Ag(100) surfaces. An assumed planar and an optimized buckled two-dimensional (2D) hexagonal structure have been considered for comparison of their electronic properties. Planar silicene shows a gapless band structure an...

  20. Theory and computation of few-electron atoms in intense laser fields

    CERN Document Server

    Moore, L

    2001-01-01

    The study of the helium atom exposed to an intense laser field forms the topic of this thesis. In the context of laser-atom interactions, a laser is said to be intense if the force it exerts on an electron in an atomic orbital is comparable to the force experienced by that electron due to the binding atomic potential. The electronic response of the helium atom to an intense laser field is governed by the interactions of the two electrons between themselves, with the nucleus and with the field. The problem therefore is the fundamental three-body p... experimental peak laser intensity measurement. At 780 nm preliminary results of a comparable calculation of double-ionization are given. In anticipation of a high-intensity, high-frequency radiation source becoming available in Germany by 2003, a calculation at 14 nm has also been performed. Momentum distributions have revealed the new process of double-electron above-threshold ionization. In this process both electrons absorb excess photons during double-ionization.

  1. Understanding Evolutionary Potential in Virtual CPU Instruction Set Architectures

    OpenAIRE

    David M Bryson; Charles Ofria

    2013-01-01

    We investigate fundamental decisions in the design of instruction set architectures for linear genetic programs that are used as both model systems in evolutionary biology and underlying solution representations in evolutionary computation. We subjected digital organisms with each tested architecture to seven different computational environments designed to present a range of evolutionary challenges. Our goal was to engineer a general purpose architecture that would be effective under a broad...

  2. Reconciliation of the cloud computing model with US federal electronic health record regulations

    OpenAIRE

    Schweitzer, Eugene J

    2011-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to tradition...

  3. Computer control of a scanning electron microscope for digital image processing of thermal-wave images

    Science.gov (United States)

    Gilbert, Percy; Jones, Robert E.; Kramarchuk, Ihor; Williams, Wallace D.; Pouch, John J.

    1987-01-01

    Using a recently developed technology called thermal-wave microscopy, NASA Lewis Research Center has developed a computer controlled submicron thermal-wave microscope for the purpose of investigating III-V compound semiconductor devices and materials. This paper describes the system's design and configuration and discusses the hardware and software capabilities. Knowledge of the Concurrent 3200 series computers is needed for a complete understanding of the material presented. However, concepts and procedures are of general interest.

  4. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain the premature convergence of the quantum evolutionary algorithm. The proposed algorithm adopts a chaotic initialization method to generate the initial population, which will form ... and enhance the global search ability. A large number of tests show that the proposed algorithm has a higher convergence speed and better optimizing ability than the quantum evolutionary algorithm, the real-coded quantum evolutionary algorithm and the hybrid quantum genetic algorithm. Tests also show that when chaos is introduced to the quantum evolutionary algorithm, the hybrid chaotic search strategy is superior to the carrier chaotic strategy, and has better comprehensive performance than the chaotic mutation strategy in most cases. In particular, the proposed algorithm is the only one that has a 100% convergence rate in all...
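
    A minimal sketch of the chaotic-initialization idea, assuming a logistic-map generator and a standard Q-bit encoding (rotation angles theta with amplitudes alpha = cos(theta), beta = sin(theta)); the paper's exact chaotic map and encoding may differ.

        import numpy as np

        pop_size, n_genes = 20, 16
        theta = np.empty((pop_size, n_genes))

        x = 0.345                                    # logistic-map seed in (0, 1)
        for i in range(pop_size):
            for j in range(n_genes):
                x = 4.0 * x * (1.0 - x)              # fully chaotic logistic map
                theta[i, j] = x * np.pi / 2          # rotation angle in [0, pi/2]

        # Q-bit amplitudes with alpha**2 + beta**2 = 1.
        alpha, beta = np.cos(theta), np.sin(theta)

        # Observe a classical binary population from the quantum population.
        bits = (np.random.default_rng(2).random(theta.shape) < beta**2).astype(int)
        print(bits[:2])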

  5. Late enhancement of the left ventricular myocardium in young patients with hypertrophic cardiomyopathy by electron beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kurosaki, Kenichi; Yoshibayashi, Muneo; Tsukano, Shinya; Ono, Yasuo; Arakaki, Yoshio; Naito, Hiroaki; Echigo, Shigeyuki [National Cardiovascular Center, Suita, Osaka (Japan)

    2001-05-01

    In the assessment of myocardial characteristics with computed tomography, late enhancement (an intense stain in the delayed-phase contrast image) is an abnormal finding and is thought to represent fibrotic change. The purpose of this study was to investigate the clinical importance of late enhancement in young patients with hypertrophic cardiomyopathy. Forty-five patients with hypertrophic cardiomyopathy, aged 1 to 24 years, were examined by electron beam computed tomography. We also assessed the clinical data on these patients. Late enhancement was found in 29 (64%) patients, usually as a patchy, stained area in the myocardium. Of the 29 patients with late enhancement, seven (24%) had syncopal episodes and seven (24%) had a family history of sudden death. In contrast, none (0%) of the 16 patients without late enhancement had a syncopal episode or a family history of sudden death (p<0.05). Twenty-four-hour electrocardiographic monitoring was performed for 31 patients. All patients with ventricular tachycardia were in the group with late enhancement [10/23 (43%) vs 0/8 (0%), p<0.05]. Thirty-seven patients were examined by thallium scintigraphy. A perfusion defect was found more frequently in patients with late enhancement than in patients without [14/26 (54%) vs 2/11 (18%), p<0.05]. These data suggest that late enhancement shown with electron beam computed tomography is related to syncopal episodes, a family history of sudden death, ventricular tachycardia, and myocardial damage in young patients with hypertrophic cardiomyopathy. (author)

  6. PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F.; Fernandez-Varea, J.M.; Baro, J.; Sempau, J.

    1996-07-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. (Author) 108 refs.

  7. PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F.; Fernandez-Varea, J.M.; Baro, J.; Sempau, J.

    1996-10-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from ~1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.

  8. Calculation of dipole polarizability derivatives of adamantane and their use in electron scattering computations

    DEFF Research Database (Denmark)

    Sauer, Stephan P. A.; Paidarová, Ivana; Čársky, Petr

    2016-01-01

    In this paper we present calculations of the static polarizability and its derivatives for the adamantane molecule carried out at the density functional theory level using the B3LYP exchange correlation functional and Sadlej's polarized valence triple zeta basis set. It is shown that the polarizability tensor is necessary to correct the long-range behavior of DFT functionals used in electron-molecule scattering calculations. The impact of such a long-range correction is demonstrated on elastic and vibrationally inelastic electron collisions with adamantane, a molecule representing a large polyatomic target for electron scattering calculations.

  9. Evolutionary Sound Synthesis Controlled by Gestural Data

    Directory of Open Access Journals (Sweden)

    Jose Fornari

    2011-05-01

    Full Text Available This article focuses on the interdisciplinary research involving Computer Music and Generative Visual Art. We describe the implementation of two interactive artistic systems based on principles of Gestural Data retrieval (WILSON, 2002) and self-organization (MORONI, 2003), used to control an Evolutionary Sound Synthesis method (ESSynth). The first implementation uses, as gestural data, image mapping of handmade drawings. The second one uses gestural data from the dynamic body movements of dance. The resulting computer output is generated by an interactive system implemented in Pure Data (PD). This system uses principles of Evolutionary Computation (EC), which yields the generation of a synthetic adaptive population of sound objects. Considering that music could be seen as “organized sound”, the contribution of our study is to develop a system that aims to generate "self-organized sound" – a method that uses evolutionary computation to bridge between gesture, sound and music.

  10. Computer assisted assembly of connectomes from electron micrographs: application to Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Meng Xu

    Full Text Available A rate-limiting step in determining a connectome, the set of all synaptic connections in a nervous system, is extraction of the relevant information from serial electron micrographs. Here we introduce a software application, Elegance, that speeds acquisition of the minimal dataset necessary, allowing the discovery of new connectomes. We have used Elegance to obtain new connectivity data in the nematode worm Caenorhabditis elegans. We analyze the accuracy that can be obtained, which is limited by unresolvable ambiguities at some locations in electron microscopic images. Elegance is useful for reconstructing connectivity in any region of neuropil of sufficiently small size.

  11. New method in computer simulations of electron and ion densities and temperatures in the plasmasphere and low-latitude ionosphere

    Directory of Open Access Journals (Sweden)

    A. V. Pavlov

    Full Text Available A new theoretical model of the Earth’s low- and mid-latitude ionosphere and plasmasphere has been developed. The new model uses a new method in ionospheric and plasmaspheric simulations which is a combination of the Eulerian and Lagrangian approaches in model simulations. The electron and ion continuity and energy equations are solved in a Lagrangian frame of reference which moves with an individual parcel of plasma with the local plasma drift velocity perpendicular to the magnetic and electric fields. As a result, only the time-dependent, one-dimensional electron and ion continuity and energy equations are solved in this Lagrangian frame of reference. The new method makes use of an Eulerian computational grid which is fixed in space co-ordinates and chooses the set of the plasma parcels at every time step, so that all the plasma parcels arrive at points which are located between grid lines of the regularly spaced Eulerian computational grid at the next time step. The solution values of electron and ion densities Ne and Ni and temperatures Te and Ti on the Eulerian computational grid are obtained by interpolation. Equations which determine the trajectory of the ionospheric plasma perpendicular to magnetic field lines and take into account that magnetic field lines are "frozen" in the ionospheric plasma are derived and included in the new model. We have presented a comparison between the modeled NmF2 and hmF2 and those observed at the anomaly crest and close to the geomagnetic equator simultaneously by the Huancayo, Chiclayo, Talara, Bogota, Panama, and Puerto Rico ionospheric sounders during the 7 October 1957 geomagnetically quiet time period at solar maximum. The model calculations show that there is a need to revise the model local time dependence of the equatorial upward E × B drift velocity given by Scherliess and Fejer (1999) at solar maximum during quiet

  12. New method in computer simulations of electron and ion densities and temperatures in the plasmasphere and low-latitude ionosphere

    Directory of Open Access Journals (Sweden)

    A. V. Pavlov

    2003-07-01

    Full Text Available A new theoretical model of the Earth’s low- and mid-latitude ionosphere and plasmasphere has been developed. The new model uses a new method in ionospheric and plasmaspheric simulations which is a combination of the Eulerian and Lagrangian approaches in model simulations. The electron and ion continuity and energy equations are solved in a Lagrangian frame of reference which moves with an individual parcel of plasma with the local plasma drift velocity perpendicular to the magnetic and electric fields. As a result, only the time-dependent, one-dimensional electron and ion continuity and energy equations are solved in this Lagrangian frame of reference. The new method makes use of an Eulerian computational grid which is fixed in space co-ordinates and chooses the set of the plasma parcels at every time step, so that all the plasma parcels arrive at points which are located between grid lines of the regularly spaced Eulerian computational grid at the next time step. The solution values of electron and ion densities Ne and Ni and temperatures Te and Ti on the Eulerian computational grid are obtained by interpolation. Equations which determine the trajectory of the ionospheric plasma perpendicular to magnetic field lines and take into account that magnetic field lines are "frozen" in the ionospheric plasma are derived and included in the new model. We have presented a comparison between the modeled NmF2 and hmF2 and those observed at the anomaly crest and close to the geomagnetic equator simultaneously by the Huancayo, Chiclayo, Talara, Bogota, Panama, and Puerto Rico ionospheric sounders during the 7 October 1957 geomagnetically quiet time period at solar maximum. The model calculations show that there is a need to revise the model local time dependence of the equatorial upward E × B drift velocity given by Scherliess and Fejer (1999) at solar maximum during quiet daytime equinox conditions. Uncertainties in the calculated Ni
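
    The combined Eulerian/Lagrangian idea described above can be illustrated by a one-dimensional semi-Lagrangian step: trace each grid node's plasma parcel back along the drift velocity and interpolate the transported quantity at the departure point. The grid, velocity and density below are toy placeholders, not the model's fields.

        import numpy as np

        nx, dt = 200, 0.5
        x = np.linspace(0.0, 100.0, nx)              # fixed Eulerian grid
        v = 0.3 * np.sin(2 * np.pi * x / 100.0)      # drift velocity at grid points
        ne = np.exp(-((x - 50.0) / 10.0) ** 2)       # electron density N_e (toy)

        # Lagrangian step: find the departure point of the parcel that arrives
        # at each grid node after one time step, then interpolate N_e there,
        # which places the advanced solution back onto the Eulerian grid.
        x_dep = x - v * dt
        ne = np.interp(x_dep, x, ne, period=100.0)
        print(ne.max())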

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. Analogue alternative the electronic analogue computer in Britain and the USA, 1930-1975

    CERN Document Server

    Small, James S

    2013-01-01

    We are in the midst of a digital revolution - until recently, the majority of appliances used in everyday life have been developed with analogue technology. Now, either at home or out and about, we are surrounded by digital technology such as digital 'film', audio systems, computers and telephones. From the late 1940s until the 1970s, analogue technology was a genuine alternative to digital, and the two competing technologies ran parallel with each other. During this period, a community of engineers, scientists, academics and businessmen continued to develop and promote the analogue computer.

  15. Mechanism of surface morphology in electron beam melting of Ti6Al4V based on computational flow patterns

    Science.gov (United States)

    Ge, Wenjun; Han, Sangwoo; Fang, Yuchao; Cheon, Jason; Na, Suck Joo

    2017-10-01

    In this study, a 3D numerical model was proposed that uses the computational fluid dynamics (CFD) method to investigate molten pool formation in electron beam melting under different process parameters. Electron beam ray tracking was used to determine energy deposition in the powder bed model. The melt tracks obtained in this study can be divided into three categories: a balling pattern, distortion pattern and straight pattern. The 3D mesoscale model revealed that it is possible to obtain different molten pool temperature distributions, flow patterns and top surface morphologies using different process parameters. Detailed analysis was performed on the formation mechanism of both the balling defect and distortion pattern. The simulation results of the top surface morphology were also compared with experimental results and showed good agreement.

  16. EVALUATION OF COMPUTER-CONTROLLED SCANNING ELECTRON MICROSCOPY APPLIED TO AN AMBIENT URBAN AEROSOL SAMPLE

    Science.gov (United States)

    Recent interest in monitoring and speciation of particulate matter has led to increased application of scanning electron microscopy (SEM) coupled with energy-dispersive x-ray analysis (EDX) to individual particle analysis. SEM/EDX provides information on the size, shape, co...

  17. Computations on injection into organics : or how to let electrons shine

    NARCIS (Netherlands)

    Uijttewaal, Mattheus Adrianus

    2007-01-01

    This thesis studies various aspects of electron injection into organic light-emitting diodes (OLEDs) using density functional theory and the master equation approach (only the last chapter). The first part of the thesis studies the relation between the work function and the surface stability of a

  18. An analytical method for computing voxel S values for electrons and photons.

    Science.gov (United States)

    Amato, Ernesto; Minutoli, Fabio; Pacilio, Massimiliano; Campenni, Alfredo; Baldari, Sergio

    2012-11-01

    The use of voxel S values (VSVs) is perhaps the most common approach to radiation dosimetry for nonuniform distributions of activity within organs or tumors. However, VSVs are currently available only for a limited number of voxel sizes and radionuclides. The objective of this study was to develop a general method to evaluate them for any spectrum of electrons and photons in any cubic voxel dimension of practical interest for clinical dosimetry in targeted radionuclide therapy. The authors developed a Monte Carlo simulation in Geant4 in order to evaluate the energy deposited per disintegration (E(dep)) in a voxelized region of soft tissue from monoenergetic electrons (10-2000 keV) or photons (10-1000 keV) homogeneously distributed in the central voxel, considering voxel dimensions ranging from 3 mm to 10 mm. E(dep) was represented as a function of a dimensionless quantity termed the "normalized radius," R(n) = R∕l, where l is the voxel size and R is the distance from the origin. The authors introduced two parametric functions in order to fit the electron and photon results, and they interpolated the parameters to derive VSVs for any energy and voxel side within the ranges mentioned above. In order to validate the results, the authors determined VSV for two radionuclides ((131)I and (89)Sr) and two voxel dimensions and they compared them with reference data. A validation study in a simple sphere model, accounting for tissue inhomogeneities, is presented. The E(dep)(R(n)) for both monoenergetic electrons and photons exhibit a smooth variation with energy and voxel size, implying that VSVs for monoenergetic electrons or photons may be derived by interpolation over the range of energies and dimensions considered. By integration, S values for continuous emission spectra from β(-) decay may be derived as well. The approach allows the determination of VSVs for monoenergetic (Auger or conversion) electrons and (x-ray or gamma-ray) photons by means of two functions whose
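
    A sketch of how S values for a real emission spectrum follow from the monoenergetic results described above: interpolate E_dep over the tabulated energies, weight by the beta spectrum, integrate, and divide by the voxel mass. All numbers below are placeholders, not the authors' tabulated values.

        import numpy as np

        # Placeholder table: E_dep (MeV per decay) in the source voxel for
        # monoenergetic electrons, vs energy in keV (cf. the 10-2000 keV range).
        e_grid = np.array([10., 50., 100., 300., 600., 1000., 2000.])
        edep_tab = np.array([0.0099, 0.049, 0.096, 0.27, 0.50, 0.78, 1.30])

        # Toy beta spectrum on a uniform energy grid, normalized to unit area.
        e_beta = np.linspace(10., 600., 60)
        de = e_beta[1] - e_beta[0]
        p_beta = (e_beta / 600.0) * (1.0 - e_beta / 600.0)
        p_beta /= p_beta.sum() * de

        # Spectrum-weighted energy deposited per decay (MeV).
        edep_spec = (np.interp(e_beta, e_grid, edep_tab) * p_beta).sum() * de

        # Self-dose S value for a 3 mm soft-tissue voxel, in Gy per decay.
        l_cm, rho = 0.3, 1.04                        # voxel side (cm), density (g/cm^3)
        mass_kg = rho * l_cm**3 * 1e-3
        s_value = edep_spec * 1.602e-13 / mass_kg    # MeV -> J, then / kg
        print(s_value)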

  19. Optimized electron beam writing strategy for fabricating computer-generated holograms based on an effective medium approach.

    Science.gov (United States)

    Freese, Wiebke; Kämpfe, Thomas; Rockstroh, Werner; Kley, Ernst-Bernhard; Tünnermann, Andreas

    2011-04-25

    Recent research revealed that using the effective medium approach to generate arbitrary multi-phase level computer-generated holograms is a promising alternative to the conventional multi-height level approach. Although this method reduces the fabrication effort using one-step binary lithography, the subwavelength patterning process remains a huge challenge, particularly for large-scale applications. To reduce the writing time on variable shaped electron beam writing systems, an optimized strategy based on an appropriate reshaping of the binary subwavelength structures is illustrated. This strategy was applied to fabricate a three-phase level CGH in the visible range, showing promising experimental results.

  20. Substituent effects on the electron affinities and ionization energies of tria-, penta-, and heptafulvenes: a computational investigation

    DEFF Research Database (Denmark)

    Dahlstrand, Christian; Yamazaki, Kaoru; Kilså, Kristine

    2010-01-01

    The extent of substituent influence on the vertical electron affinities (EAs) and ionization energies (IEs) of 43 substituted tria-, penta-, and heptafulvenes was examined computationally at the OVGF/6-311G(d)//B3LYP/6-311G(d) level of theory and compared with those of tetracyanoquinodimethane. ... The EAs and IEs were rationalized by qualitative arguments based on frontier orbital symmetries for the different fulvene classes with either X or Y being constant. The minimum and maximum values found for the calculated EAs of the tria-, penta-, and heptafulvenes were 0.51-2.05, 0.24-3.63, and 0...

  1. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra.

    Science.gov (United States)

    Goings, Joshua J; Li, Xiaosong

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.
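
    The time-domain bookkeeping behind such a real-time approach can be sketched as follows: record a signal after a delta-function perturbation, damp it to give finite linewidths, and Fourier transform to obtain the whole band spectrum at once. The synthetic two-mode signal below, with contributions of opposite sign mimicking rotatory strengths of opposite sign, is only a stand-in for the quantities an RT-TDDFT code would actually propagate.

        import numpy as np

        dt, nsteps = 0.02, 20000                    # time step and run length (a.u.)
        t = dt * np.arange(nsteps)

        # Two damped oscillations of opposite sign, standing in for the induced
        # signal recorded after a delta-function "kick".
        signal = (np.sin(0.30 * t) - 0.7 * np.sin(0.38 * t)) * np.exp(-t / 150.0)

        damp = np.exp(-t / 100.0)                   # extra damping -> finite linewidth
        freqs = 2 * np.pi * np.fft.rfftfreq(nsteps, d=dt)
        spectrum = np.imag(np.fft.rfft(signal * damp)) * dt

        # Peaks of opposite sign partially cancel when states are closely spaced,
        # which is the washing-out effect described in the abstract.
        print(freqs[np.argmax(np.abs(spectrum))])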

  2. Electronic cleansing for computed tomography (CT) colonography using a scale-invariant three-material model

    NARCIS (Netherlands)

    Serlie, Iwo W. O.; Vos, Frans M.; Truyen, Roel; Post, Frits H.; Stoker, Jaap; van Vliet, Lucas J.

    2010-01-01

    A well-known reading pitfall in computed tomography (CT) colonography is posed by artifacts at T-junctions, i.e., locations where air-fluid levels interface with the colon wall. This paper presents a scale-invariant method to determine material fractions in voxels near such T-junctions. The proposed

  3. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    NARCIS (Netherlands)

    Baart, T.A.; Eendebak, P.T.; Reichl, C.; Wegscheider, W.; Vandersypen, L.M.K.

    2016-01-01

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the

  4. Encoding neural and synaptic functionalities in electron spin: A pathway to efficient neuromorphic computing

    Science.gov (United States)

    Sengupta, Abhronil; Roy, Kaushik

    2017-12-01

    Present day computers expend orders of magnitude more computational resources to perform various cognitive and perception related tasks that humans routinely perform every day. This has recently resulted in a seismic shift in the field of computation where research efforts are being directed to develop a neurocomputer that attempts to mimic the human brain by nanoelectronic components and thereby harness its efficiency in recognition problems. Bridging the gap between neuroscience and nanoelectronics, this paper attempts to provide a review of the recent developments in the field of spintronic device based neuromorphic computing. Description of various spin-transfer torque mechanisms that can be potentially utilized for realizing device structures mimicking neural and synaptic functionalities is provided. A cross-layer perspective extending from the device to the circuit and system level is presented to envision the design of an All-Spin neuromorphic processor enabled with on-chip learning functionalities. Device-circuit-algorithm co-simulation framework calibrated to experimental results suggest that such All-Spin neuromorphic systems can potentially achieve almost two orders of magnitude energy improvement in comparison to state-of-the-art CMOS implementations.

  5. Investigation and Study of Computational Techniques for Design and Fabrication of Integrated Electronic Circuits

    Science.gov (United States)

    1975-09-01

    that this aspect of IC design can be accomplished on a computer prior to committing a new product to manufacturing. This procedure ... amount of CPU time (he claimed one week). For this reason, Motorola is presently working on this problem, but with little success. It was stated that

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  9. Formalization and computation of quality measures based on electronic medical records.

    Science.gov (United States)

    Dentler, Kathrin; Numans, Mattijs E; ten Teije, Annette; Cornet, Ronald; de Keizer, Nicolette F

    2014-01-01

    Ambiguous definitions of quality measures in natural language impede their automated computability and also the reproducibility, validity, timeliness, traceability, comparability, and interpretability of computed results. Therefore, quality measures should be formalized before their release. We have previously developed and successfully applied a method for clinical indicator formalization (CLIF). The objective of our present study is to test whether CLIF is generalizable--that is, applicable to a large set of heterogeneous measures of different types and from various domains. We formalized the entire set of 159 Dutch quality measures for general practice, which contains structure, process, and outcome measures and covers seven domains. We relied on a web-based tool to facilitate the application of our method. Subsequently, we computed the measures on the basis of a large database of real patient data. Our CLIF method enabled us to fully formalize 100% of the measures. Owing to missing functionality, the accompanying tool could support full formalization of only 86% of the quality measures into Structured Query Language (SQL) queries. The remaining 14% of the measures required manual application of our CLIF method by directly translating the respective criteria into SQL. The results obtained by computing the measures show a strong correlation with results computed independently by two other parties. The CLIF method covers all quality measures after having been extended by an additional step. Our web tool requires further refinement for CLIF to be applied completely automatically. We therefore conclude that CLIF is sufficiently generalizable to be able to formalize the entire set of Dutch quality measures for general practice.
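
    The end point of the formalization pipeline, a quality measure executable as a query over patient data, can be sketched as below. The table, columns, and the measure itself are hypothetical stand-ins, not the Dutch general-practice measures or the CLIF tool's actual output.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE patients (id INTEGER, diabetic INTEGER, hba1c_tested INTEGER);
        INSERT INTO patients VALUES (1,1,1),(2,1,0),(3,0,0),(4,1,1);
        """)

        # A formalized process measure: of the diabetic patients (denominator),
        # how many had an HbA1c test recorded (numerator)?
        denominator = "SELECT COUNT(*) FROM patients WHERE diabetic = 1"
        numerator = denominator + " AND hba1c_tested = 1"

        num = con.execute(numerator).fetchone()[0]
        den = con.execute(denominator).fetchone()[0]
        print(f"process measure: {num}/{den} = {num / den:.0%}")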

  10. Atomic bonding effects in annular dark field scanning transmission electron microscopy. I. Computational predictions

    Energy Technology Data Exchange (ETDEWEB)

    Odlyzko, Michael L.; Mkhoyan, K. Andre, E-mail: mkhoyan@umn.edu [Department of Chemical Engineering and Materials Science, University of Minnesota, Minneapolis, Minnesota 55455 (United States); Himmetoglu, Burak [Department of Chemical Engineering and Materials Science, University of Minnesota, Minneapolis, Minnesota 55455 and Materials Department, University of California, Santa Barbara, California 93106 (United States); Cococcioni, Matteo [Department of Chemical Engineering and Materials Science, University of Minnesota, Minneapolis, Minnesota 55455 and Theory and Simulations of Materials, National Centre for Computational Design and Discovery of Novel Materials, École polytechnique fédérale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland)

    2016-07-15

    Annular dark field scanning transmission electron microscopy (ADF-STEM) image simulations were performed for zone-axis-oriented light-element single crystals, using a multislice method adapted to include charge redistribution due to chemical bonding. Examination of these image simulations alongside calculations of the propagation of the focused electron probe reveals that the evolution of the probe intensity with thickness exhibits significant sensitivity to interatomic charge transfer, accounting for the observed thickness-dependent bonding sensitivity of contrast in all ADF-STEM imaging conditions. Because changes in image contrast relative to conventional neutral atom simulations scale directly with the net interatomic charge transfer, the strongest effects are seen in crystals with highly polar bonding, while no effects are seen for nonpolar bonding. Although the bonding dependence of ADF-STEM image contrast varies with detector geometry, imaging parameters, and material temperature, these simulations predict the bonding effects to be experimentally measurable.

  11. Software abstractions and computational issues in parallel structure adaptive mesh methods for electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Weare, J.; Ong, E.; Baden, S.

    1997-05-01

    We have applied structured adaptive mesh refinement techniques to the solution of the LDA equations for electronic structure calculations. Local spatial refinement concentrates memory resources and numerical effort where they are most needed, near the atomic centers and in regions of rapidly varying charge density. The structured grid representation enables us to employ efficient iterative solver techniques such as conjugate gradient with FAC multigrid preconditioning. We have parallelized our solver using an object-oriented adaptive mesh refinement framework.

  12. Evolutionary molecular medicine.

    Science.gov (United States)

    Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S

    2012-05-01

    Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.

  13. Self Assembled Semiconductor Quantum Dots for Spin Based All Optical and Electronic Quantum Computing

    Science.gov (United States)

    2008-04-17


  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. Computational studies of the electronic, conductivities, and spectroscopic properties of hydrolysed Ru(II) anticancer complexes.

    Science.gov (United States)

    Adeniyi, Adebayo A; Ajibade, Peter A

    2013-11-01

    The mechanism of activation of metal-based anticancer agents has been reported to proceed through hydrolysis. In this study, a computational method was used to gain insight into the correlation between the chemistry of hydrolysis and the anticancer activities of selected Ru(II)-based complexes. Interestingly, we observed that the mechanism of activation by hydrolysis and the consequent anticancer activities are associated with favourable thermodynamic changes, higher hyperpolarizability (β), a lower band gap and a higher first-order net current. The Fermi contact (FC) and spin dipole (SD) terms are found to be the two most significant Ramsey terms that determine the spin-spin couplings (J, in Hz) of most of the existing bonds in the complexes. Many of the computed properties give insight into the change in the chemistry of the complexes due to hydrolysis. Besides the strong correlations of the computed properties with the anticancer activities of the complexes, using the quantum theory of atoms in molecules (QTAIM) to analyse the spectroscopic properties shows a stronger correlation between the spectroscopic properties of the Ru atom and the reported anticancer activities than the sum over the spectroscopic properties of all atoms in the complexes. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. An effective chaos-geometric computational approach to analysis and prediction of evolutionary dynamics of the environmental systems: Atmospheric pollution dynamics

    Science.gov (United States)

    Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Bunyakova, Yu Ya; Florko, T. A.; Agayar, E. V.; Solyanikova, E. P.

    2017-10-01

    The present paper concerns the results of a computational study of the dynamics of atmospheric pollutant (nitrogen dioxide, sulphur dioxide, etc.) concentrations in the atmosphere of industrial cities (Odessa) using the methods of dynamical systems and chaos theory. Chaotic behaviour in the nitrogen dioxide and sulphurous anhydride concentration time series at several sites of the Odessa city is numerically investigated. As usual, to reconstruct the corresponding attractor, the time delay and embedding dimension are needed. The former is determined by the methods of the autocorrelation function and average mutual information, and the latter is calculated by means of a correlation dimension method and the algorithm of false nearest neighbours. Further, the spectrum of Lyapunov exponents, the Kaplan-Yorke dimension and the Kolmogorov entropy are computed. A low-dimensional chaos has been found to exist in the time series of the atmospheric pollutant concentrations.
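
    A compact sketch of the first two reconstruction steps named above: choose a time delay from the autocorrelation function and build the delay-embedded attractor. The series here is synthetic, standing in for a pollutant concentration record, and a full analysis would pair this with average mutual information and false-nearest-neighbour checks.

        import numpy as np

        def delay_embed(series, dim, tau):
            """Phase-space reconstruction by the method of delays."""
            n = len(series) - (dim - 1) * tau
            return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

        def first_acf_zero(series):
            """A common heuristic for the delay: first zero crossing of the ACF."""
            s = series - series.mean()
            acf = np.correlate(s, s, mode="full")[len(s) - 1:]
            acf /= acf[0]
            return int(np.argmax(acf < 0.0))

        # Synthetic stand-in for an hourly NO2 concentration record.
        t = np.arange(5000) * 0.05
        x = (np.sin(t) + 0.5 * np.sin(2.2 * t)
             + 0.1 * np.random.default_rng(3).normal(size=t.size))

        tau = first_acf_zero(x)
        attractor = delay_embed(x, dim=3, tau=tau)
        print(tau, attractor.shape)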

  17. The role of electrostatics in TrxR electron transfer mechanism: A computational approach.

    Science.gov (United States)

    Teixeira, Vitor H; Capacho, Ana Sofia C; Machuqueiro, Miguel

    2016-12-01

    Thioredoxin reductase (TrxR) is an important enzyme in the control of the intracellular reduced redox environment. It transfers electrons from NADPH to several molecules, including its natural partner, thioredoxin. Although there is a generally accepted model describing how the electrons are transferred along TrxR, which involves a flexible arm working as a "shuttle," the molecular details of such mechanism are not completely understood. In this work, we use molecular dynamics simulations with Poisson-Boltzmann/Monte Carlo pKa calculations to investigate the role of electrostatics in the electron transfer mechanism. We observed that the combination of redox/protonation states of the N-terminal (FAD and Cys59/64) and C-terminal (Cys497/Selenocysteine498) redox centers defines the preferred relative positions and allows for the flexible arm to work as the desired "shuttle." Changing the redox/ionization states of those key players, leads to electrostatic triggers pushing the arm into the pocket when oxidized, and pulling it out, once it has been reduced. The calculated pKa values for Cys497 and Selenocysteine498 are 9.7 and 5.8, respectively, confirming that the selenocysteine is indeed deprotonated at physiological pH. This can be an important advantage in terms of reactivity (thiolate/selenolate are more nucleophilic than thiol/selenol) and ability to work as an electrostatic trigger (the "shuttle" mechanism) and may be the reason why TrxR uses selenium instead of sulfur. Proteins 2016; 84:1836-1843. © 2016 Wiley Periodicals, Inc.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. High-resolution electron microscope and computed images of human tooth enamel crystals.

    Science.gov (United States)

    Brés, E F; Barry, J C; Hutchison, J L

    1985-03-01

    The structure of human enamel crystallites has been studied at a near-atomic level by high-resolution electron microscopy. Electron micrographs have been obtained from crystallites present in human enamel with a structure resolution of 0.2 nm in the [0001], [1210], [1213], [1100] and [4510] zone-axis directions. In most cases it was possible to match the experimental images with images calculated using the atomic positions of mineral hydroxyapatite. However, in some cases a discrepancy between calculated and experimental image detail was observed in the c direction of the [1210] and the [1100] images. This shows: (i) a structural heterogeneity of the crystals, and (ii) a loss of hexagonal symmetry of the structure. The resolution required to distinguish individual atomic sites in the different zones has been determined, and this will provide a useful basis for future work, as the determination of the "real structure" of biological crystals is of prime importance for the study of calcification mechanisms (crystal growth), biological properties and destructive phenomena of calcified tissues (i.e., dental caries and bone resorption).

  1. Computation of Electron Impact Ionization Cross sections of Iron Hydrogen Clusters - Relevance in Fusion Plasmas

    Science.gov (United States)

    Patel, Umang; Joshipura, K. N.

    2017-04-01

    Plasma-wall interaction (PWI) is one of the key issues in nuclear fusion research. In nuclear fusion devices, such as the JET tokamak or ITER, first-wall materials will be directly exposed to plasma components. Erosion of first-wall materials is a consequence of the impact of hydrogen and its isotopes as the main constituents of the hot plasma. Besides the formation of gas-phase atomic species in various charge states, di- and polyatomic molecular species are expected to be formed via PWI processes. These compounds may profoundly disturb the fusion plasma and may lead to unfavorable re-deposition of materials and composites in other areas of the vessel. Interactions between atoms and molecules, as well as the transport of impurities, are of interest for the modelling of fusion plasmas. Electron-impact ionization cross-sections (Qion) describe one such process, which is also important in low-temperature plasma processing, astrophysics, etc. We report electron-impact Qion for iron hydrogen clusters, FeHn (n = 1 to 10), from the ionization threshold to 2000 eV. A semi-empirical approach called Complex Scattering Potential - Ionization Contribution (CSP-ic) has been employed for the reported calculations. In the context of fusion-relevant species, Qion were previously reported for beryllium and its hydrides, tungsten and its oxides, and clusters of beryllium-tungsten by Huber et al. Iron hydrogen clusters are another such species, whose Qion were calculated through the DM and BEB formalisms and compared with the present calculations.

  2. Synthesis of Novel Derivatives of Carbazole-Thiophene, Their Electronic Properties, and Computational Studies

    Directory of Open Access Journals (Sweden)

    E. F. Damit

    2016-01-01

    A series of carbazole-thiophene dimers, P1–P9, were synthesized using Suzuki-Miyaura and Ullmann coupling reactions. In P1–P9, carbazole-thiophenes were linked at the N-9 position to different core groups via biphenyl, dimethylbiphenyl, and phenyl. Electronic properties were evaluated by UV-Vis spectroscopy, cyclic voltammetry, and theoretical calculations. In particular, the effects of conjugation connectivity on photophysical and electrochemical properties, as well as the correlation between carbazole-thiophene and the core, were studied. Carbazole connecting with thiophenes at the 3,6-positions and the phenyl group as a core group leads to increased stabilization of the HOMO and LUMO energy levels, where the bandgap (ΔE) is significantly reduced.

  3. High-order epistasis shapes evolutionary trajectories.

    Science.gov (United States)

    Sailer, Zachary R; Harms, Michael J

    2017-05-01

    High-order epistasis-where the effect of a mutation is determined by interactions with two or more other mutations-makes small, but detectable, contributions to genotype-fitness maps. While epistasis between pairs of mutations is known to be an important determinant of evolutionary trajectories, the evolutionary consequences of high-order epistasis remain poorly understood. To determine the effect of high-order epistasis on evolutionary trajectories, we computationally removed high-order epistasis from experimental genotype-fitness maps containing all binary combinations of five mutations. We then compared trajectories through maps both with and without high-order epistasis. We found that high-order epistasis strongly shapes the accessibility and probability of evolutionary trajectories. A closer analysis revealed that the magnitude of epistasis, not its order, predicts its effects on evolutionary trajectories. We further find that high-order epistasis makes it impossible to predict evolutionary trajectories from the individual and paired effects of mutations. We therefore conclude that high-order epistasis profoundly shapes evolutionary trajectories through genotype-fitness maps.
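
    The decomposition step lends itself to a compact illustration. Below is a minimal sketch, not the authors' code, of one standard way to separate a binary genotype-fitness map into epistatic orders (a Walsh-Hadamard transform) and then remove the high-order terms; the random fitness values and the order-3 cutoff are placeholders.

```python
import numpy as np

L = 5  # five biallelic sites, 2**5 = 32 genotypes
rng = np.random.default_rng(0)
fitness = rng.random(2 ** L)  # stand-in for an experimental genotype-fitness map

# Walsh-Hadamard transform: the order of a coefficient is the number of
# interacting sites, i.e. the popcount of its index.
H = np.array([[(-1) ** (bin(i & j).count("1")) for j in range(2 ** L)]
              for i in range(2 ** L)])
coeffs = H @ fitness / 2 ** L

# "Remove high-order epistasis": zero every coefficient of order >= 3.
order = np.array([bin(i).count("1") for i in range(2 ** L)])
truncated = coeffs.copy()
truncated[order >= 3] = 0.0

fitness_low_order = H @ truncated  # map retaining only 0th-2nd order terms
print(np.abs(fitness - fitness_low_order).max())  # size of high-order part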

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  11. Modelling of pathologies of the nervous system by the example of computational and electronic models of elementary nervous systems

    Energy Technology Data Exchange (ETDEWEB)

    Shumilov, V. N., E-mail: vnshumilov@rambler.ru; Syryamkin, V. I., E-mail: maximus70sir@gmail.com; Syryamkin, M. V., E-mail: maximus70sir@gmail.com [National Research Tomsk State University, 634050, Tomsk, Lenin Avenue, 36 (Russian Federation)

    2015-11-17

    The paper puts forward principles of action of devices operating similarly to the nervous system and the brain of biological systems. We propose an alternative method of studying diseases of the nervous system, which may significantly influence the prevention, medical treatment, or at least retardation of the development of these diseases. This alternative is to use computational and electronic models of the nervous system. Within this approach, we represent the brain in the form of a huge electrical circuit composed of active units, namely, neuron-like units and connections between them. As a result, we created computational and electronic models of elementary nervous systems, based on the principles of functioning of biological nervous systems that we have put forward. Our models demonstrate reactions to external stimuli, and changes in those reactions, similar to the behavior of the simplest biological organisms. The models possess the ability of self-training and retraining in real time, without human intervention and without switching between operation and training modes. In our models, training and memorization take place constantly under the influence of stimuli on the organism. Training proceeds without any interruption or switching of operation modes. Training and the formation of new reflexes occur by means of the formation of new connections between excited neurons, between which the formation of connections is physically possible. Connections are formed without external influence; they are formed under the influence of local causes. A connection is formed between the output and input of two neurons when the difference between the output and input potentials of the excited neurons exceeds a value sufficient to form a new connection. On these grounds, we suggest that the proposed principles truly reflect mechanisms of functioning of biological nervous systems and the brain. In order to confirm the correspondence of the proposed principles to biological nature, we carry out experiments for the study of processes of
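
    The connection-formation rule described above can be illustrated with a small sketch. Everything here (names, the threshold value) is hypothetical, since the abstract gives no concrete parameters; the code only mirrors the stated condition that a link forms between two excited units when the output-input potential difference exceeds a sufficient value.

```python
from dataclasses import dataclass, field

THRESHOLD = 0.5  # assumed value; the paper does not give numbers

@dataclass
class Neuron:
    name: str
    excited: bool
    out_potential: float
    in_potential: float
    inputs: list = field(default_factory=list)

def maybe_connect(pre: Neuron, post: Neuron) -> bool:
    """Form a pre -> post connection if both units are excited and the
    output-input potential difference exceeds the threshold."""
    if pre.excited and post.excited and \
            pre.out_potential - post.in_potential > THRESHOLD:
        post.inputs.append(pre.name)
        return True
    return False

a = Neuron("A", excited=True, out_potential=1.2, in_potential=0.3)
b = Neuron("B", excited=True, out_potential=0.9, in_potential=0.4)
print(maybe_connect(a, b), b.inputs)  # True ['A']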

  12. Modelling of pathologies of the nervous system by the example of computational and electronic models of elementary nervous systems

    Science.gov (United States)

    Shumilov, V. N.; Syryamkin, V. I.; Syryamkin, M. V.

    2015-11-01

    The paper puts forward principles of action of devices operating similarly to the nervous system and the brain of biological systems. We propose an alternative method of studying diseases of the nervous system, which may significantly influence the prevention, medical treatment, or at least retardation of the development of these diseases. This alternative is to use computational and electronic models of the nervous system. Within this approach, we represent the brain in the form of a huge electrical circuit composed of active units, namely, neuron-like units and connections between them. As a result, we created computational and electronic models of elementary nervous systems, based on the principles of functioning of biological nervous systems that we have put forward. Our models demonstrate reactions to external stimuli, and changes in those reactions, similar to the behavior of the simplest biological organisms. The models possess the ability of self-training and retraining in real time, without human intervention and without switching between operation and training modes. In our models, training and memorization take place constantly under the influence of stimuli on the organism. Training proceeds without any interruption or switching of operation modes. Training and the formation of new reflexes occur by means of the formation of new connections between excited neurons, between which the formation of connections is physically possible. Connections are formed without external influence; they are formed under the influence of local causes. A connection is formed between the output and input of two neurons when the difference between the output and input potentials of the excited neurons exceeds a value sufficient to form a new connection. On these grounds, we suggest that the proposed principles truly reflect mechanisms of functioning of biological nervous systems and the brain. In order to confirm the correspondence of the proposed principles to biological nature, we carry out experiments for the study of processes of

  13. Mean-Potential Law in Evolutionary Games

    Science.gov (United States)

    Nałecz-Jawecki, Paweł; Miekisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
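
    The Letter's potential-function construction is not reproduced here; as a point of reference, the sketch below computes the textbook fixation probability of a single mutant in a frequency-dependent Moran process, the quantity about which criteria such as the 1/3 law are stated. The payoff values and selection intensity are illustrative assumptions.

```python
import numpy as np

def fixation_probability(a, b, c, d, N, w=0.05):
    """Fixation probability of a single A mutant among N-1 B residents in a
    frequency-dependent Moran process with payoff matrix [[a, b], [c, d]]."""
    gammas = []
    for j in range(1, N):  # j = current number of A individuals
        fA = 1 - w + w * (a * (j - 1) + b * (N - j)) / (N - 1)
        fB = 1 - w + w * (c * j + d * (N - j - 1)) / (N - 1)
        gammas.append(fB / fA)  # ratio T-(j)/T+(j) of the transition rates
    return 1.0 / (1.0 + np.cumprod(gammas).sum())

# Coordination game with unstable equilibrium x* = (d-b)/(a-b-c+d) = 1/4 < 1/3,
# so by the 1/3 law weak selection should favour A: rho > 1/N.
rho = fixation_probability(a=5, b=1, c=2, d=2, N=100)
print(rho, ">", 1 / 100, "?", rho > 1 / 100)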

  14. Remembering the evolutionary Freud.

    Science.gov (United States)

    Young, Allan

    2006-03-01

    Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded as a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.

  15. Efficiency and limitation of periodic sample multiplication to reduce computational load in Monte Carlo simulations of electron swarms in gas under attachment-dominated conditions

    Science.gov (United States)

    Sugawara, Hirotake

    2018-03-01

    In Monte Carlo simulations of electron swarms, sample electrons were copied periodically so that a sufficient number of samples are obtained in equilibrium after relaxation even under a severe attachment-dominated condition where most electrons vanish during the relaxation. The final sampling results were equivalent to those sampled by a conventional method, and the computational time conventionally wasted for the tracking of vanishing electrons was reduced drastically. The time saved can be utilized for tracking more samples to reduce statistical fluctuation. The efficiency of this technique and its limitation are discussed quantitatively together with details on its implementation.
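
    A rough sketch of the bookkeeping idea, under assumptions not in the abstract (a toy attachment probability and a population threshold): each surviving sample is duplicated and its statistical weight halved, so weighted averages stay unbiased while the sample count stays workable.

```python
import random

def step(electrons, p_attach=0.05):
    """One time step: each sample electron vanishes by attachment with
    probability p_attach (a toy stand-in for the real collision kinetics)."""
    return [e for e in electrons if random.random() > p_attach]

electrons = [{"v": random.gauss(0.0, 1.0), "w": 1.0} for _ in range(1000)]
for t in range(200):
    electrons = step(electrons)
    if 0 < len(electrons) < 500:  # multiplication threshold (assumed)
        # Duplicate every survivor and halve its weight; in a full swarm
        # simulation the copies decorrelate through later random collisions.
        electrons = [dict(e, w=e["w"] / 2) for e in electrons for _ in (0, 1)]

total_w = sum(e["w"] for e in electrons)
mean_v = sum(e["w"] * e["v"] for e in electrons) / total_w
print(len(electrons), round(mean_v, 3))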

  16. Computer simulation of electronic and magnetic properties of ternary chalcopyrites doped with transition metals

    Science.gov (United States)

    Krivosheeva, Anna V.; Shaposhnikov, Victor L.; Borisenko, Victor E.; Arnaud d'Avitaya, François; Lazzari, J.-L.

    2008-07-01

    Electronic and magnetic properties of BeSiAs2 and BeGeAs2 ternary compounds with the chalcopyrite structure, doped with transition metals (Mn, Cr), have been studied theoretically from first principles. The influence of the substitutional positions of impurity atoms and their type on the appearance of a ferromagnetic (FM) or antiferromagnetic (AFM) state has been analyzed. It was found that the magnetic moment of the systems does not depend strongly on the concentration of and distance between impurity atoms, while the most important factors are the impurity type and the substitution sites. Configurations with Mn atoms in the II-group sites are energetically stable in the AFM state, whereas Cr-doped ones appear to be in the FM state. Substitution of IV-group positions by both metals results preferentially in the FM state; however, these positions are not energetically favorable in comparison with II-group ones. The spin polarization of the doped materials is evaluated and their possible application in spintronics is analyzed.

  17. Computer modeling and electron microscopy of silicon surfaces irradiated by cluster ion impacts

    CERN Document Server

    Insepov, Z; Santeufemio, C; Jones, K S; Yamada, I

    2003-01-01

    A hybrid molecular dynamics model has been applied for modeling impacts of Ar and decaborane clusters, with energies ranging from 25 to 1500 eV/atom, impacting Si surfaces. Crater formation, sputtering, and the shapes of craters and rims were studied. Our simulation predicts that on a Si(1 0 0) surface, craters are nearly triangular in cross-section, with the facets directed along the close-packed (1 1 1) planes. The Si(1 0 0) craters exhibit fourfold symmetry. The craters on a Si(1 1 1) surface are well rounded in cross-section and the top view shows a complicated sixfold or triangular image. The simulation results for individual gas cluster impacts were compared with experiments at low dose (10¹⁰ ions/cm² charge fluence) for Ar impacts into Si(1 0 0) and Si(1 1 1) substrate surfaces. Atomic force microscopy and cross-sectional high-resolution transmission electron microscope imaging of individual gas cluster ion impacts into Si(1 0 0) and Si(1 1 1) substrate surfaces revealed faceting properties of t...

  18. Structural and electronic properties of effective p-type doping WS2 monolayers: A computational study

    Science.gov (United States)

    Li, Ning; Liu, Zhengtang; Hu, Shengliang; Wang, Huiqi

    2018-01-01

    Using first-principles calculations within density functional theory, we systematically investigated the structural and electronic properties of metal (Me = Al, Ga, In, Tl, V, Nb, Ta)-doped WS2 monolayers. The impurity states induced by Me substitutional doping are close to the valence band, showing the p-type character of Me-doped WS2 monolayers. Among the Me dopants, the Nb-doped WS2 monolayer has the lowest formation energy and only slight local distortion. Furthermore, the covalent character of the W–S bond in the Nb-doped WS2 monolayer increases compared with the pure WS2 monolayer. It is noteworthy that the direct band gap is still present in (V-Ta)-doped WS2 monolayers, which is very conducive to microelectronic and optoelectronic applications. Therefore, Nb is the appropriate p-type dopant for WS2 monolayers based on the present work. These findings may prove instrumental in the future design of new p-type conductive WS2 monolayers.

  19. Comparison of electron beam computed tomography and exercise electrocardiography in detecting coronary artery disease in the elderly

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, Shinji; Mitsunami, Kenichi; Kinoshita, Masahiko [Shiga Univ. of Medical Science, Otsu (Japan)

    1998-08-01

    Although exercise electrocardiography (ECG) is a useful noninvasive screening test for coronary artery disease (CAD), one prerequisite for ECG screening is that the patient be able to exercise enough to evoke myocardial ischemia. Thus, exercise ECG may not be suitable for some elderly people with CAD who cannot exercise sufficiently. We compared electron beam computed tomography (EBCT) with exercise ECG for detecting CAD in 196 patients (mean age, 58.4 ± 12.5 (standard deviation) years) who had undergone coronary angiography. Using the angiographic findings as the "gold standard", we found that the sensitivity, specificity, positive predictive value, and negative predictive value were 88%, 77%, 89%, and 77%, respectively, for EBCT, and 66%, 72%, 83%, and 52%, respectively, for exercise ECG. Although the results were similar when the subjects were divided into different age groups, the negative predictive value of exercise ECG among older patients was very low. These findings suggest that EBCT is superior to exercise ECG in detecting CAD in the elderly. (author)
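
    The four reported figures follow mechanically from a 2 × 2 confusion matrix. The helper below shows the arithmetic; the counts are hypothetical, chosen only to roughly reproduce the EBCT percentages, since the abstract does not give raw counts.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # diseased correctly flagged
        "specificity": tn / (tn + fp),  # healthy correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for 196 patients, not the study's raw data.
print(diagnostic_metrics(tp=112, fp=15, fn=15, tn=54))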

  20. Web-based computational chemistry education with CHARMMing III: Reduction potentials of electron transfer proteins.

    Directory of Open Access Journals (Sweden)

    B Scott Perrin

    2014-07-01

    A module for the fast determination of reduction potentials, E°, of redox-active proteins has been implemented in the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org). The free energy of reduction, which is proportional to E°, is composed of an intrinsic contribution due to the redox site and an environmental contribution due to the protein and solvent. Here, the intrinsic contribution is selected from a library of pre-calculated density functional theory values for each type of redox site and redox couple, while the environmental contribution is calculated from a crystal structure of the protein using Poisson-Boltzmann continuum electrostatics. An accompanying lesson demonstrates a calculation of E°. In this lesson, an ionizable residue in a [4Fe-4S]-protein that causes a pH-dependent E° is identified, and the E° of a mutant that would test the identification is predicted. This demonstration is valuable both to computational chemistry students and to researchers interested in predicting sequence determinants of E° for mutagenesis.
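
    The described split of the reduction free energy into intrinsic and environmental contributions can be sketched as follows; the Nernst-type conversion E = -ΔG/(nF), the reference-electrode shift, and all numerical values are illustrative assumptions, not values from the CHARMMing library.

```python
FARADAY = 96485.332  # C/mol

def reduction_potential(dG_intrinsic, dG_environment, n=1, reference_shift=0.0):
    """E in volts from the free energy of reduction (J/mol): E = -dG/(nF),
    optionally shifted from the absolute scale to a reference electrode."""
    dG = dG_intrinsic + dG_environment
    return -dG / (n * FARADAY) - reference_shift

# Hypothetical numbers (J/mol): intrinsic term standing in for a DFT library
# value, environmental term for the Poisson-Boltzmann electrostatics; the
# 4.28 V shift is one commonly quoted absolute SHE potential.
print(reduction_potential(-380e3, 25e3, n=1, reference_shift=4.28))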

  1. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    Science.gov (United States)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the probability density functions (PDFs) of RE. Despite the simplification involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time-consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, which can significantly reduce the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under Contract Numbers ERKJ320 and ERAT377.
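
    A minimal sketch of the backward idea for a toy one-dimensional SDE (not the bounce-averaged Fokker-Planck model of the abstract): the probability field is initialized at the terminal condition and stepped backward in time, estimating each conditional expectation from sampled one-step transitions. All parameters are placeholders.

```python
import numpy as np

# Escape probability u(t, x) = P(X_T reaches b before a | X_t = x) for
# dX = mu dt + sigma dW with absorbing boundaries at a and b.
a, b = 0.0, 1.0
mu, sigma = 0.05, 0.3
T, nt = 1.0, 100
dt = T / nt
x = np.linspace(a, b, 201)
u = (x >= b).astype(float)  # terminal condition: already "run away"

rng = np.random.default_rng(1)
M = 200  # one-step samples per grid node
for _ in range(nt):
    xi = rng.standard_normal((M, 1))
    x_next = x + mu * dt + sigma * np.sqrt(dt) * xi  # one Euler step
    x_next = np.clip(x_next, a, b)                   # absorb at boundaries
    u = np.interp(x_next, x, u).mean(axis=0)         # E[u(t+dt, X_next) | x]
    u[0], u[-1] = 0.0, 1.0                           # keep boundaries pinned

print(u[len(x) // 2])  # escape probability starting from the midpoint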

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been running at a lower level as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  4. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  5. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions The Heavy Ions group has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  9. Computational Biophysical, Biochemical, and Evolutionary Signature of Human R-Spondin Family Proteins, the Member of Canonical Wnt/β-Catenin Signaling Pathway

    Directory of Open Access Journals (Sweden)

    Ashish Ranjan Sharma

    2014-01-01

    In humans, the Wnt/β-catenin signaling pathway plays a significant role in cell growth, cell development, and disease pathogenesis. Four human R-spondins (Rspos) are known to activate the canonical Wnt/β-catenin signaling pathway. Presently, Rspos serve as therapeutic targets for several human diseases, and so a basic understanding of the molecular properties of Rspos is essential. We approached this issue by interpreting the biochemical and biophysical properties, along with the molecular evolution, of Rspos through computational methods. Our analysis shows that the signal peptide length is roughly similar across the Rspo family, along with a similar amino acid distribution pattern. In Rspo3, four N-glycosylation sites were noted. All members are hydrophilic in nature and showed approximately similar GRAVY values. Conversely, Rspo3 contains the most positively charged residues while Rspo4 contains the fewest. Four highly aligned blocks were recorded through Gblocks. Phylogenetic analysis shows that Rspo4 is rooted with Rspo2, and similarly Rspo3 and Rspo1 have a common point of origin. Through a phylogenomics study, we developed a phylogenetic tree of sixty proteins (n = 60) with the ortholog and paralog seed sequences. A protein-protein interaction network was also illustrated. The results demonstrated in our study may help future researchers unfold significant physiological and therapeutic properties of Rspos in various disease models.
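
    GRAVY itself is a simple quantity; a minimal sketch follows, using the standard Kyte-Doolittle hydropathy scale and a toy peptide rather than an actual R-spondin sequence.

```python
# Kyte-Doolittle hydropathy values; GRAVY is their mean over the sequence.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def gravy(seq: str) -> float:
    """Grand average of hydropathy; negative values indicate hydrophilicity."""
    return sum(KD[aa] for aa in seq.upper()) / len(seq)

print(gravy("MKTAYIAKQR"))  # toy peptide, not an actual R-spondin sequence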

  10. A P300-based brain-computer interface aimed at operating electronic devices at home for severely disabled people.

    Science.gov (United States)

    Corralejo, Rebeca; Nicolás-Alonso, Luis F; Alvarez, Daniel; Hornero, Roberto

    2014-10-01

    The present study aims at developing and assessing an assistive tool for operating electronic devices at home by means of a P300-based brain-computer interface (BCI). Fifteen severely impaired subjects participated in the study. The developed tool allows users to interact with their usual environment fulfilling their main needs. It allows for navigation through ten menus and to manage up to 113 control commands from eight electronic devices. Ten out of the fifteen subjects were able to operate the proposed tool with accuracy above 77 %. Eight out of them reached accuracies higher than 95 %. Moreover, bitrates up to 20.1 bit/min were achieved. The novelty of this study lies in the use of an environment control application in a real scenario: real devices managed by potential BCI end-users. Although impaired users might not be able to set up this system without aid of others, this study takes a significant step to evaluate the degree to which such populations could eventually operate a stand-alone system. Our results suggest that neither the type nor the degree of disability is a relevant issue to suitably operate a P300-based BCI. Hence, it could be useful to assist disabled people at home improving their personal autonomy.
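
    The abstract reports accuracies and bitrates; assuming the commonly used Wolpaw definition of information transfer rate (the abstract does not state which definition was applied), the conversion looks like this. The operating point below is hypothetical.

```python
import math

def wolpaw_bitrate(accuracy: float, n_choices: int, selections_per_min: float) -> float:
    """Information transfer rate (bit/min) under the Wolpaw definition;
    valid for 0 < accuracy < 1 and n_choices > 1."""
    p, n = accuracy, n_choices
    bits = math.log2(n) + p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min

# Hypothetical operating point: 113 commands, 95% accuracy, 4 selections/min.
print(wolpaw_bitrate(0.95, 113, 4.0))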

  11. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, where the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. It is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that out of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can be finally recovered from the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option to shredding. However, it remains a technological and logistic challenge for the existing system.

  12. Proposal for an ad hoc computer network in the military electronic systems department at the military academy applying bluetooth technology

    Directory of Open Access Journals (Sweden)

    Miroslav R. Terzić

    2011-01-01

    The historical development of the Bluetooth module is given in the introduction of this paper. The importance of the Bluetooth standard for wireless connections over small distances is shown as well. The organization of the Department of Military Electronic Systems is presented with its area of duties, subordinate sections and deployment. The concept of a local area network for this Department, using Bluetooth technology, includes the network topology and working regimes based on the main characteristics and technical specifications of Bluetooth connections. The Department's dispersed computer network is proposed as a scatternet in which one piconet includes the Head of Department and the Heads of Sections, while the other piconets are formed from the Heads of Sections and their subordinates. The security aspect of the presented network deals with basic computer network attack categories, protection methods and aspects. The paper concludes with some recommendations for a local area network using Bluetooth technology with respect to its economic and security aspects as well as the managing principles of the Department.
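
    The proposed two-level scatternet can be sketched as a small data structure; the names and section composition are invented for illustration, and a real deployment must also respect Bluetooth's limit of seven active slaves per piconet.

```python
# Hypothetical sketch of the proposed scatternet: one piconet links the Head
# of Department (master) to the Heads of Sections; each Head of Section is in
# turn master of a piconet of subordinates, acting as a bridge node.
department = {
    "head": "Head of Department",
    "sections": {
        "Section 1": ["Op1", "Op2", "Op3"],
        "Section 2": ["Op4", "Op5"],
    },
}

def build_scatternet(dept):
    piconets = [{"master": dept["head"], "slaves": list(dept["sections"])}]
    for section_head, members in dept["sections"].items():
        piconets.append({"master": section_head, "slaves": members})
    return piconets

for piconet in build_scatternet(department):
    print(piconet["master"], "->", piconet["slaves"])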

  13. Experiments and Computational Theory for Electrical Breakdown in Critical Components: THz Imaging of Electronic Plasmas.

    Energy Technology Data Exchange (ETDEWEB)

    Zutavern, Fred J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hjalmarson, Harold P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bigman, Verle Howard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gallegos, Richard Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-11-01

    This report describes the development of ultra-short pulse laser (USPL) induced terahertz (THz) radiation to image electronic plasmas during electrical breakdown. The technique uses three pulses from two USPLs to (1) trigger the breakdown, (2) create a 2 picosecond (ps, 10⁻¹² s) THz pulse to illuminate the breakdown, and (3) record the THz image of the breakdown. During this three-year internal research program, sub-picosecond jitter timing for the lasers, THz generation, high bandwidth (BW) diagnostics, and THz image acquisition were demonstrated. High intensity THz radiation was optically induced in a pulse-charged gallium arsenide photoconductive switch. The radiation was collected, transported, concentrated, and co-propagated through an electro-optic crystal with an 800 nm USPL pulse whose polarization was rotated due to the spatially varying electric field of the THz image. The polarization-modulated USPL pulse was then passed through a polarizer and the resulting spatially varying intensity was detected in a high resolution digital camera. Single shot images had a signal to noise of ~3:1. Signal to noise was improved to ~30:1 with several experimental techniques and by averaging the THz images from ~4000 laser pulses internally and externally with the camera and the acquisition system (40 pulses per readout). THz shadows of metallic films and objects were also recorded with this system to demonstrate free-carrier absorption of the THz radiation and improve image contrast and resolution. These 2 ps THz pulses were created and resolved with 100 femtosecond (fs, 10⁻¹⁵ s) long USPL pulses. Thus this technology has the capability to time-resolve extremely fast repetitive or single shot phenomena, such as those that occur during the initiation of electrical breakdown. The goal of imaging electrical breakdown was not reached during this three-year project. However, plans to achieve this goal as part of a follow-on project are described in this document.

  14. Evolutionary Biology Today

    Indian Academy of Sciences (India)

    Evolutionary Biology Today – The Domain of Evolutionary Biology. Amitabh Joshi. Series Article, Resonance – Journal of Science Education, Volume 7, Issue 11, November 2002, pp. 8-17.

  15. Evolutionary Biology Today

    Indian Academy of Sciences (India)

    Amitabh Joshi studies and teaches evolutionary genetics and population ecology at the Jawaharlal Nehru Centre for Advanced Scientific Research, Bangalore. His current research interests are in life-history evolution, the evolutionary genetics of biological clocks, and the evolution of ecological specialization.

  16. Evolutionary Biology Today

    Indian Academy of Sciences (India)

    Evolutionary Biology Today – What do Evolutionary Biologists do? Amitabh Joshi. Series Article, Resonance – Journal of Science Education, Volume 8, Issue 2, February 2003, pp. 6-18.

  17. Applying evolutionary anthropology

    OpenAIRE

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also h...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, in addition to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. An evolutionary model with Turing machines

    CERN Document Server

    Feverati, Giovanni

    2007-01-01

    The development of a large non-coding fraction in eukaryotic DNA and the phenomenon of code bloat in the field of evolutionary computation show a striking similarity. This seems to suggest that (in the presence of mechanisms of code growth) the evolution of a complex code cannot be attained without maintaining a large inactive fraction. To test this hypothesis we performed computer simulations of an evolutionary toy model for Turing machines, studying the relation between fitness and the coding/non-coding ratio while varying the mutation and code growth rates. The results suggest that, in our model, having a large reservoir of non-coding states constitutes a great (long term) evolutionary advantage.
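
    A drastically simplified sketch of such a toy model's code-growth dynamics, under assumptions not in the abstract (one next-state pointer per state instead of a full Turing-machine transition table; invented mutation rates): the non-coding fraction is the set of states unreachable from the start state.

```python
import random
random.seed(3)

# Toy genome: each "state" is [write, move, next_state]; states unreachable
# from state 0 form the non-coding fraction.
def random_state(n):
    return [random.randint(0, 1), random.choice([-1, 1]), random.randrange(n)]

def mutate(genome, p_grow=0.3):
    if random.random() < p_grow:             # code growth: append a state
        genome.append(random_state(len(genome) + 1))
    s = random.randrange(len(genome))        # point mutation: rewire a state
    genome[s] = random_state(len(genome))
    return genome

def coding_fraction(genome):
    seen, stack = set(), [0]
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.append(genome[s][2])       # follow the next-state pointer
    return len(seen) / len(genome)

genome = [random_state(1)]
for _ in range(200):
    genome = mutate(genome)
print(len(genome), round(coding_fraction(genome), 2))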

  20. The evolutionary dynamics of language.

    Science.gov (United States)

    Steels, Luc; Szathmáry, Eörs

    2018-02-01

    The well-established framework of evolutionary dynamics can be applied to the fascinating open problems of how human brains are able to acquire and adapt language and how languages change in a population. Schemas for handling grammatical constructions are the replicating unit. They emerge and multiply with variation in the brains of individuals and undergo selection based on their contribution to needed expressive power, communicative success and the reduction of cognitive effort. Adopting this perspective has two major benefits. (i) It builds a bridge to neurobiological models of the brain that have also adopted an evolutionary dynamics point of view, thus opening a new horizon for studying how human brains achieve the remarkably complex competence for language. And (ii) it suggests a new foundation for studying cultural language change as an evolutionary dynamics process. The paper sketches this novel perspective, provides references to empirical data and computational experiments, and points to open problems. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  1. The DEPOSIT computer code: Calculations of electron-loss cross-sections for complex ions colliding with neutral atoms

    Science.gov (United States)

    Litsarev, Mikhail S.

    2013-02-01

    A description of the DEPOSIT computer code is presented. The code is intended to calculate total and m-fold electron-loss cross-sections (m is the number of ionized electrons) and the energy T(b) deposited to the projectile (positive or negative ion) during a collision with a neutral atom at low and intermediate collision energies, as a function of the impact parameter b. The deposited energy is calculated as a 3D integral over the projectile coordinate space in the classical energy-deposition model. Examples of the calculated deposited energies, ionization probabilities and electron-loss cross-sections are given, as well as a description of the input and output data.
    Program summary
    Program title: DEPOSIT
    Catalogue identifier: AENP_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENP_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 8726
    No. of bytes in distributed program, including test data, etc.: 126650
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any computer that can run a C++ compiler.
    Operating system: Any operating system that can run C++.
    Has the code been vectorised or parallelized?: An MPI version is included in the distribution.
    Classification: 2.4, 2.6, 4.10, 4.11
    Nature of problem: For a given impact parameter b, calculate the deposited energy T(b) as a 3D integral over coordinate space, and the ionization probabilities Pm(b). For a given energy, calculate the total and m-fold electron-loss cross-sections using the T(b) values.
    Solution method: Direct calculation of the 3D integral T(b). A one-dimensional quadrature formula of the highest accuracy, based upon the nodes of the Jacobi polynomials, is applied for the cos θ = x ∈ [-1, 1] angular variable. The Simpson rule is used for the φ ∈ [0, 2π] angular variable. The Newton-Cotes pattern of the seventh order
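
    The angular part of the solution method can be illustrated in a few lines. The sketch below uses Gauss-Legendre nodes (the w(x) = 1 special case of Gauss-Jacobi quadrature) in x = cos θ together with a Simpson rule in φ; it is a stand-in for the idea, not code from DEPOSIT.

```python
import numpy as np

def angular_integral(f, n_x=32, n_phi=65):
    """Integrate f(x, phi) over x = cos(theta) in [-1, 1] (Gauss-Legendre)
    and phi in [0, 2*pi] (composite Simpson; n_phi must be odd)."""
    x, wx = np.polynomial.legendre.leggauss(n_x)
    phi = np.linspace(0.0, 2 * np.pi, n_phi)
    wphi = np.full(n_phi, 2.0)
    wphi[1::2] = 4.0
    wphi[0] = wphi[-1] = 1.0
    wphi *= (phi[1] - phi[0]) / 3.0
    X, PHI = np.meshgrid(x, phi, indexing="ij")
    return np.einsum("i,j,ij->", wx, wphi, f(X, PHI))

# Test on f = 1: the solid-angle integral should equal 4*pi.
print(angular_integral(lambda x, phi: np.ones_like(x)), 4 * np.pi)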

  2. A Fast Evolutionary Metaheuristic for VRP with Time Windows

    NARCIS (Netherlands)

    Bräysy, Olli; Dullaert, Wout

    2003-01-01

    This paper presents a new evolutionary metaheuristic for the vehicle routing problem with time windows. Ideas from multi-start local search, ejection chains, simulated annealing and evolutionary computation are combined in a heuristic that is both robust and efficient. The proposed method produces

  3. Scanning Electron Microscopy Analysis of the Adaptation of Single-Unit Screw-Retained Computer-Aided Design/Computer-Aided Manufacture Abutments After Mechanical Cycling.

    Science.gov (United States)

    Markarian, Roberto Adrian; Galles, Deborah Pedroso; Gomes França, Fabiana Mantovani

    2017-06-20

    To measure the microgap between dental implants and custom abutments fabricated using different computer-aided design/computer-aided manufacture (CAD/CAM) methods before and after mechanical cycling. CAD software (Dental System, 3Shape) was used to design a custom abutment for a single-unit, screw-retained crown compatible with a 4.1-mm external hexagon dental implant. The resulting stereolithography file was sent for manufacturing using four CAD/CAM methods (n = 40): milling and sintering of zirconium dioxide (ZO group), cobalt-chromium (Co-Cr) sintered via selective laser melting (SLM group), fully sintered machined Co-Cr alloy (MM group), and machined and sintered agglutinated Co-Cr alloy powder (AM group). Prefabricated titanium abutments (TI group) were used as controls. Each abutment was placed on a dental implant measuring 4.1 × 11 mm (SA411, SIN) inserted into an aluminum block. Measurements were taken using scanning electron microscopy (SEM) (×4,000) on four regions of the implant-abutment interface (IAI) and at a relative distance of 90 degrees from each other. The specimens were mechanically aged (1 million cycles, 2 Hz, 100 N, 37°C) and the IAI width was measured again using the same approach. Data were analyzed using two-way analysis of variance, followed by the Tukey test. After mechanical cycling, the best adaptation results were obtained from the TI (2.29 ± 1.13 μm), AM (3.58 ± 1.80 μm), and MM (1.89 ± 0.98 μm) groups. A significantly worse adaptation outcome was observed for the SLM (18.40 ± 20.78 μm) and ZO (10.42 ± 0.80 μm) groups. Mechanical cycling had a marked effect only on the AM specimens, which significantly increased the microgap at the IAI. Custom abutments fabricated using fully sintered machined Co-Cr alloy and machined and sintered agglutinated Co-Cr alloy powder demonstrated the best adaptation results at the IAI, similar to those obtained with commercial prefabricated titanium abutments after mechanical cycling. The

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  5. QDist—Quartet Distance Between Evolutionary Trees

    DEFF Research Database (Denmark)

    Mailund; Pedersen, Christian N. Storm

    2004-01-01

    QDist is a program for computing the quartet distance between two unrooted evolutionary trees, i.e. the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. The implementation is based on an algorithm with running time O(n log² n), which makes it practical to compare large trees.
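
    For intuition, a brute-force reference implementation is easy to write: resolve each quartet in each tree via the four-point condition on path lengths and count disagreements. This is O(n⁴); QDist's contribution is doing the same count in O(n log² n). The toy trees below are assumptions for illustration.

```python
from itertools import combinations
from collections import deque

def path_lengths(tree, leaves):
    """All-pairs leaf distances (edge counts) via BFS; the unrooted tree is
    given as an adjacency dict over leaf and internal node labels."""
    dist = {}
    for src in leaves:
        d, q = {src: 0}, deque([src])
        while q:
            v = q.popleft()
            for w in tree[v]:
                if w not in d:
                    d[w] = d[v] + 1
                    q.append(w)
        dist[src] = d
    return dist

def quartet_topology(d, a, b, c, e):
    """Resolve the quartet by the four-point condition: the true split has
    the smallest sum of within-pair distances."""
    sums = {frozenset([frozenset([a, b]), frozenset([c, e])]): d[a][b] + d[c][e],
            frozenset([frozenset([a, c]), frozenset([b, e])]): d[a][c] + d[b][e],
            frozenset([frozenset([a, e]), frozenset([b, c])]): d[a][e] + d[b][c]}
    return min(sums, key=sums.get)

def quartet_distance(t1, t2, leaves):
    d1, d2 = path_lengths(t1, leaves), path_lengths(t2, leaves)
    return sum(quartet_topology(d1, *q) != quartet_topology(d2, *q)
               for q in combinations(leaves, 4))

# Two 5-leaf unrooted trees that differ by swapping leaves b and c.
t1 = {"a": ["x"], "b": ["x"], "c": ["y"], "d": ["z"], "e": ["z"],
      "x": ["a", "b", "y"], "y": ["x", "c", "z"], "z": ["y", "d", "e"]}
t2 = {"a": ["x"], "c": ["x"], "b": ["y"], "d": ["z"], "e": ["z"],
      "x": ["a", "c", "y"], "y": ["x", "b", "z"], "z": ["y", "d", "e"]}
print(quartet_distance(t1, t2, "abcde"))  # 2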

  6. Incorporating evolutionary processes into population viability models

    NARCIS (Netherlands)

    Pierson, J.C.; Beissinger, S.R.; Bragg, J.G.; Coates, D.J.; Oostermeijer, J.G.B.; Sunnucks, P.; Schumaker, N.H.; Trotter, M.V.; Young, A.G.

    2015-01-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand

  7. Pulmonary CT image classification with evolutionary programming.

    Science.gov (United States)

    Madsen, M T; Uppaluri, R; Hoffman, E A; McLennan, G

    1999-12-01

    It is often difficult to classify information in medical images from derived features. The purpose of this research was to investigate the use of evolutionary programming as a tool for selecting important features and generating algorithms to classify computed tomographic (CT) images of the lung. Training and test sets consisting of 17 features derived from multiple lung CT images were generated, along with an indicator of the target area from which the features originated. The features included five parameters based on histogram analysis, 11 parameters based on run length and co-occurrence matrix measures, and the fractal dimension. Two classification experiments were performed. In the first, the classification task was to distinguish between the subtle but known differences between anterior and posterior portions of transverse lung CT sections. The second classification task was to distinguish normal lung CT images from emphysematous images. The performance of the evolutionary programming approach was compared with that of three statistical classifiers that used the same training and test sets. Evolutionary programming produced solutions that compared favorably with those of the statistical classifiers. In separating the anterior from the posterior lung sections, the evolutionary programming results were better than two of the three statistical approaches. The evolutionary programming approach correctly identified all the normal and abnormal lung images and accomplished this using fewer features than the best statistical method. The results of this study demonstrate the utility of evolutionary programming as a tool for developing classification algorithms.
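
    A toy version of the approach, with invented data and a deliberately crude classifier: evolutionary programming here means mutation-only evolution of feature subsets, each parent competing against its mutated offspring. It illustrates the feature-selection idea only, not the paper's actual features or fitness function.

```python
import random
random.seed(7)

# Toy data: 40 samples x 11 features; labels depend on features 2 and 5 only.
X = [[random.gauss(0, 1) for _ in range(11)] for _ in range(40)]
y = [1 if x[2] + x[5] > 0 else 0 for x in X]

def accuracy(mask):
    """Classify by the sign of the sum of the selected features."""
    preds = [1 if sum(x[i] for i in range(11) if mask[i]) > 0 else 0 for x in X]
    return sum(p == t for p, t in zip(preds, y)) / len(y)

def mutate(mask):
    child = mask[:]
    child[random.randrange(11)] ^= 1  # flip one feature in or out
    return child

# Evolutionary programming: mutate each parent, keep the fitter of the two.
pop = [[random.randint(0, 1) for _ in range(11)] for _ in range(20)]
for gen in range(100):
    pop = [max((m, mutate(m)), key=accuracy) for m in pop]

best = max(pop, key=accuracy)
print(best, accuracy(best))  # the mask should converge toward features 2 and 5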

  8. Theoretical and computational analysis of the membrane potential generated by cytochrome c oxidase upon single electron injection into the enzyme.

    Science.gov (United States)

    Sugitani, Ryogo; Medvedev, Emile S; Stuchebrukhov, Alexei A

    2008-09-01

    We have developed theory and a computational scheme for the analysis of the kinetics of the membrane potential generated by cytochrome c oxidase upon single electron injection into the enzyme. The theory allows one to connect the charge motions inside the enzyme to the membrane potential observed in the experiments by using data from the "dielectric topography" map of the enzyme that we have created. The developed theory is applied to the analysis of the potentiometric data recently reported by the Wikström group [I. Belevich, D.A. Bloch, N. Belevich, M. Wikström and M.I. Verkhovsky, Exploring the proton pump mechanism of cytochrome c oxidase in real time, Proc. Natl. Acad. Sci. U. S. A. 104 (2007) 2685-2690] on the O to E transition in Paracoccus denitrificans oxidase. Our analysis suggests that the electron transfer to the binuclear center is coupled to a proton transfer (proton loading) to a group just "above" the binuclear center of the enzyme, from which the pumped proton is subsequently expelled by the chemical proton arriving at the binuclear center. The identity of the pump site could not be determined with certainty, but it could be localized to the group of residues His326 (His291 in bovine), the propionates of heme a(3), Arg 473/474, and Trp164. The analysis also suggests that the dielectric distance from the P-side to Fe a is 0.4 or larger. The difficulties and pitfalls of quantitative interpretation of potentiometric data are discussed.

  9. Computational simulation of the effects of oxygen on the electronic states of hydrogenated 3C-porous SiC.

    Science.gov (United States)

    Trejo, Alejandro; Calvino, Marbella; Ramos, Estrella; Cruz-Irisson, Miguel

    2012-08-22

    A computational study of the dependence of the electronic band structure and density of states on the chemical surface passivation of cubic porous silicon carbide (pSiC) was performed using ab initio density functional theory and the supercell method. The effects of the porosity and the surface chemistry composition on the energetic stability of pSiC were also investigated. The porous structures were modeled by removing atoms in the [001] direction to produce two different surface chemistries: one fully composed of silicon atoms and one composed of only carbon atoms. The changes in the electronic states of the porous structures as a function of the oxygen (O) content at the surface were studied. Specifically, the oxygen content was increased by replacing pairs of hydrogen (H) atoms on the pore surface with O atoms attached to the surface via either a double bond (X = O) or a bridge bond (X-O-X, X = Si or C). The calculations show that for the fully H-passivated surfaces, the forbidden energy band is larger for the C-rich phase than for the Si-rich phase. For the partially oxygenated Si-rich surfaces, the band gap behavior depends on the O bond type. The energy gap increases as the number of O atoms increases in the supercell if the O atoms are bridge-bonded, whereas the band gap energy does not exhibit a clear trend if O is double-bonded to the surface. In all cases, the gradual oxygenation decreases the band gap of the C-rich surface due to the presence of trap-like states.

  10. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter distinguished by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structure and try to highlight the role and the emergence of network structure in such systems.

  11. Molluscan Evolutionary Genomics

    Energy Technology Data Exchange (ETDEWEB)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  12. Eco-evolutionary feedbacks, adaptive dynamics and evolutionary rescue theory.

    Science.gov (United States)

    Ferriere, Regis; Legendre, Stéphane

    2013-01-19

    Adaptive dynamics theory has been devised to account for feedbacks between ecological and evolutionary processes. Doing so opens new dimensions to and raises new challenges about evolutionary rescue. Adaptive dynamics theory predicts that successive trait substitutions driven by eco-evolutionary feedbacks can gradually erode population size or growth rate, thus potentially raising the extinction risk. Even a single trait substitution can suffice to degrade population viability drastically at once and cause 'evolutionary suicide'. In a changing environment, a population may track a viable evolutionary attractor that leads to evolutionary suicide, a phenomenon called 'evolutionary trapping'. Evolutionary trapping and suicide are commonly observed in adaptive dynamics models in which the smooth variation of traits causes catastrophic changes in ecological state. In the face of trapping and suicide, evolutionary rescue requires that the population overcome evolutionary threats generated by the adaptive process itself. Evolutionary repellors play an important role in determining how variation in environmental conditions correlates with the occurrence of evolutionary trapping and suicide, and what evolutionary pathways rescue may follow. In contrast with standard predictions of evolutionary rescue theory, low genetic variation may attenuate the threat of evolutionary suicide and small population sizes may facilitate escape from evolutionary traps.

  13. Evolutionary Mechanisms for Loneliness

    Science.gov (United States)

    Cacioppo, John T.; Cacioppo, Stephanie; Boomsma, Dorret I.

    2013-01-01

    Robert Weiss (1973) conceptualized loneliness as perceived social isolation, which he described as a gnawing, chronic disease without redeeming features. On the scale of everyday life, it is understandable how something as personally aversive as loneliness could be regarded as a blight on human existence. However, evolutionary time and evolutionary forces operate at such a different scale of organization than we experience in everyday life that personal experience is not sufficient to understand the role of loneliness in human existence. Research over the past decade suggests a very different view of loneliness than suggested by personal experience, one in which loneliness serves a variety of adaptive functions in specific habitats. We review evidence on the heritability of loneliness and outline an evolutionary theory of loneliness, with an emphasis on its potential adaptive value in an evolutionary timescale. PMID:24067110

  14. Evolutionary behavioral genetics.

    Science.gov (United States)

    Zietsch, Brendan P; de Candia, Teresa R; Keller, Matthew C

    2015-04-01

    We describe the scientific enterprise at the intersection of evolutionary psychology and behavioral genetics-a field that could be termed Evolutionary Behavioral Genetics-and how modern genetic data is revolutionizing our ability to test questions in this field. We first explain how genetically informative data and designs can be used to investigate questions about the evolution of human behavior, and describe some of the findings arising from these approaches. Second, we explain how evolutionary theory can be applied to the investigation of behavioral genetic variation. We give examples of how new data and methods provide insight into the genetic architecture of behavioral variation and what this tells us about the evolutionary processes that acted on the underlying causal genetic variants.

  15. Marine mammals: evolutionary biology

    National Research Council Canada - National Science Library

    Berta, Annalisa; Sumich, James L; Kovacs, Kit M

    2015-01-01

    The third edition of Marine Mammals: Evolutionary Biology provides a comprehensive and current assessment of the diversity, evolution, and biology of marine mammals, while highlighting the latest tools and techniques for their study...

  16. Genomes, Phylogeny, and Evolutionary Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Monica

    2005-03-25

    With the completion of the human genome and the growing number of diverse genomes being sequenced, a new age of evolutionary research is currently taking shape. The myriad of technological breakthroughs in biology that are leading to the unification of broad scientific fields such as molecular biology, biochemistry, physics, mathematics and computer science are now known as systems biology. Here I present an overview, with an emphasis on eukaryotes, of how the postgenomics era is adopting comparative approaches that go beyond comparisons among model organisms to shape the nascent field of evolutionary systems biology.

  17. An evolutionary approach to military history

    Directory of Open Access Journals (Sweden)

    Xabier Rubio Campillo

    2014-01-01

    Full Text Available This paper provides a new way of analysing the concept of change within the field of military history. The proposal is based on the use of complex adaptive systems and evolutionary theory. We introduce the concepts of selection, adaptation and coevolution to explain how war is managed in different societies, and game theory to explore decision-making processes of commanders. We emphasize the value of integrating formal modeling and computational simulations in order to apply the approach to real case studies. Our conclusions outline the advantages of an evolutionary military history in the difficult task of understanding the causes of transformation in past battlefields and armies.

  18. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks; Diseno de una herramienta de computo para la espectrometria y dosimetria de neutrones por medio de redes neuronales evolutivas

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Martinez B, M. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, Jose Gutierrez Abascal No. 2, E-28006 Madrid (Spain)], e-mail: morvymmyahoo@com.mx

    2009-10-15

    Neutron dosimetry is one of the most complicated tasks in radiation protection, because it is a complex technique that is highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry, the Bonner sphere spectrometer, remains one of the most commonly used. This system has disadvantages: the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction that requires an expert user and a reconstruction code such as BUNKI or SAND. These codes are based on iterative reconstruction algorithms whose greatest inconvenience is that the system must be supplied with an initial spectrum as close as possible to the spectrum being sought. Consequently, researchers have noted the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternatives, several reconstruction procedures based on artificial intelligence techniques have been reported, such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks trained with genetic algorithms. However, the use of these techniques in nuclear science is not free of problems, and it has been suggested that more research be conducted to resolve these disadvantages. Because they are emerging technologies, there are no tools for analyzing the results, so in this paper we first present the design of a computation tool that allows the analysis of neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. The tool provides a friendly, intuitive, easy-to-operate graphical environment. It executes the analysis in a few seconds, and the obtained information can be stored and printed.
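
    The hybrid the abstract refers to, evolutionary artificial neural networks trained with genetic algorithms, can be illustrated with a toy sketch: a GA evolves the weights of a tiny network instead of backpropagation. The network size, target function, and parameters below are invented stand-ins, not the authors' spectrum-unfolding setup.

```python
import math
import random

random.seed(5)

# Tiny network: 1 input -> 4 hidden tanh units -> 1 output, 13 weights.
def net(w, x):
    h = [math.tanh(w[i] * x + w[4 + i]) for i in range(4)]
    return sum(w[8 + i] * h[i] for i in range(4)) + w[12]

# Toy regression target standing in for a detector-response mapping.
TARGET = [(x / 10, math.sin(x / 10 * math.pi)) for x in range(11)]

def mse(w):
    return sum((net(w, x) - y) ** 2 for x, y in TARGET) / len(TARGET)

def mutate(w):
    return [wi + random.gauss(0, 0.1) for wi in w]

def crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

# The GA evolves weight vectors directly; no gradients are used.
pop = [[random.gauss(0, 1) for _ in range(13)] for _ in range(40)]
for _ in range(300):
    pop.sort(key=mse)
    elite = pop[:10]                       # truncation selection
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(30)]

print("best MSE:", mse(min(pop, key=mse)))
```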

  19. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are respectively employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theory, and MP2 calculations are performed for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on one single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The SpecFP2000 floating-point benchmark marks show trends similar to the GAUSSIAN 98 results.

  20. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    … the cost of a neighbourhood of schedules. The scheduling algorithm attempts to find a small set of schedules with an acceptable level of performance. The approach is demonstrated to significantly improve the robustness and flexibility of the schedules while at the same time producing schedules with a low …

  1. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
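
    As a taste of the digital signal processing side of the field, here is a minimal sketch of additive synthesis with only the Python standard library: three sine partials are summed and written as a 16-bit WAV file. The frequencies, amplitudes, and file name are illustrative choices.

```python
import math
import struct
import wave

RATE = 44100  # samples per second

def sine_wave(freq, secs, amp=0.4):
    n = int(RATE * secs)
    return [amp * math.sin(2 * math.pi * freq * t / RATE) for t in range(n)]

# A 440 Hz tone with two harmonics -- the simplest additive synthesis.
tone = [a + 0.5 * b + 0.25 * c for a, b, c in
        zip(sine_wave(440, 1.0), sine_wave(880, 1.0), sine_wave(1320, 1.0))]

with wave.open("tone.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in tone))
```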

  2. Using technology to support investigations in the electronic age: tracking hackers to large scale international computer fraud

    Science.gov (United States)

    McFall, Steve

    1994-03-01

    With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities for examining computer storage media, expert systems capable of finding criminal activity in large amounts of data, and methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is a topic requiring more research and development.

  3. A framework for evolutionary systems biology.

    Science.gov (United States)

    Loewe, Laurence

    2009-02-24

    Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However, experimental methods are too insensitive to observe small effects. Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications.

  4. A framework for evolutionary systems biology

    Directory of Open Access Journals (Sweden)

    Loewe Laurence

    2009-02-01

    Full Text Available Abstract Background Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However, experimental methods are too insensitive to observe small effects. Results Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. Conclusion EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications.

  5. Proteomics in evolutionary ecology.

    Science.gov (United States)

    Baer, B; Millar, A H

    2016-03-01

    Evolutionary ecologists are traditionally gene-focused, as genes propagate phenotypic traits across generations, and mutations and recombination in the DNA generate the genetic diversity required for evolutionary processes. As a consequence, the inheritance of changed DNA provides a molecular explanation for the functional changes associated with natural selection. A direct focus on proteins, on the other hand, the actual molecular agents responsible for the expression of a phenotypic trait, receives far less interest from ecologists and evolutionary biologists. This is partially due to the central dogma of molecular biology, which appears to define proteins as the 'dead-end of molecular information flow', as well as to technical limitations in identifying and studying proteins and their diversity in the field and in many of the more exotic genera often favored in ecological studies. Here we provide an overview of a newly forming field of research that we refer to as 'Evolutionary Proteomics'. We point out that the origins of cellular function are related to the properties of polypeptide and RNA and their interactions with the environment, rather than DNA descent, and that the critical role of horizontal gene transfer in evolution is more about coopting new proteins to impact cellular processes than it is about modifying gene function. Furthermore, post-transcriptional and post-translational processes generate a remarkable diversity of mature proteins from a single gene, and the properties of these mature proteins can also influence inheritance through genetic and perhaps epigenetic mechanisms. The influence of post-transcriptional diversification on evolutionary processes could provide a novel mechanistic underpinning for elements of rapid, directed evolutionary changes and adaptations as observed for a variety of evolutionary processes. Modern state-of-the-art technologies based on mass spectrometry are now available to identify and quantify peptides, proteins, protein

  6. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
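
    A toy, single-objective sketch of the multifactorial mechanism described here (the paper extends it to multiobjective problems): one population evolves on a unified search space, each individual is evaluated only on the task given by its skill factor, and cross-task mating with probability RMP transfers genetic material between tasks. The two tasks and all parameters below are invented for illustration.

```python
import random

random.seed(0)
DIM, POP, GENS, RMP = 6, 40, 60, 0.3

# Two toy tasks sharing one unified search space [0, 1]^DIM.
def task_a(x): return sum((xi - 0.3) ** 2 for xi in x)
def task_b(x): return sum((xi - 0.7) ** 2 for xi in x)
TASKS = [task_a, task_b]

def make(genes, skill):
    # Each individual is evaluated only on the task of its skill factor.
    return {"genes": genes, "skill": skill, "cost": TASKS[skill](genes)}

def crossover(a, b):
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def mutate(genes):
    return [min(1.0, max(0.0, g + random.gauss(0, 0.05))) for g in genes]

pop = [make([random.random() for _ in range(DIM)], i % 2) for i in range(POP)]
for _ in range(GENS):
    kids = []
    while len(kids) < POP:
        p, q = random.sample(pop, 2)
        if p["skill"] == q["skill"] or random.random() < RMP:
            # Cross-task mating (probability RMP) is the channel through
            # which information is transferred between tasks.
            genes = mutate(crossover(p["genes"], q["genes"]))
            skill = random.choice((p, q))["skill"]
        else:
            genes = mutate(p["genes"])     # otherwise: mutated clone
            skill = p["skill"]
        kids.append(make(genes, skill))
    # Elitist survival per task keeps both subpopulations alive.
    pool = pop + kids
    pop = []
    for t in (0, 1):
        group = sorted((i for i in pool if i["skill"] == t),
                       key=lambda i: i["cost"])
        pop += group[:POP // 2]

for t in (0, 1):
    print(f"task {t} best:", min(i["cost"] for i in pop if i["skill"] == t))
```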

  7. Paleoanthropology and evolutionary theory.

    Science.gov (United States)

    Tattersall, Ian

    2012-01-01

    Paleoanthropologists of the first half of the twentieth century were little concerned either with evolutionary theory or with the technicalities and broader implications of zoological nomenclature. In consequence, the paleoanthropological literature of the period consisted largely of a series of descriptions accompanied by authoritative pronouncements, together with a huge excess of hominid genera and species. Given the intellectual flimsiness of the resulting paleoanthropological framework, it is hardly surprising that in 1950 the ornithologist Ernst Mayr met little resistance when he urged the new postwar generation of paleoanthropologists to accept not only the elegant reductionism of the Evolutionary Synthesis but a vast oversimplification of hominid phylogenetic history and nomenclature. Indeed, the impact of Mayr's onslaught was so great that even when developments in evolutionary biology during the last quarter of the century brought other paleontologists to the realization that much more has been involved in evolutionary histories than the simple action of natural selection within gradually transforming lineages, paleoanthropologists proved highly reluctant to follow. Even today, paleoanthropologists are struggling to reconcile an intuitive realization that the burgeoning hominid fossil record harbors a substantial diversity of species (bringing hominid evolutionary patterns into line with that of other successful mammalian families), with the desire to cram a huge variety of morphologies into an unrealistically minimalist systematic framework. As long as this theoretical ambivalence persists, our perception of events in hominid phylogeny will continue to be distorted.

  8. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  9. Language as an evolutionary system

    Science.gov (United States)

    Brighton, Henry; Smith, Kenny; Kirby, Simon

    2005-09-01

    John Maynard Smith and Eörs Szathmáry argued that human language signified the eighth major transition in evolution: human language marked a new form of information transmission from one generation to another [Maynard Smith J, Szathmáry E. The major transitions in evolution. Oxford: Oxford Univ. Press; 1995]. According to this view, language codes cultural information and as such forms the basis for the evolution of complexity in human culture. In this article we develop the theory that language also codes information in another sense: languages code information on their own structure. As a result, languages themselves provide information that influences their own survival. To understand the consequences of this theory we discuss recent computational models of linguistic evolution. Linguistic evolution is the process by which languages themselves evolve. This article draws together this recent work on linguistic evolution and highlights the significance of this process in understanding the evolution of linguistic complexity. Our conclusions are: (1) that the process of linguistic transmission constitutes the basis for an evolutionary system, and (2) that this evolutionary system is only superficially comparable to the process of biological evolution.
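
    The computational models the article draws on can be evoked with a toy iterated-learning sketch: a language is transmitted through a bottleneck of observed meaning-string pairs, and a compositional mapping survives transmission far better than a holistic one because learners can reconstruct unseen items. The syllables, meanings, and two-character encoding below are invented for illustration.

```python
import random

random.seed(4)
SYLLS = ["ka", "lu", "mi", "po", "re", "ta"]
MEANINGS = [(i, j) for i in range(3) for j in range(3)]
BOTTLENECK = 6  # each learner sees only 6 of the 9 meaning-string pairs

def compositional_language():
    a = random.sample(SYLLS, 3)   # syllable for the first meaning component
    b = random.sample(SYLLS, 3)   # syllable for the second component
    return {(i, j): a[i] + b[j] for i, j in MEANINGS}

def holistic_language():
    return {m: random.choice(SYLLS) + random.choice(SYLLS) for m in MEANINGS}

def learn(observed):
    """Try to decompose observed strings into per-component syllables;
    invent a random string for any meaning that stays unrecoverable."""
    first, second = {}, {}
    for (i, j), s in observed.items():
        first[i], second[j] = s[:2], s[2:]
    lang = {}
    for i, j in MEANINGS:
        if i in first and j in second:
            lang[(i, j)] = first[i] + second[j]
        else:
            lang[(i, j)] = random.choice(SYLLS) + random.choice(SYLLS)
    return lang

def transmit(lang, generations=10):
    """Fraction of the original language intact after 10 learners."""
    start = dict(lang)
    for _ in range(generations):
        sample = dict(random.sample(sorted(lang.items()), BOTTLENECK))
        lang = learn(sample)
    return sum(lang[m] == start[m] for m in MEANINGS) / len(MEANINGS)

print("compositional survives:", transmit(compositional_language()))
print("holistic survives:     ", transmit(holistic_language()))
```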

  10. Evolutionary development of tensegrity structures.

    Science.gov (United States)

    Lobo, Daniel; Vico, Francisco J

    2010-09-01

    Contributions from the emerging fields of molecular genetics and evo-devo (evolutionary developmental biology) are greatly benefiting the field of evolutionary computation, initiating a promise of renewal in the traditional methodology. While direct encoding has constituted a dominant paradigm, indirect ways to encode the solutions have been reported, yet little attention has been paid to the benefits of the proposed methods for real problems. In this work, we study the biological properties that emerge by means of using indirect encodings in the context of form-finding problems. A novel indirect encoding model for artificial development has been defined and applied to an engineering structural-design problem, specifically to the discovery of tensegrity structures. This model has been compared with a direct encoding scheme. While the direct encoding performs similarly well to the proposed method, indirect-based results typically outperform the direct-based results in aspects not directly linked to the nature of the problem itself, but to the emergence of properties found in biological organisms, like organicity, generalization capacity, or modularity, aspects which are highly valuable in engineering. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
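
    The modularity benefit of indirect encodings can be seen in a toy contrast (not the paper's developmental model): the genome is a set of rewrite rules that is "developed" into the structure, so a single rule mutation is reused coherently in every module, whereas a direct encoding would need one coordinated mutation per occurrence. The rules and element names below are invented for illustration.

```python
# Indirect encoding: a genome of rewrite rules grown into a structure.
rules = {"S": "ABA", "A": "strut", "B": "cable"}

def develop(symbol, rules):
    """Recursively expand a symbol into a flat list of primitive elements."""
    if rules[symbol] in ("strut", "cable"):
        return [rules[symbol]]
    return [e for s in rules[symbol] for e in develop(s, rules)]

print(develop("S", rules))   # ['strut', 'cable', 'strut']

# Mutating one rule changes every module that uses "A" at once; a direct
# encoding (the element list itself) would need two separate mutations.
rules["A"] = "cable"
print(develop("S", rules))   # ['cable', 'cable', 'cable']
```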

  11. Archaeogenetics in evolutionary medicine.

    Science.gov (United States)

    Bouwman, Abigail; Rühli, Frank

    2016-09-01

    Archaeogenetics is the study of ancient DNA (aDNA), i.e. DNA more than about 70 years old. It is an important part of the wider study of many different areas of our past, including animal, plant and pathogen evolution and domestication events. Here, we specifically address the impact of research in archaeogenetics on the broader field of evolutionary medicine. Studies of ancient hominid genomes help us to understand even modern health patterns. Human genetic microevolution, e.g. related to the ability of post-weaning milk consumption, and especially genetic adaptation in disease susceptibility, e.g. towards malaria and other infectious diseases, are of the utmost importance among the contributions of archaeogenetics to the evolutionary understanding of human health and disease. With the increase in both the understanding of modern medical genetics and the ability to deep sequence ancient genetic information, the field of archaeogenetic evolutionary medicine is blossoming.

  12. Evolutionary synthetic biology.

    Science.gov (United States)

    Peisajovich, Sergio G

    2012-06-15

    Signaling networks process vast amounts of environmental information to generate specific cellular responses. As cellular environments change, signaling networks adapt accordingly. Here, I will discuss how the integration of synthetic biology and directed evolution approaches is shedding light on the molecular mechanisms that guide the evolution of signaling networks. In particular, I will review studies that demonstrate how different types of mutations, from the replacement of individual amino acids to the shuffling of modular domains, lead to markedly different evolutionary trajectories and consequently to diverse network rewiring. Moreover, I will argue that intrinsic evolutionary properties of signaling proteins, such as the robustness of wild type functions, the promiscuous nature of evolutionary intermediates, and the modular decoupling between binding and catalysis, play important roles in the evolution of signaling networks. Finally, I will argue that rapid advances in our ability to synthesize DNA will radically alter how we study signaling network evolution at the genome-wide level.

  13. Total Integrative Evolutionary Communication

    DEFF Research Database (Denmark)

    Nedergaard Thomsen, Ole; Brier, Søren

    2014-01-01

    In this paper we outline a cybersemiotic foundation for the trend of pragmatics-based functional linguistics, Functional Discourse Grammar. Cybersemiotics is a substantial inter- and transdisciplinary semiotic theory which integrates, on the one hand, second-order cybernetics and autopoiesis theory and, on the other, Peircean biosemiotics. According to Cybersemiotics, language is primarily a creative process of total integrative evolutionary communication. It comprises three evolutionary stages: (1) biological reflexive languaging (the reflexive foundation of social coordination), (2) … In this inclusive hierarchy language games subsume the other stages, and thus human evolutionary communication is primarily a symbolic-conventional practice. It is intertwined with the practice of living, that is, with different life forms, including other forms of semiotic behavior. Together they form a coherent …

  14. Teaching evolutionary biology

    Directory of Open Access Journals (Sweden)

    Rosana Tidon

    2004-01-01

    Full Text Available Evolutionary Biology integrates several disciplines of Biology in a complex and interactive manner, where a deep understanding of the subject demands knowledge in diverse areas. Since this knowledge is often inaccessible to the majority of specialized professionals, including the teachers, we present some reflections in order to stimulate discussions aimed at the improvement of the conditions of education in this area. We examine the profile of evolutionary teaching in Brazil, based on questionnaires distributed to teachers in Secondary Education in the Federal District, on data provided by the "National Institute for Educational Studies and Research", and on information collected from teachers working in various regions of this country. Issues related to biological misconceptions, curriculum and didactic material are discussed, and some proposals are presented with the objective of aiding discussions aimed at the improvement of the teaching of evolutionary biology.

  15. Assessment of W1 and W2 theories for the computation of electron affinities, ionization potentials, heats of formation, and proton affinities

    OpenAIRE

    Parthiban, S.; Martin, Jan M. L.

    2001-01-01

    The performance of two recent ab initio computational thermochemistry schemes, W1 and W2 theory [J.M.L. Martin and G. de Oliveira, J. Chem. Phys. 111, 1843 (1999)], is assessed for an enlarged sample of thermochemical data consisting of the ionization potentials and electron affinities in the G2-1 and G2-2 sets, as well as the heats of formation in the G2-1 and a subset of the G2-2 set. We find W1 theory to be several times more accurate for ionization potentials and electron affinities...

  16. Evolutionary Design in Art

    Science.gov (United States)

    McCormack, Jon

    Evolution is one of the most interesting and creative processes we currently understand, so it should come as no surprise that artists and designers are embracing the use of evolution in problems of artistic creativity. The material in this section illustrates the diversity of approaches being used by artists and designers in relation to evolution at the boundary of art and science. While conceptualising human creativity as an evolutionary process in itself may be controversial, what is clear is that evolutionary processes can be used to complement, even enhance human creativity, as the chapters in this section aptly demonstrate.

  17. EVOLUTIONARY FOUNDATIONS FOR MOLECULAR MEDICINE

    Science.gov (United States)

    Nesse, Randolph M.; Ganten, Detlev; Gregory, T. Ryan; Omenn, Gilbert S.

    2015-01-01

    Evolution has long provided a foundation for population genetics, but many major advances in evolutionary biology from the 20th century are only now being applied in molecular medicine. They include the distinction between proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are further transforming evolutionary biology and creating yet more opportunities for progress at the interface of evolution with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and others to speed the development of evolutionary molecular medicine. PMID:22544168

  18. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353
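
    Among the evolutionary distances such packages offer is the classical Jukes-Cantor (JC69) correction of the p-distance. Below is a minimal sketch of that standard formula (illustrative only, not MEGA's code); the example sequences are invented.

```python
import math

def p_distance(seq1, seq2):
    """Proportion of differing sites (gapped sites ignored pairwise)."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != '-' and b != '-']
    return sum(a != b for a, b in pairs) / len(pairs)

def jukes_cantor(seq1, seq2):
    """JC69 evolutionary distance: d = -(3/4) ln(1 - 4p/3),
    the standard correction for multiple substitutions per site."""
    p = p_distance(seq1, seq2)
    if p >= 0.75:
        raise ValueError("p-distance too large for the JC69 correction")
    return -0.75 * math.log(1 - 4 * p / 3)

print(jukes_cantor("ACGTACGTAC", "ACGTACGTTT"))  # p = 0.2 -> d ~ 0.233
```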

  19. Creating Electronic Books-Chapters for Computers and Tablets Using Easy Java/JavaScript Simulations, EjsS Modeling Tool

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    This paper shares my journey (tools used, design principles derived, and modeling pedagogy implemented) in creating electronic book chapters (epub3 format) for computers and tablets using the Easy Java/JavaScript Simulations (old name EJS, new name EjsS) Modeling Tool. The theory underpinning this work is grounded in learning by doing through dynamic and interactive simulation models, which can be made sense of more easily than static printed materials. I started by combining related computer models with supporting texts and illustrations into coherent chapters, a logical next step towards tighter support for teachers and students, developing prototype electronic chapters on the topics of Simple Harmonic Motion and Gravity customized for the Singapore-Cambridge General Certificate of Education Advanced Level (A-level). I aim to inspire more educators to create interactive and open educational resources for the benefit of all. Prototypes: http://iwant2study.org/ospsg/index.php/interactive-resources/phy...

  20. When development matters: From evolutionary psychology to evolutionary developmental psychology

    OpenAIRE

    Hernández Blasi, Carlos; Gardiner, Amy K.; Bjorklund, David F.

    2008-01-01

    This article presents evolutionary developmental psychology (EDP) as an emerging field of evolutionary psychology (EP). In describing the core tenets of both approaches and the differences between them, we emphasize the important roles that evolution and development have in understanding human behaviour. We suggest that developmental psychologists should pay more attention to evolutionary issues and, conversely, evolutionary psychologists should take development seriously. Key words: evol...

  1. MELEC: Meta-Level Evolutionary Composer

    Directory of Open Access Journals (Sweden)

    Andres Calvo

    2011-02-01

    Full Text Available Genetic algorithms (GAs) are global search mechanisms that have been applied to many disciplines, including music composition. The computer system MELEC composes music using evolutionary computation on two levels: the object and the meta. At the object level, MELEC employs GAs to compose melodic motifs and iteratively refine them through evolving generations. At the meta level, MELEC forms the overall musical structure by concatenating the generated motifs in an order that depends on the evolutionary process. In other words, the structure of the music is determined by a genealogical traversal of the algorithm's execution sequence. In this implementation, we introduce a new data structure that tracks the execution of the GA, the Genetic Algorithm Traversal Tree, and use its traversal to define the musical structure. Moreover, we employ a Fibonacci-based fitness function to shape the melodic evolution.
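
    The abstract does not spell out the Fibonacci-based fitness function, so the sketch below uses an invented stand-in, rewarding melodic intervals whose size in semitones is a Fibonacci number, inside an otherwise ordinary object-level GA over 8-note motifs. Everything here is illustrative, not MELEC's implementation.

```python
import random

random.seed(2)
FIB = {1, 2, 3, 5, 8}          # Fibonacci-sized intervals, in semitones
MOTIF_LEN, POP, GENS = 8, 30, 80

def fitness(motif):
    """Stand-in Fibonacci-flavoured fitness: fraction of successive
    intervals whose absolute size is a Fibonacci number."""
    steps = [abs(b - a) for a, b in zip(motif, motif[1:])]
    return sum(s in FIB for s in steps) / len(steps)

def random_motif():
    return [random.randint(60, 72) for _ in range(MOTIF_LEN)]  # MIDI pitches

def crossover(a, b):
    cut = random.randrange(1, MOTIF_LEN)
    return a[:cut] + b[cut:]

def mutate(motif, rate=0.2):
    return [random.randint(60, 72) if random.random() < rate else n
            for n in motif]

pop = [random_motif() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                  # truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print(best, fitness(best))
```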

  2. The Computer-based Health Evaluation Software (CHES: a software for electronic patient-reported outcome monitoring

    Directory of Open Access Journals (Sweden)

    Holzner Bernhard

    2012-11-01

    Full Text Available Abstract Background Patient-reported Outcomes (PROs), capturing e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patients’ results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front, but was done in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients’ PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. Results By 2012, CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily

  3. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring

    Science.gov (United States)

    2012-01-01

    Background Patient-reported Outcomes (PROs), capturing e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patients’ results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front, but was done in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients’ PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. Results By 2012, CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and

  4. Dosimetric characterization and application of an imaging beam line with a carbon electron target for megavoltage cone beam computed tomography.

    Science.gov (United States)

    Flynn, Ryan T; Hartmann, Julia; Bani-Hashemi, Ali; Nixon, Earl; Siochi, R Alfredo C; Pennington, Edward C; Bayouth, John E

    2009-06-01

    Imaging dose from megavoltage cone beam computed tomography (MVCBCT) can be significantly reduced without loss of image quality by using an imaging beam line (IBL), with no flattening filter and a carbon, rather than tungsten, electron target. The IBL produces a greater keV-range x-ray fluence than the treatment beam line (TBL), which results in a more optimal detector response. The IBL imaging dose is not necessarily negligible, however. In this work an IBL was dosimetrically modeled with the Philips Pinnacle3 treatment planning system (TPS), verified experimentally, and applied to clinical cases. The IBL acquisition dose for a 200 degrees gantry rotation was verified in a customized acrylic cylindrical phantom at multiple imaging field sizes with 196 ion chamber measurements. Agreement between the measured and calculated IBL dose was quantified with the 3D gamma index. Representative IBL and TBL imaging dose distributions were calculated for head and neck and prostate patients and included in treatment plans using the imaging dose incorporation (IDI) method. Surface dose was measured for the TBL and IBL for four head and neck cancer patients with MOSFETs. The IBL model, when compared to the percentage depth dose and profile measurements, had 97% passing gamma indices for dosimetric and distance acceptance criteria of 3%, 3 mm, and 100% passed for 5.2%, 5.2 mm. For the ion chamber measurements of phantom image acquisition dose, the IBL model had 93% passing gamma indices for acceptance criteria of 3%, 3 mm, and 100% passed for 4%, 4 mm. Differences between the IBL- and TBL-based IMRT treatment plans created with the IDI method were dosimetrically insignificant for both the prostate and head and neck cases. For IBL and TBL beams with monitor unit values that would result in the delivery of the same dose to the depth of maximum dose under standard calibration conditions, the IBL imaging surface dose was higher than the TBL imaging surface dose by an average of 18
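
    The paper quantifies agreement with the 3D gamma index; a one-dimensional sketch shows the quantity being computed: each reference point searches nearby measured points for the smallest combined dose-difference/distance penalty, and passes when gamma is at most 1. The dose values below are illustrative; the 3%/3 mm tolerances match the paper's criteria.

```python
import math

def gamma_index(ref, meas, spacing, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma index. ref/meas: dose samples on a common grid
    with `spacing` (mm); dose_tol is a fraction of the reference maximum;
    dist_tol is in mm. A point passes when gamma <= 1."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, dm in enumerate(meas):
            dist = (i - j) * spacing
            ddose = dm - dr
            g2 = (dist / dist_tol) ** 2 + (ddose / (dose_tol * d_max)) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

ref = [0.2, 0.5, 1.0, 0.5, 0.2]          # illustrative dose profiles
meas = [0.2, 0.52, 0.98, 0.53, 0.21]
g = gamma_index(ref, meas, spacing=1.0)
print(f"pass rate: {sum(x <= 1 for x in g) / len(g):.0%}")
```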

  5. Origins of evolutionary transitions

    Indian Academy of Sciences (India)

    2014-03-15

    I define an evolutionary transition as a shift in the hierarchical level at which heritable fitness variance … life, for example in eusocial insects, around 150 million years ago. None of these transformations was … affecting and heritable trait, and to introduce a mechanism which inhibits them from subsequent …

  6. Evolutionary Stable Strategy

    Indian Academy of Sciences (India)

    IAS Admin

    After Maynard Smith and Price [1] mathematically derived why a given behaviour or strategy was adopted by a certain proportion of the population at a given time, it was shown that a strategy which is currently stable in a population need not be stable in evolutionary time (across generations). Additionally, it was sug…

  7. Evolutionary trends in Heteroptera

    NARCIS (Netherlands)

    Cobben, R.H.

    1968-01-01

    1. This work, the first volume of a series dealing with evolutionary trends in Heteroptera, is concerned with the egg system of about 400 species. The data are presented systematically in chapters 1 and 2 with a critical review of the literature after each family.

    2. Chapter 3 evaluates facts

  8. Towards Adaptive Evolutionary Architecture

    DEFF Research Database (Denmark)

    Bak, Sebastian HOlt; Rask, Nina; Risi, Sebastian

    2016-01-01

    This paper presents first results from an interdisciplinary project, in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how...

  9. Origins of evolutionary transitions.

    Science.gov (United States)

    Clarke, Ellen

    2014-04-01

    An 'evolutionary transition in individuality' or 'major transition' is a transformation in the hierarchical level at which natural selection operates on a population. In this article I give an abstract (i.e. level-neutral and substrate-neutral) articulation of the transition process in order to precisely understand how such processes can happen, especially how they can get started.

  10. Evolutionary pattern search algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hart, W.E.

    1995-09-19

    This paper defines a class of evolutionary algorithms called evolutionary pattern search algorithms (EPSAs) and analyzes their convergence properties. This class of algorithms is closely related to evolutionary programming, evolution strategies, and real-coded genetic algorithms. EPSAs are self-adapting systems that modify the step size of the mutation operator in response to the success of previous optimization steps. The rule used to adapt the step size can be used to provide a stationary-point convergence theory for EPSAs on any continuous function. This convergence theory is based on an extension of the convergence theory for generalized pattern search methods. An experimental analysis of the performance of EPSAs demonstrates that these algorithms can perform a level of global search that is comparable to that of canonical EAs. We also describe a stopping rule for EPSAs, which reliably terminated near stationary points in our experiments. This is the first stopping rule for any class of EAs that can terminate at a given distance from stationary points.
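
    EPSAs adapt the mutation step size to the success of previous steps. A classical rule in the same family (not the paper's exact update) is the 1/5th success rule of a (1+1) evolution strategy, sketched below; the update factors are chosen so that the step size is stationary at a one-in-five success rate.

```python
import math
import random

random.seed(3)

def sphere(x):
    return sum(xi * xi for xi in x)

def one_plus_one_es(f, x, sigma=1.0, iters=400):
    """(1+1)-ES with the classical 1/5th success rule: widen the mutation
    step after a success, narrow it after a failure, so that roughly one
    in five mutations succeeds in the long run."""
    fx = f(x)
    for _ in range(iters):
        y = [xi + sigma * random.gauss(0, 1) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
            sigma *= math.exp(0.2)    # success: widen the search
        else:
            sigma *= math.exp(-0.05)  # failure: narrow it
            # The factors balance ((1/5)*0.2 = (4/5)*0.05) at a 1/5 rate.
    return x, fx, sigma

x, fx, sigma = one_plus_one_es(sphere, [5.0] * 4)
print(f"f(x) = {fx:.3e}, final step size = {sigma:.3e}")
```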

  11. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  12. Evolutionary mysteries in meiosis

    NARCIS (Netherlands)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E.; Wijnker, Erik; Haag, Christoph R.

    2016-01-01

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these

  13. Editorial overview: Evolutionary psychology

    NARCIS (Netherlands)

    Gangestad, S.W.; Tybur, J.M.

    2016-01-01

    Functional approaches in psychology - which ask what behavior is good for - are almost as old as scientific psychology itself. Yet sophisticated, generative functional theories were not possible until developments in evolutionary biology in the mid-20th century. Arising in the last three decades,

  14. Evolutionary Developmental Psychology.

    Science.gov (United States)

    Geary, David C.; Bjorklund, David F.

    2000-01-01

    Describes evolutionary developmental psychology as the study of the genetic and ecological mechanisms that govern the development of social and cognitive competencies common to all human beings and the epigenetic (gene-environment interactions) processes that adapt these competencies to local conditions. Outlines basic assumptions and domains of…

  15. Origins of evolutionary transitions

    Indian Academy of Sciences (India)

    An `evolutionary transition in individuality' or `major transition' is a transformation in the hierarchical level at which natural selection operates on a population. In this article I give an abstract (i.e. level-neutral and substrate-neutral) articulation of the transition process in order to precisely understand how such processes can ...

  16. Evolutionary Theory under Fire.

    Science.gov (United States)

    Lewin, Roger

    1980-01-01

    Summarizes events of a conference on evolutionary biology in Chicago entitled: "Macroevolution." Reviews the theory of modern synthesis, a term used to explain Darwinism in terms of population biology and genetics. Issues presented at the conference are discussed in detail. (CS)

  17. Evolutionary developmental psychology.

    Science.gov (United States)

    King, Ashley C; Bjorklund, David F

    2010-02-01

    The field of evolutionary developmental psychology can potentially broaden the horizons of mainstream evolutionary psychology by combining the principles of Darwinian evolution by natural selection with the study of human development, focusing on the epigenetic effects that occur between humans and their environment in a way that attempts to explain how evolved psychological mechanisms become expressed in the phenotypes of adults. An evolutionary developmental perspective includes an appreciation of comparative research and we, among others, argue that contrasting the cognition of humans with that of nonhuman primates can provide a framework with which to understand how human cognitive abilities and intelligence evolved. Furthermore, we argue that several aspects of childhood (e.g., play and immature cognition) serve both as deferred adaptations and as sources of immediate benefits. Intense selection pressure was surely exerted on childhood over human evolutionary history and, as a result, neglecting to consider the early developmental period of children when studying their later adulthood produces an incomplete picture of the evolved adaptations expressed through human behavior and cognition.

  18. Evolutionary Theories of Detection

    Energy Technology Data Exchange (ETDEWEB)

    Fitch, J P

    2005-04-29

    Current, mid-term and long range technologies for detection of pathogens and toxins are briefly described in the context of performance metrics and operational scenarios. Predictive (evolutionary) and speculative (revolutionary) assessments are given with trade-offs identified, where possible, among competing performance goals.

  19. Effectiveness of stringent decontamination of computer input devices in the era of electronic medical records and bedside computing: a randomized controlled trial.

    Science.gov (United States)

    Codish, Shlomi; Toledano, Ronen; Novack, Victor; Sherf, Michael; Borer, Abraham

    2015-06-01

    Bedside computing may lead to increased hospital-acquired infections mediated by computer input devices handled immediately after patient contact. We compared 2 decontamination methods in 2 types of wards. We found high baseline contamination rates, which decreased following decontamination, but the rates remained unacceptably high. Decontamination was more effective in intensive care units compared with medical wards and when using alcohol-based impregnated wipes compared with quaternary ammonium-based impregnated wipes. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  20. A primary care physician perspective survey on the limited use of handwriting and pen computing in the electronic medical record

    Directory of Open Access Journals (Sweden)

    Gary Arvary

    2002-09-01

    The use of handwriting in the EMR was broadly supported by this group of PCPs in private practice. Likewise, wireless pen computers were the overwhelming choice of computer for use during a consultation. In this group, older and lower volume physicians were less likely to desire a computer for use during a consultation. User acceptance of the EMR may be related to how closely it resembles the processes that are being automated. More surveys are required to determine the needs and expectations of physicians. The data also support other research studies that demonstrate the preference for handwriting and wireless computers, and the need for a limited, standardised and controlled vocabulary.

  1. Evolutionary Thinking in Environmental Economics

    NARCIS (Netherlands)

    van den Bergh, J.C.J.M.

    2007-01-01

    Evolutionary and environmental economics have a potentially close relationship. This paper reviews past and identifies potential applications of evolutionary concepts and methods to environmental economics. This covers a number of themes: resource use and ecosystem management; growth and

  2. Stability of television viewing and electronic game/computer use in a prospective cohort study of Australian children: relationship with body mass index

    Directory of Open Access Journals (Sweden)

    Graham Melissa

    2007-11-01

    Full Text Available Abstract Background While much cross-sectional data is available, there have been few longitudinal investigations of patterns of electronic media use in children. Further, the possibility of a bi-directional relationship between electronic media use and body mass index in children has not been considered. This study aimed to describe longitudinal patterns of television viewing and electronic game/computer use, and investigate relationships with body mass index (BMI). Methods This prospective cohort study was conducted in elementary schools in Victoria, Australia. 1278 children aged 5–10 years at baseline and 8–13 years at follow-up had their BMI calculated from measured height and weight, and transformed to z-scores based on US 2000 growth data. Weight status (non-overweight, overweight and obese) was based on international BMI cut-off points. Weekly television viewing and electronic game/computer use were reported by parents; these were summed to generate total weekly screen time. Children were classified as meeting electronic media use guidelines if their total screen time was ≤14 hrs/wk. Results Electronic media use increased over the course of the study; 40% met guidelines at baseline but only 18% three years later. Television viewing and electronic game/computer use tracked moderately, and total screen time was positively associated with adiposity cross-sectionally. While weaker relationships with adiposity were observed longitudinally, baseline z-BMI and weight status were positively associated with follow-up screen time, and baseline screen time was positively associated with z-BMI and weight status at follow-up. Children who did not meet guidelines at baseline had significantly higher z-BMI and were more likely to be classified as overweight/obese at follow-up. Conclusion Electronic media use in Australian elementary school children is high, increases with age and tracks over time. There appears to be a bi-directional association

  3. ALR-46 Computer Graphics System for the Robins Air Force Base Electronic Warfare Division Engineering Branch Laboratory.

    Science.gov (United States)

    1981-12-01

    EW (Electronic Warfare), EWAISF (Electronic Warfare Avionics Integration Support Facility), EWOLS (Electronic Warfare Open Loop Simulator), GRA (Graphics Command), HIPO ... SofTech's Structured Analysis and Design Technique (SADT) (Ref 16) and IBM's Hierarchical Input-Process-Output (HIPO) diagrams (Ref 8). Structured

  4. Electronic Commerce

    OpenAIRE

    Slavko Đerić

    2016-01-01

    Electronic commerce can be defined in different ways. Any definition helps to understand and explain the concept as well as possible. Electronic commerce is a set of procedures and technologies that automate the tasks of financial transactions using electronic means. Also, according to some authors, electronic commerce is defined as a new concept, which is still developing and which includes the process of buying and selling or exchanging products, services or information via computer networks...

  5. The association between computer literacy and training on clinical productivity and user satisfaction in using the electronic medical record in Saudi Arabia.

    Science.gov (United States)

    Alasmary, May; El Metwally, Ashraf; Househ, Mowafa

    2014-08-01

    The association of computer literacy and training with the clinical productivity and user satisfaction of a recently implemented Electronic Medical Record (EMR) system in Prince Sultan Medical Military City (PSMMC) was investigated. The scope of this study was to explore the association between age, occupation and computer literacy and the clinical productivity and users' satisfaction with the newly implemented EMR at PSMMC, as well as the association of user satisfaction with age and position. A self-administered questionnaire was distributed to all doctors and nurses working in Alwazarat Family and Community Center (a health center in PSMMC). A convenience sample of 112 healthcare providers (65 nurses and 47 physicians) completed the questionnaire. A combination of correlation, one-way ANOVA and t-tests was used to answer the research questions. Participants had high levels of self-reported computer literacy and satisfaction with the system. Both levels were higher among physicians than among nurses. A moderate but significant (at p < 0.05) correlation was found between computer literacy and users' satisfaction with the system (R = 0.343). Age was weakly but significantly (at p < 0.05) associated with satisfaction, and users with high computer literacy skills were more satisfied with using the EMR than users with low computer literacy skills.

  6. Using the Electronic Industry Code of Conduct to Evaluate Green Supply Chain Management: An Empirical Study of Taiwan’s Computer Industry

    Directory of Open Access Journals (Sweden)

    Ching-Ching Liu

    2015-03-01

    Full Text Available Electronics companies throughout Asia recognize the benefits of Green Supply Chain Management (GSCM) for gaining competitive advantage. A large majority of electronics companies in Taiwan have recently adopted the Electronic Industry Citizenship Coalition (EICC) Code of Conduct for defining and managing their social and environmental responsibilities throughout their supply chains. We surveyed 106 Tier 1 suppliers to the Taiwanese computer industry to determine their environmental performance using the EICC Code of Conduct (EICC Code) and performed Analysis of Variance (ANOVA) on the 63/106 questionnaire responses collected. We test the results to determine whether differences in product type, geographic area, and supplier size correlate with different levels of environmental performance. To our knowledge, this is the first study to analyze questionnaire data on supplier adoption to optimize the implementation of GSCM. The results suggest that characteristic classification of suppliers could be employed to enhance the efficiency of GSCM.

  7. Getting on with your computer is associated with job satisfaction in primary care: entrants to primary care should be assessed for their competency with electronic patient record systems

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2014-02-01

    Full Text Available Job satisfaction in primary care is associated with getting on with your computer. Many primary care professionals spend longer interacting with their computer than with anything else in their day. However, the computer often makes demands rather than being an aid or supporter that has learned its user's preferences. The use of electronic patient record (EPR) systems is underrepresented in the assessment of entrants to primary care, and in definitions of the core competencies of a family physician/general practitioner. We call for this to be put right: for the use of the EPR to support direct patient care and clinical governance to be given greater prominence in training and assessment. In parallel, policy makers should ensure that EPR system use is orientated towards ensuring patients receive evidence-based care, and EPR system suppliers should explore how their systems might better support their clinician users, in particular by learning their preferences.

  8. Evolutionary design assistants for architecture

    Directory of Open Access Journals (Sweden)

    N. Onur Sönmez

    2015-04-01

    Full Text Available In its parallel pursuit of increased competitivity for design offices and more pleasurable and easier workflows for designers, artificial design intelligence is a technical, intellectual, and political challenge. While human-machine cooperation has become commonplace through Computer Aided Design (CAD) tools, improved collaboration and better support appear possible only through an endeavor into a kind of artificial design intelligence which is more sensitive to the human perception of affairs. Considered as part of the broader Computational Design studies, the research program of this quest can be called Artificial / Autonomous / Automated Design (AD). The currently available level of Artificial Intelligence (AI) for design is limited, and a viable aim for current AD would be to develop design assistants that are capable of producing drafts for various design tasks. Thus, the overall aim of this thesis is the development of approaches, techniques, and tools towards artificial design assistants that offer a capability for generating drafts for sub-tasks within design processes. The main technology explored for this aim is Evolutionary Computation (EC), and the target design domain is architecture. The two connected research questions of the study concern, first, the investigation of the ways to develop an architectural design assistant, and secondly, the utilization of EC for the development of such assistants. While developing approaches, techniques, and computational tools for such an assistant, the study also carries out a broad theoretical investigation into the main problems, challenges, and requirements towards such assistants on a rather overall level. Therefore, the research is shaped as a parallel investigation of three main threads interwoven along several levels, moving from a more general level to specific applications. The three research threads comprise, first, theoretical discussions and speculations with regard to both existing

  10. Computation of electron transport and relaxation properties in gases based on improved multi-term approximation of Boltzmann equation

    Science.gov (United States)

    Cai, X. J.; Wang, X. X.; Zou, X. B.; Lu, Z. W.

    2018-01-01

    An understanding of electron kinetics is of importance in various applications of low temperature plasmas. We employ a series of model and real gases to investigate electron transport and relaxation properties based on an improved multi-term approximation of the Boltzmann equation. First, a comparison of different methods to calculate the interaction integrals has been carried out; the effects of free parameters, such as v_max, l_max, and the arbitrary temperature T_b, on the convergence of electron transport coefficients are analyzed. Then, the modified attachment model of Ness et al. and SF6 are considered to investigate the effect of attachment on the electron transport properties. The deficiency of the pulsed Townsend technique to measure the electron transport and reaction coefficients in electronegative gases is highlighted when the reduced electric field is small. In order to investigate the effect of an external magnetic field on the electron transport properties, Ar plasmas in high power impulse sputtering devices are considered. In the end, the electron relaxation properties of the Reid model under the influence of electric and magnetic fields are demonstrated.

  11. Using diagnostic radiology in human evolutionary studies

    Science.gov (United States)

    SPOOR, FRED; JEFFERY, NATHAN; ZONNEVELD, FRANS

    2000-01-01

    This paper reviews the application of medical imaging and associated computer graphics techniques to the study of human evolutionary history, with an emphasis on basic concepts and on the advantages and limitations of each method. Following a short discussion of plain film radiography and pluridirectional tomography, the principles of computed tomography (CT) and magnetic resonance imaging (MRI) and their role in the investigation of extant and fossil morphology are considered in more detail. The second half of the paper deals with techniques of 3-dimensional visualisation based on CT and MRI and with quantitative analysis of digital images. PMID:10999271

  12. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  13. Studies in evolutionary agroecology

    DEFF Research Database (Denmark)

    Wille, Wibke

    Darwinian evolution by natural selection is driven primarily by differential survival and reproduction among individuals in a population. When the evolutionary interest of an individual is in conflict with the interests of the population, the genes increasing individual fitness at the cost...... performance are not in conflict, it is unlikely that plant breeding can radically improve the results of millions of years of evolution through natural selection. However, efforts to improve crops can be very successful, when breeding is directed towards goals diverging from natural selection. The potential...... of Evolutionary Agroecology that the highest yielding individuals do not necessarily perform best as a population. The investment of resources into strategies and structures increasing individual competitive ability carries a cost. If a whole population consists of individuals investing resources to compete...

  14. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing the general constrained optimization problems using evolutionary algorithms. Broadly the topics covered include constraint handling for single and multi-objective optimizations; penalty function based methodology; multi-objective based methodology; new constraint handling mechanisms; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mixed variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environments. There is also a separate chapter on hybrid optimization, which is gaining lots of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novices, and experts alike. The book will also be useful...

  15. Automated generation of patient-tailored electronic care pathways by translating computer-interpretable guidelines into hierarchical task networks

    NARCIS (Netherlands)

    González-Ferrer, A.; ten Teije, A.C.M.; Fdez-Olivares, J.; Milian, K.

    OBJECTIVE: This paper describes a methodology which enables computer-aided support for the planning, visualization and execution of personalized patient treatments in a specific healthcare process, taking into account complex temporal constraints and the allocation of institutional resources. To

  16. Anxiety: an evolutionary approach.

    OpenAIRE

    Bateson, M; Brilot, B; Nettle, D.

    2011-01-01

    Anxiety disorders are among the most common mental illnesses, with huge attendant suffering. Current treatments are not universally effective, suggesting that a deeper understanding of the causes of anxiety is needed. To understand anxiety disorders better, it is first necessary to understand the normal anxiety response. This entails considering its evolutionary function as well as the mechanisms underlying it. We argue that the function of the human anxiety response, and homologues in other ...

  17. The Evolutionary Origins of Hierarchy.

    Directory of Open Access Journals (Sweden)

    Henok Mengistu

    2016-06-01

    Full Text Available Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.
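
    A connection cost can be folded into an evolutionary fitness function in a few lines. The sketch below only illustrates that fitness shaping on a toy weight-vector task; the paper evolves networks, and the task, mutation operator, and cost constant here are invented for illustration:

      import random

      TARGET = [1.0, 0.0, 0.0, 2.0]       # sparse 'task': only two useful inputs
      LAMBDA = 0.05                        # cost per connection

      def fitness(w):
          err = sum((wi - ti) ** 2 for wi, ti in zip(w, TARGET))
          links = sum(1 for wi in w if abs(wi) > 1e-3)
          return -err - LAMBDA * links     # task performance minus connection cost

      def mutate(w, sigma=0.2):
          w = w[:]
          i = random.randrange(len(w))
          w[i] += random.gauss(0, sigma)
          if random.random() < 0.1:        # occasional pruning move
              w[i] = 0.0
          return w

      pop = [[random.gauss(0, 1) for _ in range(4)] for _ in range(50)]
      for _ in range(300):                 # truncation selection plus mutation
          pop.sort(key=fitness, reverse=True)
          pop = pop[:25] + [mutate(random.choice(pop[:25])) for _ in range(25)]
      print(pop[0], fitness(pop[0]))

    With LAMBDA set to zero the surviving weight vectors keep many small nonzero entries; with the cost in place the unused connections are pruned away, which is the qualitative effect the abstract describes.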

  18. Asymmetric Evolutionary Games.

    Directory of Open Access Journals (Sweden)

    Alex McAvoy

    2015-08-01

    Full Text Available Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner's Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games.
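
    One way to see what 'ecological asymmetry' means computationally is to give each player a site-specific environmental multiplier on the benefit it receives. The sketch below is a deliberately simple well-mixed imitation dynamic, not the structured-population analysis of the paper; the donation-game payoffs, the Fermi update rule, and all constants are illustrative assumptions:

      import math
      import random

      N, B, C = 100, 3.0, 1.0
      env = [random.uniform(0.5, 1.5) for _ in range(N)]    # ecological asymmetry
      strat = [random.randint(0, 1) for _ in range(N)]      # 1 = cooperate

      def payoff(i, j):
          """Payoff to i from meeting j; the benefit is scaled by i's environment."""
          return env[i] * B * strat[j] - C * strat[i]

      def avg_payoff(i, samples=5):
          return sum(payoff(i, k) for k in random.sample(range(N), samples)) / samples

      for _ in range(20000):
          i, j = random.sample(range(N), 2)
          # pairwise-comparison imitation: i copies j with a Fermi probability
          if random.random() < 1.0 / (1.0 + math.exp(avg_payoff(i) - avg_payoff(j))):
              strat[i] = strat[j]
      print('cooperator fraction:', sum(strat) / N)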

  19. Evolutionary theory and teleology.

    Science.gov (United States)

    O'Grady, R T

    1984-04-21

    The order within and among living systems can be explained rationally by postulating a process of descent with modification, effected by factors which are extrinsic or intrinsic to the organisms. Because at the time Darwin proposed his theory of evolution there was no concept of intrinsic factors which could evolve, he postulated a process of extrinsic effects: natural selection. Biological order was thus seen as an imposed, rather than an emergent, property. Evolutionary change was seen as being determined by the functional efficiency (adaptedness) of the organism in its environment, rather than by spontaneous changes in intrinsically generated organizing factors. The initial incompleteness of Darwin's explanatory model, and the axiomatization of its postulates in neo-Darwinism, has resulted in a theory of functionalism, rather than structuralism. As such, it introduces an unnecessary teleology which confounds evolutionary studies and reduces the usefulness of the theory. This problem cannot be detected from within the neo-Darwinian paradigm because the different levels of end-directed activity (teleomatic, teleonomic, and teleological) are not recognized. They are, in fact, considered to influence one another. The theory of nonequilibrium evolution avoids these problems by returning to the basic principles of biological order and developing a structuralist explanation of intrinsically generated change. Extrinsic factors may affect the resultant evolutionary pattern, but they are neither necessary nor sufficient for evolution to occur.

  20. Evolutionary mysteries in meiosis.

    Science.gov (United States)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E; Wijnker, Erik; Haag, Christoph R

    2016-10-19

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these often 'weird' features. We discuss the origin of meiosis (origin of ploidy reduction and recombination, two-step meiosis), its secondary modifications (in polyploids or asexuals, inverted meiosis), its importance in punctuating life cycles (meiotic arrests, epigenetic resetting, meiotic asymmetry, meiotic fairness) and features associated with recombination (disjunction constraints, heterochiasmy, crossover interference and hotspots). We present the various evolutionary scenarios and selective pressures that have been proposed to account for these features, and we highlight that their evolutionary significance often remains largely mysterious. Resolving these mysteries will likely provide decisive steps towards understanding why sex and recombination are found in the majority of eukaryotes. This article is part of the themed issue 'Weird sex: the underappreciated diversity of sexual reproduction'. © 2016 The Author(s).

  1. Asymmetric Evolutionary Games.

    Science.gov (United States)

    McAvoy, Alex; Hauert, Christoph

    2015-08-01

    Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner's Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games.

  2. Practical electronics handbook

    CERN Document Server

    Sinclair, Ian R

    2013-01-01

    Practical Electronics Handbook, Third Edition provides the frequently used and highly applicable principles of electronics and electronic circuits. The book contains relevant information in electronics. The topics discussed in the text include passive and active discrete components; linear and digital I.C.s; microprocessors and microprocessor systems; digital-analogue conversions; computer aids in electronics design; and electronic hardware components. Electronic circuit constructors, service engineers, electronic design engineers, and anyone with an interest in electronics will find the book ve

  3. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  4. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    Science.gov (United States)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program/project's risk management processes. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool called ePORT. The electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program/project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  5. Computational intelligence in optimization

    CERN Document Server

    Tenne, Yoel

    2010-01-01

    This volume presents a collection of recent studies covering the spectrum of computational intelligence applications with emphasis on their application to challenging real-world problems. Topics covered include: Intelligent agent-based algorithms, Hybrid intelligent systems, Cognitive and evolutionary robotics, Knowledge-Based Engineering, fuzzy sets and systems, Bioinformatics and Bioengineering, Computational finance and Computational economics, Data mining, Machine learning, and Expert systems. "Computational Intelligence in Optimization" is a comprehensive reference for researchers, prac

  6. Excited-state intramolecular hydrogen transfer (ESIHT) of 1,8-Dihydroxy-9,10-anthraquinone (DHAQ) characterized by ultrafast electronic and vibrational spectroscopy and computational modeling

    KAUST Repository

    Mohammed, Omar F.

    2014-05-01

    We combine ultrafast electronic and vibrational spectroscopy and computational modeling to investigate the photoinduced excited-state intramolecular hydrogen-transfer dynamics in 1,8-dihydroxy-9,10-anthraquinone (DHAQ) in tetrachloroethene, acetonitrile, dimethyl sulfoxide, and methanol. We analyze the electronic excited states of DHAQ with various possible hydrogen-bonding schemes and provide a general description of the electronic excited-state dynamics based on a systematic analysis of femtosecond UV/vis and UV/IR pump-probe spectroscopic data. Upon photoabsorption at 400 nm, the S2 electronic excited state is initially populated, followed by a rapid equilibration within 150 fs through population transfer to the S1 state where DHAQ exhibits ESIHT dynamics. In this equilibration process, the excited-state population is distributed between the 9,10-quinone (S2) and 1,10-quinone (S1) states while undergoing vibrational energy redistribution, vibrational cooling, and solvation dynamics on the 0.1-50 ps time scale. Transient UV/vis pump-probe data in methanol also suggest additional relaxation dynamics on the subnanosecond time scale, which we tentatively ascribe to hydrogen bond dynamics of DHAQ with the protic solvent, affecting the equilibrium population dynamics within the S2 and S1 electronic excited states. Ultimately, the two excited singlet states decay with a solvent-dependent time constant ranging from 139 to 210 ps. The concomitant electronic ground-state recovery is, however, only partial because a large fraction of the population relaxes to the first triplet state. From the similarity of the time scales involved, we conjecture that the solvent plays a crucial role in breaking the intramolecular hydrogen bond of DHAQ during the S2/S1 relaxation to either the ground or triplet state. © 2014 American Chemical Society.

  7. THE DESIGNING OF ELECTRONIC TEACHING-METHODS COMPLEX «GRAPHICS» FOR REALIZATION OF COMPUTER-BASED LEARNING OF ENGINEERING-GRAPHIC DISCIPLINES

    Directory of Open Access Journals (Sweden)

    Іван Нищак

    2015-12-01

    Full Text Available The article presents the theoretical foundations of the design of the author's electronic educational-methodical complex (EEMC) «Graphics», intended to implement the engineering-graphic preparation of future teachers of technology in terms of computer-based learning. The process of designing the electronic educational-methodical complex «Graphics» includes the following successive stages: 1) identification of didactic goals and objectives; 2) design of the EEMC patterns; 3) selection of contents and systematization of educational material; 4) program-technical implementation of the EEMC; 5) interface design; 6) expert assessment of the quality of the EEMC; 7) testing of the EEMC; 8) adjusting the software; 9) development of guidelines and instructions for the use of the EEMC.

  8. Electronic Devices, Methods, and Computer Program Products for Selecting an Antenna Element Based on a Wireless Communication Performance Criterion

    DEFF Research Database (Denmark)

    2014-01-01

    A method of operating an electronic device includes providing a plurality of antenna elements, evaluating a wireless communication performance criterion to obtain a performance evaluation, and assigning a first one of the plurality of antenna elements to a main wireless signal reception and trans...

  9. Computer simulations analysis for determining the polarity of charge generated by high energy electron irradiation of a thin film

    DEFF Research Database (Denmark)

    Malac, Marek; Hettler, Simon; Hayashida, Misa

    2017-01-01

    Detailed simulations are necessary to correctly interpret the charge polarity of an electron-beam-irradiated thin-film patch. Relying on systematic simulations, we provide guidelines and movies for interpreting experimentally the polarity of the charged area, to be understood as the sign of the electrost...

  10. The Effect of Distributing Electronic Notes to Students: Ethical Considerations Raised by Computer Science Faculty at the University of Namibia

    Science.gov (United States)

    Mufeti, Tulimevava Kaunapawa; Mbale, Jameson; Suresh, Nalina

    2011-01-01

    In an effort to encourage the uptake of technology among its academic community, the University of Namibia (UNAM) introduced the Electronic Notes System (ENS) in the year 2010. The ENS was envisaged as a web-based method of distributing lecture notes to students, where the faculty members would upload the teaching materials and the students would…

  11. Computational Search for Two-Dimensional MX2 Semiconductors with Possible High Electron Mobility at Room Temperature

    Directory of Open Access Journals (Sweden)

    Zhishuo Huang

    2016-08-01

    Full Text Available Neither of the two typical two-dimensional materials, graphene and single-layer MoS2, is good enough for developing semiconductor logical devices. We calculated the electron mobility of 14 two-dimensional semiconductors with composition MX2, where M (= Mo, W, Sn, Hf, Zr and Pt) are transition metals, and X = S, Se and Te. We approximated the electron-phonon scattering matrix by deformation potentials, within which long-wave longitudinal acoustical and optical phonon scatterings were included. Piezoelectric scattering in the compounds without inversion symmetry is also taken into account. We found that out of the 14 compounds, WS2, PtS2 and PtSe2 are promising for logical devices regarding the possible high electron mobility and finite band gap. Especially, the phonon-limited electron mobility in PtSe2 reaches about 4000 cm²·V⁻¹·s⁻¹ at room temperature, which is the highest among the compounds with an indirect bandgap of about 1.25 eV under the local density approximation. Our results can be the first guide for experiments to synthesize better two-dimensional materials for future semiconductor devices.
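
    For orientation, phonon-limited mobilities of this kind are often estimated with a Bardeen-Shockley-type deformation-potential expression for 2D crystals, mu_2D = e·hbar³·C_2D / (k_B·T·m*·m_d·E1²). The snippet below evaluates that expression with placeholder numbers; the elastic modulus C_2D, effective masses m*, m_d and deformation potential E1 used here are invented, not the paper's computed inputs:

      # Bardeen-Shockley-type deformation-potential mobility for a 2D crystal:
      #   mu_2D = e * hbar**3 * C_2D / (kB * T * m_e * m_d * E1**2)
      # C_2D: 2D elastic modulus [N/m]; m_e: transport effective mass;
      # m_d = sqrt(m_x * m_y): density-of-states mass; E1: deformation potential.
      e, hbar, kB, m0 = 1.602e-19, 1.055e-34, 1.381e-23, 9.109e-31

      def mu_2d(C2d, m_e, m_d, E1_eV, T=300.0):
          E1 = E1_eV * e                   # deformation potential in joules
          mu = e * hbar**3 * C2d / (kB * T * m_e * m_d * E1**2)
          return mu * 1e4                  # m^2/(V s) -> cm^2/(V s)

      # illustrative numbers only (placeholders, not fitted parameters)
      print(mu_2d(C2d=100.0, m_e=0.25 * m0, m_d=0.25 * m0, E1_eV=1.0))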

  12. Experimentally calibrated computational chemistry of tryptophan hydroxylase: Trans influence, hydrogen-bonding, and 18-electron rule govern O-2-activation

    DEFF Research Database (Denmark)

    Haahr, Lærke Tvedebrink; Kepp, Kasper Planeta; Boesen, Jane

    2010-01-01

    a two-state scenario involving Ohis and Pglu is possible. A structure of the activated deoxy state which is high-spin implies that the valence electron count has been lowered from 18 to 16 (glutamate becomes bidentate), giving a “green light” that invites O2-binding. Our mechanism of oxygen activation...

  13. Turbopump Performance Improved by Evolutionary Algorithms

    Science.gov (United States)

    Oyama, Akira; Liou, Meng-Sing

    2002-01-01

    The development of design optimization technology for turbomachinery has been initiated using multiobjective evolutionary algorithms under NASA's Intelligent Synthesis Environment and Revolutionary Aeropropulsion Concepts programs. As an alternative to the traditional gradient-based methods, evolutionary algorithms (EAs) are emergent design-optimization algorithms modeled after the mechanisms found in natural evolution. EAs search from multiple points, instead of moving from a single point. In addition, they require no derivatives or gradients of the objective function, leading to robustness and simplicity in coupling any evaluation codes. Parallel efficiency also becomes very high by using a simple master-slave concept for function evaluations, since such evaluations, for example computational fluid dynamics runs, often consume the most CPU time. Application of EAs to multiobjective design problems is also straightforward because EAs maintain a population of design candidates in parallel. Because of these advantages, EAs are a unique and attractive approach to real-world design optimization problems.
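
    Because the population is evaluated collectively, a multiobjective EA can rank candidates by Pareto dominance. A minimal sketch of the non-domination test, for two illustrative minimization objectives (not the NASA codes' actual ranking scheme):

      def dominates(a, b):
          """a dominates b if it is no worse in every objective and strictly
          better in at least one (minimization)."""
          return (all(x <= y for x, y in zip(a, b))
                  and any(x < y for x, y in zip(a, b)))

      def pareto_front(points):
          return [p for p in points
                  if not any(dominates(q, p) for q in points if q != p)]

      # usage: a toy trade-off between two objectives
      pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
      print(pareto_front(pts))   # -> [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]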

  14. An Evolutionary Model of DNA Substring Distribution

    Science.gov (United States)

    Kull, Meelis; Tretyakov, Konstantin; Vilo, Jaak

    DNA sequence analysis methods, such as motif discovery, gene detection or phylogeny reconstruction, can often provide important input for biological studies. Many such methods require a background model, representing the expected distribution of short substrings in a given DNA region. Most current techniques for modeling this distribution disregard the evolutionary processes underlying DNA formation. We propose a novel approach for modeling DNA k-mer distribution that is capable of taking the notions of evolution and natural selection into account. We derive a computationally tractable approximation for estimating k-mer probabilities at genetic equilibrium, given a description of evolutionary processes in terms of fitness and mutation probabilities. We assess the goodness of this approximation via numerical experiments. Besides providing a generative model for DNA sequences, our method has further applications in motif discovery.
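
    The quantity being approximated can also be estimated by brute force: simulate selection and mutation to near-equilibrium and count k-mers. The sketch below does exactly that for toy settings; the fitness function, mutation rate, and population size are invented for illustration, and the paper's contribution is precisely an analytic approximation that avoids such simulation:

      import random
      from collections import Counter

      ALPHABET, L, K, N, MU = 'ACGT', 20, 3, 200, 0.01

      def fitness(seq):
          return 1.0 + 0.5 * seq.count('AT')   # toy selection favouring 'AT'

      def mutate(seq):
          return ''.join(random.choice(ALPHABET) if random.random() < MU else c
                         for c in seq)

      pop = [''.join(random.choice(ALPHABET) for _ in range(L)) for _ in range(N)]
      for _ in range(500):          # run toward (approximate) equilibrium
          weights = [fitness(s) for s in pop]
          pop = [mutate(random.choices(pop, weights)[0]) for _ in range(N)]

      kmers = Counter(s[i:i + K] for s in pop for i in range(L - K + 1))
      total = sum(kmers.values())
      print({k: round(v / total, 4) for k, v in kmers.most_common(5)})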

  15. Evolutionary dynamics on any population structure

    Science.gov (United States)

    Allen, Benjamin; Lippner, Gabor; Chen, Yu-Ting; Fotouhi, Babak; Momeni, Naghmeh; Yau, Shing-Tung; Nowak, Martin A.

    2017-03-01

    Evolution occurs in populations of reproducing individuals. The structure of a population can affect which traits evolve. Understanding evolutionary game dynamics in structured populations remains difficult. Mathematical results are known for special structures in which all individuals have the same number of neighbours. The general case, in which the number of neighbours can vary, has remained open. For arbitrary selection intensity, the problem is in a computational complexity class that suggests there is no efficient algorithm. Whether a simple solution for weak selection exists has remained unanswered. Here we provide a solution for weak selection that applies to any graph or network. Our method relies on calculating the coalescence times of random walks. We evaluate large numbers of diverse population structures for their propensity to favour cooperation. We study how small changes in population structure—graph surgery—affect evolutionary outcomes. We find that cooperation flourishes most in societies that are based on strong pairwise ties.
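
    The coalescence times this method relies on satisfy a linear recurrence and can be obtained by a direct solve on small graphs: tau_ii = 0 and tau_ij = 1 + (1/2)·sum_k(p_ik·tau_kj + p_jk·tau_ik) for i != j, the standard system for coalescing random walks. A sketch (the example graph and its normalization are arbitrary):

      import numpy as np

      A_adj = np.array([[0, 1, 1, 0],          # small example graph
                        [1, 0, 1, 1],
                        [1, 1, 0, 1],
                        [0, 1, 1, 0]], float)
      p = A_adj / A_adj.sum(axis=1, keepdims=True)   # random-walk step probabilities
      n = len(p)
      pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
      idx = {pair: t for t, pair in enumerate(pairs)}

      def key(i, j):
          return idx[(i, j)] if i < j else idx[(j, i)]

      M, b = np.eye(len(pairs)), np.ones(len(pairs))
      for (i, j), t in idx.items():
          for k in range(n):
              if k != j:                        # tau_jj = 0 drops out
                  M[t, key(k, j)] -= 0.5 * p[i, k]
              if k != i:                        # tau_ii = 0 drops out
                  M[t, key(i, k)] -= 0.5 * p[j, k]
      tau = np.linalg.solve(M, b)
      print(dict(zip(pairs, np.round(tau, 3))))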

  16. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Sanshui; He Sailing

    2002-12-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods with the high-frequency metallic model in the air are studied, and a complete band gap for some nonzero off-plane wave number k_z is found.
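
    As a reminder of what the FDTD kernel itself looks like, here is a minimal one-dimensional Yee update loop in normalized units. It illustrates only the leapfrog time stepping; the two-dimensional mesh, the metal dispersion model, and the off-plane k_z treatment of the paper are not represented:

      import numpy as np

      nx, nt, c = 200, 500, 0.5        # grid size, time steps, Courant number
      ez = np.zeros(nx)                # E field on integer grid points
      hy = np.zeros(nx - 1)            # H field on half-integer grid points
      for n in range(nt):
          hy += c * np.diff(ez)                        # update H from curl of E
          ez[1:-1] += c * np.diff(hy)                  # update E from curl of H
          ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
      print(float(ez.max()))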

  17. Automated Antenna Design with Evolutionary Algorithms

    Science.gov (United States)

    Hornby, Gregory S.; Globus, Al; Linden, Derek S.; Lohn, Jason D.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search through millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to

  18. Proceedings of the International Workshop on Computational Electronics Held at Leeds University (United Kingdom) on August 11-13 1993

    Science.gov (United States)

    1993-08-01

    Technology Modeling Associates, or MASTERPIECE from Silvaco Data Systems. Depending on the intentions of the creator of the frameworks, different aspects...in photoconductors using an Elliot 803 computer with 32K core memory. A little later the offer of time on the national ATLAS

  19. Differences in Electronic Exchanges in Synchronous and Asynchronous Computer-Mediated Communication: The Effect of Culture as a Mediating Variable

    Science.gov (United States)

    Angeli, Charoula; Schwartz, Neil H.

    2016-01-01

    Two hundred and eighty undergraduates from universities in two countries were asked to read didactic material, and then think and write about potential solutions to an ill-defined problem. The writing was conducted within a synchronous or asynchronous computer-mediated communication (CMC) environment. Asynchronous CMC took the form of email…

  20. Evolutionary genomics of Entamoeba

    Science.gov (United States)

    Weedall, Gareth D.; Hall, Neil

    2011-01-01

    Entamoeba histolytica is a human pathogen that causes amoebic dysentery and leads to significant morbidity and mortality worldwide. Understanding the genome and evolution of the parasite will help explain how, when and why it causes disease. Here we review current knowledge about the evolutionary genomics of Entamoeba: how differences between the genomes of different species may help explain different phenotypes, and how variation among E. histolytica parasites reveals patterns of population structure. The imminent expansion of the amount of genome data will greatly improve our knowledge of the genus and of pathogenic species within it. PMID:21288488

  1. Electronic structure and rovibrational properties of ZnOH in the X̃ ²A′ electronic state: A computational molecular spectroscopy study

    Energy Technology Data Exchange (ETDEWEB)

    Hirano, Tsuneo, E-mail: hirano@nccsk.com [Department of Chemistry, Faculty of Science, Ochanomizu University, 2-1-1 Otsuka, Bunkyo-ku, Tokyo 112-8610 (Japan); Andaloussi, Mounir Ben Dahman; Jensen, Per, E-mail: jensen@uni-wuppertal.de [Physikalische und Theoretische Chemie, Bergische Universität, D-42097 Wuppertal (Germany); Nagashima, Umpei [Nanosystem Research Institute, National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568 (Japan)

    2014-09-07

    The three-dimensional ground-state potential energy surface of ZnOH has been calculated ab initio at the MR-SDCI+Q-DK3/[QZP ANO-RCC (Zn, O, H)] level of theory and used as basis for a study of the rovibrational properties carried out by means of the program MORBID (Morse Oscillator Rigid Bender Internal Dynamics). The electronic ground state is ²A′ (correlating with ²Σ⁺ at the linear configuration). The equilibrium structure has r_e(Zn–O) = 1.8028 Å, r_e(O–H) = 0.9606 Å, and ∠_e(Zn–O–H) = 114.9°. The Zn–O bond is essentially ionic, with appreciable covalency. The bonding character is compared with those of FeOH (quasi-linear) and CsOH (linear). The rovibrationally averaged structural parameters, determined as expectation values over MORBID wavefunctions, are 〈r(Zn–O)〉_0 = 1.8078 Å, 〈r(O–H)〉_0 = 0.9778 Å, and 〈∠(Zn–O–H)〉_0 = 117°. The Yamada-Winnewisser quasi-linearity parameter is found to be γ_0 = 0.84, which is close to 1.0 as expected for a bent molecule. Since no experimental rovibrational spectrum has been reported thus far, this spectrum has been simulated from the ab initio potential energy and dipole moment surfaces. The amphoteric character of ZnOH is also discussed.

  2. Evolutionary engineering of Saccharomyces cerevisiae for improved industrially important properties.

    Science.gov (United States)

    Cakar, Z Petek; Turanli-Yildiz, Burcu; Alkim, Ceren; Yilmaz, Ulkü

    2012-03-01

    This article reviews evolutionary engineering of Saccharomyces cerevisiae. Following a brief introduction to the 'rational' metabolic engineering approach and its limitations such as extensive genetic and metabolic information requirement on the organism of interest, complexity of cellular physiological responses, and difficulties of cloning in industrial strains, evolutionary engineering is discussed as an alternative, inverse metabolic engineering strategy. Major evolutionary engineering applications with S. cerevisiae are then discussed in two general categories: (1) evolutionary engineering of substrate utilization and product formation and (2) evolutionary engineering of stress resistance. Recent developments in functional genomics methods allow rapid identification of the molecular basis of the desired phenotypes obtained by evolutionary engineering. To conclude, when used alone or in combination with rational metabolic engineering and/or computational methods to study and analyze processes of adaptive evolution, evolutionary engineering is a powerful strategy for improvement in industrially important, complex properties of S. cerevisiae. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  3. Quantum Computational Studies of Electron Transfer in Respiratory Complex III and its Application for Designing New Mitocan Drugs

    Science.gov (United States)

    Hagras, Muhammad Ahmed

    Electron transfer occurs in many biological systems which are imperative to sustain life; oxidative phosphorylation in prokaryotes and eukaryotes, and photophosphorylation in photosynthetic and plant cells are well-balanced and complementary processes. Investigating electron transfer in those natural systems provides detailed knowledge of the atomistic events that lead eventually to production of ATP, or harvesting light energy. Ubiquinol:cytochrome c oxidoreductase complex (also known as bc1 complex, or respiratory complex III) is a middle player in the electron transport proton pumping orchestra, located in the inner-mitochondrial membrane in eukaryotes or plasma membrane in prokaryotes, which converts the free energy of redox reactions to electrochemical proton gradient across the membrane, following the fundamental chemiosmotic principle discovered by Peter Mitchell 1. In humans, the malfunctioned bc1 complex plays a major role in many neurodegenerative diseases, stress-induced aging, and cancer development, because it produces most of the reactive oxygen species, which are also involved in cellular signaling 2. The mitochondrial bc1 complex has an intertwined dimeric structure comprised of 11 subunits in each monomer, but only three of them have catalytic function, and those are the only domains found in bacterial bc1 complex. The core subunits include: Rieske domain, which incorporates iron-sulfur cluster [2Fe-2S]; trans-membrane cytochrome b domain, incorporating low-potential heme group (heme bL) and high-potential heme group (heme bH); and cytochrome c1 domain, containing heme c1 group and two separate binding sites, Qo (or QP) site where the hydrophobic electron carrier ubihydroquinol QH2 is oxidized, and Qi (or QN) site where ubiquinone molecule Q is reduced 3. Electrons and protons in the bc1 complex flow according to the proton-motive Q-cycle proposed by Mitchell, which includes a unique electron flow bifurcation at the Qo site. At this site, one

  4. Electronic discourse

    OpenAIRE

    Locher, Miriam A.

    2014-01-01

    This chapter deals with electronic discourse by discussing the pragmatics of language use in computer-mediated settings. In many so-called first world countries, accessing the Internet by means of a computer or a smartphone, etc. has become an everyday activity for many people. In only little more than twenty years of publicly accessible Internet access, the use of computer-mediated forms of communication has developed from primarily information websites and email exchanges to highly interact...

  5. Comparative genomics and evolutionary biology.

    Science.gov (United States)

    Kondrashov, A S

    1999-12-01

    Data of large-scale DNA sequencing are relevant to some of the most fundamental issues in evolutionary biology: suboptimality, homology, hierarchy, ancestry, novelties, the role of natural selection, and the relative importance of directional versus stabilizing selection. Already, these data provided the best available evidence for some evolutionary phenomena, and in several cases led to refinement of old concepts. Still, the Darwinian evolutionary paradigm will successfully accommodate comparative genomics.

  6. Computational study of the effective three-ion interaction potentials in liquid metals with high density of electron gas

    OpenAIRE

    Vasiliu, E. V.

    2002-01-01

    Based on the many-body theory of metals in the third order of the perturbation expansion in the electron-ion interaction pseudopotential, the potentials of pair and three-ion interactions are calculated in liquid lead, aluminium and beryllium at their melting temperatures. The reducible and irreducible three-ion interactions have an attractive nature at distances approximately equal to the average distance between ions in metals. It results in the shortening of the average interatomic distance in an e...

  7. A statistical approach to computer processing of cryo-electron microscope images: virion classification and 3-D reconstruction.

    Science.gov (United States)

    Yin, Zhye; Zheng, Yili; Doerschuk, Peter C; Natarajan, Padmaja; Johnson, John E

    2003-01-01

    The scattering density of the virus is represented as a truncated weighted sum of orthonormal basis functions in spherical coordinates, where the angular dependence of each basis function has icosahedral symmetry. A statistical model of the image formation process is proposed and the maximum likelihood estimation method computed by an expectation-maximization algorithm is used to estimate the weights in the sum and thereby compute a 3-D reconstruction of the virus particle. If multiple types of virus particle are represented in the boxed images then multiple 3-D reconstructions are computed simultaneously without first requiring that the type of particle shown in each boxed image be determined. Examples of the procedure are described for viruses with known structure: (1) 3-D reconstruction of Flockhouse Virus from experimental images, (2) 3-D reconstruction of the capsid of Nudaurelia Omega Capensis Virus from synthetic images, and (3) 3-D reconstruction of both the capsid and the procapsid of Nudaurelia Omega Capensis Virus from a mixture of unclassified synthetic images.
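
    The estimation principle, classifying and reconstructing jointly via expectation-maximization rather than classifying each image first, can be shown on a much simpler problem. Below, EM fits a two-component one-dimensional Gaussian mixture using soft class memberships; nothing here is specific to virus reconstruction beyond that principle, and the data, means, and unit variances are invented:

      import math
      import random

      # toy data drawn from two hidden classes
      data = ([random.gauss(0, 1) for _ in range(300)]
              + [random.gauss(4, 1) for _ in range(300)])
      mu, pi = [-1.0, 1.0], [0.5, 0.5]          # initial guesses (variance fixed at 1)

      def norm(x, m):
          return math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)

      for _ in range(50):
          # E-step: soft class memberships, no hard classification needed
          r = [[pi[k] * norm(x, mu[k]) for k in range(2)] for x in data]
          r = [[a / (row[0] + row[1]) for a in row] for row in r]
          # M-step: re-estimate class weights and means from soft assignments
          for k in range(2):
              w = sum(row[k] for row in r)
              pi[k] = w / len(data)
              mu[k] = sum(row[k] * x for row, x in zip(r, data)) / w
      print(mu, pi)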

  8. Evolutionary multi-agent systems from inspirations to applications

    CERN Document Server

    Byrski, Aleksander

    2017-01-01

    This book addresses agent-based computing, concentrating in particular on evolutionary multi-agent systems (EMAS), which have been developed since 1996 at the AGH University of Science and Technology in Cracow, Poland. It provides the relevant background information on and a detailed description of this computing paradigm, along with key experimental results. Readers will benefit from the insightful discussion, which primarily concerns the efficient implementation of computing frameworks for developing EMAS and similar computing systems, as well as a detailed formal model. Theoretical deliberations demonstrating that computing with EMAS always helps to find the optimal solution are also included, rounding out the coverage.

  9. The Effect of In-Service Training of Computer Science Teachers on Scratch Programming Language Skills Using an Electronic Learning Platform on Programming Skills and the Attitudes towards Teaching Programming

    Science.gov (United States)

    Alkaria, Ahmed; Alhassan, Riyadh

    2017-01-01

    This study was conducted to examine the effect of in-service training of computer science teachers in the Scratch language, using an electronic learning platform, on acquiring programming skills and attitudes towards teaching programming. The sample of this study consisted of 40 middle school computer science teachers. They were assigned to two…

  10. Evolutionary explanations for natural language: criteria from evolutionary biology

    NARCIS (Netherlands)

    Zuidema, W.; de Boer, B.

    2008-01-01

    Theories of the evolutionary origins of language must be informed by empirical and theoretical results from a variety of different fields. Complementing recent surveys of relevant work from linguistics, animal behaviour and genetics, this paper surveys the requirements on evolutionary scenarios that

  11. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D. [ORNL]; Disney, Adam [University of Tennessee (UT)]; Singh, Susheela [North Carolina State University (NCSU), Raleigh]; Bruer, Grant [University of Tennessee (UT)]; Mitchell, John Parker [University of Tennessee (UT)]; Klibisz, Aleksander [University of Tennessee (UT)]; Plank, James [University of Tennessee (UT)]

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods affect the performance of EO on Titan, the largest open-science supercomputer in the U.S., and on BOB, a Beowulf-style cluster of Raspberry Pis. We also examine how to improve EO by evaluating commonality in higher-performing neural networks, and present the results of a study that evaluates the EO performed on Titan.
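
    Because fitness evaluations in EO are independent of one another, a whole generation can be evaluated across cores as a single map operation, which is the essence of the parallelization described above. The sketch below shows that pattern in generic form; it is a hedged illustration using Python's standard library, not the DANNA simulator or the Titan/BOB tooling, and `fitness` is a stand-in objective:

    ```python
    import random
    from multiprocessing import Pool

    def fitness(genome):
        # Placeholder objective; a real run would score a simulated spiking
        # network here.
        return -sum((g - 0.5) ** 2 for g in genome)

    def mutate(genome, rate=0.1):
        return [g + random.gauss(0, rate) for g in genome]

    if __name__ == "__main__":
        pop = [[random.random() for _ in range(8)] for _ in range(32)]
        with Pool() as pool:
            for generation in range(50):
                scores = pool.map(fitness, pop)            # parallel evaluation
                ranked = [g for _, g in sorted(zip(scores, pop),
                                               key=lambda sg: sg[0],
                                               reverse=True)]
                elite = ranked[: len(pop) // 4]            # truncation selection
                pop = elite + [mutate(random.choice(elite))
                               for _ in range(len(pop) - len(elite))]
        print(max(map(fitness, pop)))
    ```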

  12. Anxiety: an evolutionary approach.

    Science.gov (United States)

    Bateson, Melissa; Brilot, Ben; Nettle, Daniel

    2011-12-01

    Anxiety disorders are among the most common mental illnesses, with huge attendant suffering. Current treatments are not universally effective, suggesting that a deeper understanding of the causes of anxiety is needed. To understand anxiety disorders better, it is first necessary to understand the normal anxiety response. This entails considering its evolutionary function as well as the mechanisms underlying it. We argue that the function of the human anxiety response, and homologues in other species, is to prepare the individual to detect and deal with threats. We use a signal detection framework to show that the threshold for expressing the anxiety response ought to vary with the probability of threats occurring, and the individual's vulnerability to them if they do occur. These predictions are consistent with major patterns in the epidemiology of anxiety. Implications for research and treatment are discussed.
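
    The threshold argument in this abstract has a standard signal-detection form. Writing p for the base rate of threats, C_miss for the cost of missing one (the individual's vulnerability), and C_FA for the cost of a false alarm, a Bayes-optimal detector responds to a cue x exactly when the likelihood ratio clears a criterion that falls as threats become more probable or more costly; the symbols below are ours, not the authors':

    ```latex
    % Emit the anxiety response to cue x whenever:
    \frac{P(x \mid \text{threat})}{P(x \mid \text{safe})}
      \;>\; \frac{(1-p)\,C_{\mathrm{FA}}}{p\,C_{\mathrm{miss}}}
    ```

    A higher threat probability p or a higher vulnerability C_miss lowers the right-hand side, predicting a more easily triggered anxiety response, in line with the epidemiological patterns the authors cite.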

  13. Conceptual foundations of evolutionary thought

    Indian Academy of Sciences (India)

    K. P. MOHANAN

    2017-07-04

    This article seeks to explore the conceptual foundations of evolutionary thought in the physical, biological, and human sciences. Viewing evolution as symmetry breaking, it explores the concepts of change, history, and evolutionary history, and outlines a concept of biological macroevolution.

  14. Evolutionary dynamics in structured populations

    Science.gov (United States)

    Nowak, Martin A.; Tarnita, Corina E.; Antal, Tibor

    2010-01-01

    Evolutionary dynamics shape the living world around us. At the centre of every evolutionary process is a population of reproducing individuals. The structure of that population affects evolutionary dynamics. The individuals can be molecules, cells, viruses, multicellular organisms or humans. Whenever the fitness of individuals depends on the relative abundance of phenotypes in the population, we are in the realm of evolutionary game theory. Evolutionary game theory is a general approach that can describe the competition of species in an ecosystem, the interaction between hosts and parasites, between viruses and cells, and also the spread of ideas and behaviours in the human population. In this perspective, we review the recent advances in evolutionary game dynamics with a particular emphasis on stochastic approaches in finite-sized and structured populations. We give simple, fundamental laws that determine how natural selection chooses between competing strategies. We study the well-mixed population, evolutionary graph theory, games in phenotype space and evolutionary set theory. We apply these results to the evolution of cooperation. The mechanism that leads to the evolution of cooperation in these settings could be called ‘spatial selection’: cooperators prevail against defectors by clustering in physical or other spaces. PMID:20008382
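
    One of the "simple, fundamental laws" referred to here can be written down in a few lines. For a well-mixed population of size N evolving under the Moran process, the probability that a single mutant of relative fitness r replaces the resident strategy has a classical closed form (a standard textbook result, not code from the paper):

    ```python
    def fixation_probability(r: float, N: int) -> float:
        """Fixation probability of one mutant with relative fitness r in a
        well-mixed Moran population of size N."""
        if r == 1.0:
            return 1.0 / N                      # neutral drift
        return (1.0 - 1.0 / r) / (1.0 - r ** -N)

    # A 5% fitness advantage in a population of 100 fixes only ~4.8% of the
    # time: selection biases, but does not guarantee, the outcome.
    print(fixation_probability(1.05, 100))
    ```

    Natural selection favours the mutant whenever this probability exceeds the neutral baseline 1/N, which is one of the comparisons such reviews build on.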

  15. Child Development and Evolutionary Psychology.

    Science.gov (United States)

    Bjorklund, David F.; Pellegrini, Anthony D.

    2000-01-01

    Argues that an evolutionary account provides insight into developmental function and individual differences. Outlines some assumptions of evolutionary psychology related to development. Introduces the developmental systems approach, differential influence of natural selection at different points in ontogeny, and development of evolved…

  16. Information theory, evolutionary innovations and evolvability.

    Science.gov (United States)

    Wagner, Andreas

    2017-12-05

    How difficult is it to 'discover' an evolutionary adaptation or innovation? I here suggest that information theory, in combination with high-throughput DNA sequencing, can help answer this question by quantifying a new phenotype's information content. I apply this framework to compute the phenotypic information associated with novel gene regulation and with the ability to use novel carbon sources. The framework can also help quantify how DNA duplications affect evolvability, estimate the complexity of phenotypes and clarify the meaning of 'progress' in Darwinian evolution. This article is part of the themed issue 'Process and pattern in innovations from cells to societies'. © 2017 The Author(s).
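
    On our reading of this framework, the information content of a phenotype follows the usual surprisal definition: if a fraction P of assayed genotypes displays the phenotype, discovering it "costs" -log2(P) bits. The numbers below are illustrative, not taken from the paper:

    ```python
    from math import log2

    def phenotypic_information(n_with_phenotype: int, n_assayed: int) -> float:
        """Bits of information associated with a phenotype observed in a
        high-throughput assay (surprisal of its observed frequency)."""
        return -log2(n_with_phenotype / n_assayed)

    # E.g., if 12 of 1,000,000 sequenced variants can grow on a novel carbon
    # source, that phenotype carries about 16.3 bits.
    print(phenotypic_information(12, 1_000_000))
    ```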

  17. Incorporating evolutionary processes into population viability models.

    Science.gov (United States)

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can enable or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. © 2014 Society for Conservation Biology.
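
    The mechanistic core described here, individual-based simulation with explicit genotype tracking and a genotype-phenotype map, can be caricatured in very few lines. The sketch below is our toy (one diploid locus, survival tied to allele count), not the authors' model, but it shows how demography and allele frequencies become coupled:

    ```python
    import random

    def survival_prob(genotype, harshness=0.3):
        # Hypothetical genotype-phenotype map: each copy of allele 1 buffers
        # the individual against environmental stress.
        return 0.9 - harshness * 0.2 * (2 - sum(genotype))

    def year(pop, carrying_capacity=500):
        survivors = [ind for ind in pop if random.random() < survival_prob(ind)]
        offspring = []
        while len(survivors) >= 2 and len(survivors) + len(offspring) < carrying_capacity:
            mum, dad = random.sample(survivors, 2)
            # Each offspring inherits one allele from each parent.
            offspring.append((random.choice(mum), random.choice(dad)))
        return survivors + offspring

    pop = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(500)]
    for _ in range(100):
        pop = year(pop)
    freq = sum(map(sum, pop)) / (2 * len(pop)) if pop else 0.0
    print(len(pop), freq)   # census size and frequency of the buffering allele
    ```

    An eco-evo PVA in the authors' sense would extend this with many loci, genomically estimated parameters, and explicit extinction-risk bookkeeping.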

  18. Evolutionary Explanations of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Igor Kardum

    2008-12-01

    This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. The second part describes the selective pressures and evolved adaptations behind the contemporary epidemic of obesity, as well as the differences in dietary regimes and lifestyle between modern humans and their ancestors. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented, together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants, derived from kin selection theory and the model of parental manipulation, are elaborated. The sexual competition hypothesis of eating disorders and the adapted-to-flee-famine hypothesis, as well as explanations based on the concepts of social attention holding power and the need to belong, are also discussed. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.

  19. A Fast Evolutionary Metaheuristic for the Vehicle Routing Problem with Time Windows

    NARCIS (Netherlands)

    Bräysy, Olli; Dullaert, W.

    2003-01-01

    This paper presents a new evolutionary metaheuristic for the vehicle routing problem with time windows. Ideas on multi-start local search, ejection chains, simulated annealing and evolutionary computation are combined in a heuristic that is both robust and efficient. The proposed method produces
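
    The ingredients named in this abstract can be illustrated, in miniature, by pairing a local-search move with a simulated-annealing acceptance test. The sketch below does this for a plain travelling-salesman tour; it is our toy illustration, not the paper's metaheuristic, which additionally handles vehicles, capacities, and time-window feasibility:

    ```python
    import math
    import random

    random.seed(1)
    cities = [(random.random(), random.random()) for _ in range(30)]

    def tour_length(tour):
        return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    tour = list(range(len(cities)))
    temp = 1.0
    for _ in range(20000):
        i, j = sorted(random.sample(range(len(tour)), 2))
        candidate = tour[:i] + tour[i:j][::-1] + tour[j:]   # 2-opt reversal
        delta = tour_length(candidate) - tour_length(tour)
        # Always accept improvements; accept uphill moves with a probability
        # that shrinks as the temperature cools (simulated annealing).
        if delta < 0 or random.random() < math.exp(-delta / temp):
            tour = candidate
        temp *= 0.9995                                       # geometric cooling
    print(tour_length(tour))
    ```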

  20. Electronic structure computation and differential capacitance profile in δ-doped FET as a function of hydrostatic pressure

    Energy Technology Data Exchange (ETDEWEB)

    Carlos-Pinedo, C.; Rodríguez-Vargas, I.; Martínez-Orozco, J. C. [Unidad Académica de Física, Universidad Autónoma de Zacatecas, Calzada Solidaridad Esquina con Paseo la Bufa S/N, C.P. 98060, Zacatecas, Zac. (Mexico)]

    2014-05-15

    In this work we present results obtained from the calculation of the level structure of an n-type delta-doped well Field Effect Transistor subjected to hydrostatic pressure. We study the energy level structure as a function of hydrostatic pressure within the range of 0 to 6 kbar for different Schottky barrier heights (SBH). We use an analytical expression for the effect of hydrostatic pressure on the SBH, together with the pressure dependence of the basic parameters of the system, such as the effective mass m(P) and the dielectric constant ε(P) of GaAs. We find that, in addition to altering the electronic level structure, hydrostatic pressure also affects the profile of the differential capacitance per unit area, C⁻².
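
    For orientation, the pressure dependences mentioned in the abstract are often parametrized with simple closed forms for GaAs. The sketch below uses commonly quoted parametrizations; treat every numerical coefficient as an assumption to be checked against the paper, not as its reported data:

    ```python
    import math

    def effective_mass(P_kbar: float) -> float:
        """Gamma-valley effective mass of GaAs in units of m0; assumed
        linear-in-pressure form."""
        return 0.0665 + 0.0008 * P_kbar             # assumed slope per kbar

    def dielectric_constant(P_kbar: float) -> float:
        """Static dielectric constant of GaAs; assumed weak exponential
        decrease with pressure."""
        return 12.74 * math.exp(-1.73e-3 * P_kbar)  # assumed coefficient

    for P in range(0, 7):                           # the abstract's 0-6 kbar range
        print(P, round(effective_mass(P), 4), round(dielectric_constant(P), 3))
    ```

    Both parameters feed directly into the delta-doped confinement potential, which is why the level structure and the C⁻² profile shift together under pressure.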