WorldWideScience

Sample records for evolutionary computing electronic

  1. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C.; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández de Vega, F.; Di Caro, G. A.; Drechsler, R.; Ekárt, A.; Esparcia-Alcázar, Anna I.; Farooq, M.; Langdon, W. B.; Merelo-Guervós, J. J.; Preuss, M.; Richter, O.-M. H.; Silva, Sara; Simões, A.; Tarantino, Ernesto; Tettamanzi, Andrea G. B.; Togelius, J.; Urquhart, Neil; Uyar, A. S.; Yannakakis, G. N.; Smith, Stephen L.; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M.; De Falco, Ivanoe; Della Cioppa, Antonio; Diwold, Konrad; Sim, Kevin; Haasdijk, Evert; Zhang, Mengjie; Eiben, A. E.; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical …

  2. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    … of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, and parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and, to some extent, ready to use. Some more general … kinds of evolutionary algorithms have been prudently analyzed. This analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been … topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most …

  3. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages of using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.

  4. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces, …

  5. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation, and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network based EDAs are reviewed in the book. Hot current research…

  6. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired and preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamental aspects from this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in an evolutionary computation algorithm.

  7. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  8. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware, and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world, in both industry and academia, for sharing state-of-the-art results, for exploring new areas of research and development, and for discussing emerging issues facing intelligent computation and evolutionary computation.

  9. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolution strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest-performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
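
    The population cycle described here (assess fitness, cull the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) maps directly onto code. Below is a minimal Python sketch of such a loop on a toy objective; the real-valued encoding, Gaussian mutation, and fixed generation budget are illustrative assumptions, not details from the paper.

```python
import random

def fitness(ind):
    # Toy objective (not from the paper): maximize the negative
    # sphere function, whose optimum is the all-zeros vector.
    return -sum(x * x for x in ind)

def evolve(pop_size=40, dim=5, generations=100, mutation_sd=0.1):
    # Random initial population of real-valued candidate solutions.
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):  # stopping condition: generation budget
        # Assess fitness and remove the poorest-performing half.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Survivors reproduce; offspring are mutated copies (Gaussian noise).
        offspring = [[x + random.gauss(0, mutation_sd) for x in parent]
                     for parent in survivors]
        pop = survivors + offspring
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```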

  10. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up …

  11. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    … failure probability. Multiobjective evolutionary computation algorithms (MOEAs) are well suited for multiobjective task scheduling in heterogeneous environments. The two multiobjective evolutionary algorithms, Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with …

  12. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied …

  13. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  14. Evolutionary computation techniques a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted in order to compare its performance to that of other popular EC methods.

  15. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seungwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Giovanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
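
    To make the retrieval idea concrete, here is a small Python sketch in the spirit of the simulated-annealing half of the ECM: it fits the parameters of a hypothetical synthetic-spectrum model (a single Gaussian absorption line) by minimizing a sum-of-squares dissimilarity against an observed spectrum. The forward model, parameter set, and cooling schedule are invented stand-ins for the real radiative-transfer models used in such retrievals.

```python
import math
import random

def synthetic_spectrum(params, wavelengths):
    # Hypothetical forward model: continuum of 1.0 with one Gaussian
    # absorption line; depth, center, and width are the retrieved parameters.
    depth, center, width = params
    return [1.0 - depth * math.exp(-((w - center) / width) ** 2)
            for w in wavelengths]

def dissimilarity(params, wavelengths, observed):
    # Fitness function: degree of dissimilarity between observed and
    # synthetic spectra, here a simple sum of squared residuals.
    model = synthetic_spectrum(params, wavelengths)
    return sum((m - o) ** 2 for m, o in zip(model, observed))

def anneal(wavelengths, observed, steps=5000, t0=1.0):
    params = [0.5, 1.6, 0.10]                   # initial guess
    cost = dissimilarity(params, wavelengths, observed)
    best, best_cost = params[:], cost
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-6      # linear cooling schedule
        cand = [p + random.gauss(0, 0.02) for p in params]
        cand[2] = max(cand[2], 1e-3)            # keep the line width physical
        c = dissimilarity(cand, wavelengths, observed)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if c < cost or random.random() < math.exp((cost - c) / t):
            params, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
    return best, best_cost

wl = [1.0 + 0.01 * i for i in range(200)]
obs = synthetic_spectrum([0.3, 1.8, 0.05], wl)  # stand-in "observation"
print(anneal(wl, obs))
```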

  16. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  17. Optimizing a reconfigurable material via evolutionary computation

    Science.gov (United States)

    Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.

    2015-08-01

    Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer-controlled laboratory experiment in which a 6 × 6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
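
    The search loop in this experiment can be sketched as a standard genetic algorithm over discrete field patterns, with the physical measurement playing the role of the fitness function. In the Python sketch below, the measurement is replaced by a made-up surrogate (the real experiment drives the magnets and reads a force sensor), and the on/off encoding, population size, and 1500-trial budget are illustrative assumptions.

```python
import random

GRID = 36  # 6 x 6 electromagnets; simplified here to on/off states

def measure_transmitted_force(pattern):
    # Stand-in for the physical measurement: the real experiment energizes
    # the electromagnet grid and reads a force sensor. This surrogate,
    # which favors switching off magnets near the center, is invented.
    return sum(b * (1 + (i % 6 in (2, 3)) + (i // 6 in (2, 3)))
               for i, b in enumerate(pattern))

def genetic_search(pop_size=30, trial_budget=1500, p_mut=1 / 36):
    pop = [[random.randint(0, 1) for _ in range(GRID)] for _ in range(pop_size)]
    trials = 0
    while trials + pop_size <= trial_budget:  # respect the trial budget
        pop.sort(key=measure_transmitted_force)
        trials += pop_size
        parents = pop[: pop_size // 2]        # keep the best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GRID)   # one-point crossover
            child = [1 - g if random.random() < p_mut else g
                     for g in a[:cut] + b[cut:]]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=measure_transmitted_force)

print(genetic_search())
```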

  18. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overwhelming. All of the above-mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high-performance "green computing" systems. The recent evolutionary and general metaheuristic-based solutions …

  19. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting-edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: covers all the aspects …

  20. Parallel evolutionary computation in bioinformatics applications.

    Science.gov (United States)

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of bioinformatics require methods able to handle their inherent complexity (e.g. NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java-based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of parallelism-related modules allows the user to easily configure the environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  1. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an…

  2. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem which is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may easily be trapped at a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolution strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, immune algorithms, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are also discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptation of fuzzy systems is detailed.

  3. Conversion Rate Optimization through Evolutionary Computation

    OpenAIRE

    Miikkulainen, Risto; Iscoe, Neil; Shagrin, Aaron; Cordell, Ron; Nazari, Sam; Schoolland, Cory; Brundage, Myles; Epstein, Jonathan; Dean, Randy; Lamba, Gurmeet

    2017-01-01

    Conversion optimization means designing a web interface so that as many users as possible take a desired action on it, such as register or purchase. Such design is usually done by hand, testing one change at a time through A/B testing, or a limited number of combinations through multivariate testing, making it possible to evaluate only a small fraction of designs in a vast design space. This paper describes Sentient Ascend, an automatic conversion optimization system that uses evolutionary op...

  4. Evolutionary Cell Computing: From Protocells to Self-Organized Computing

    Science.gov (United States)

    Colombano, Silvano; New, Michael H.; Pohorille, Andrew; Scargle, Jeffrey; Stassinopoulos, Dimitris; Pearson, Mark; Warren, James

    2000-01-01

    On the path from inanimate to animate matter, a key step was the self-organization of molecules into protocells - the earliest ancestors of contemporary cells. Studies of the properties of protocells and the mechanisms by which they maintained themselves and reproduced are an important part of astrobiology. These studies also have the potential to greatly impact research in nanotechnology and computer science. Previous studies of protocells have focussed on self-replication. In these systems, Darwinian evolution occurs through a series of small alterations to functional molecules whose identities are stored. Protocells, however, may have been incapable of such storage. We hypothesize that under such conditions, the replication of functions and their interrelationships, rather than the precise identities of the functional molecules, is sufficient for survival and evolution. This process is called non-genomic evolution. Recent breakthroughs in experimental protein chemistry have opened the gates for experimental tests of non-genomic evolution. On the basis of these achievements, we have developed a stochastic model for examining the evolutionary potential of non-genomic systems. In this model, the formation and destruction (hydrolysis) of bonds joining amino acids in proteins occur through catalyzed, albeit possibly inefficient, pathways. Each protein can act as a substrate for polymerization or hydrolysis, or as a catalyst of these chemical reactions. When a protein is hydrolyzed to form two new proteins, or two proteins are joined into a single protein, the catalytic abilities of the product proteins are related to the catalytic abilities of the reactants. We will demonstrate that the catalytic capabilities of such a system can increase. Its evolutionary potential is dependent upon the competition between the formation of bond-forming and bond-cutting catalysts. The degree to which hydrolysis preferentially affects bonds in less efficient, and therefore less well…

  5. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes is discussed. The limitations of using neural networks for control purposes are pointed out, and a different technique, evolutionary computation, is discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods are presented. A framework for an integrated system, using both neural networks and evolutionary computation, is proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real time.

  6. Evolutionary computing in Nuclear Engineering Institute/CNEN-Brazil

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.; Lapa, Nelbia da Silva; Mol, Antonio C.

    2000-01-01

    This paper discusses the importance of evolutionary computation (EC) for nuclear engineering and the development of this area at the Instituto de Engenharia Nuclear (IEN) in recent years. The applications realized at this institute by the EC technical group are briefly described, for example: nuclear reactor core design optimization, preventive maintenance scheduling optimization, and nuclear reactor transient identification. A novel computational tool for the implementation of genetic algorithms, developed at this institute and applied in those works, is also shown. Some results are presented and the gains obtained with evolutionary computation are discussed. (author)

  7. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  8. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as …

  9. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. Like most other research on heuristic scheduling, the primary aim of the research has been on deterministic formulations of the problems. This is in contrast to real-world scheduling problems, which are usually not deterministic. Usually, at the time the schedule is made, some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed, and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly …

  10. Evolutionary Computation Techniques for Predicting Atmospheric Corrosion

    Directory of Open Access Journals (Sweden)

    Amine Marref

    2013-01-01

    Corrosion occurs in many engineering structures such as bridges, pipelines, and refineries and leads to the gradual destruction of materials, thus shortening their lifespan. It is therefore crucial to assess the structural integrity of engineering structures which are approaching or exceeding their designed lifespan in order to ensure their correct functioning, for example, carrying ability and safety. An understanding of corrosion and an ability to predict the corrosion rate of a material in a particular environment play a vital role in evaluating the residual life of the material. In this paper we investigate the use of genetic programming and genetic algorithms in the derivation of corrosion-rate expressions for steel and zinc. Genetic programming is used to automatically evolve corrosion-rate expressions, while a genetic algorithm is used to evolve the parameters of an already engineered corrosion-rate expression. We show that both evolutionary techniques yield corrosion-rate expressions that have good accuracy.
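
    The second technique, a genetic algorithm tuning the parameters of an already engineered expression, can be illustrated with the power-law form C = A·t^n that is widely used for atmospheric corrosion loss versus exposure time. Everything below (the data points and GA settings) is hypothetical and only shows the mechanics of evolving the parameters A and n against measured data; it is not the paper's setup.

```python
import random

# Hypothetical data: corrosion depth (um) vs. exposure time (years).
DATA = [(1, 12.0), (2, 16.5), (4, 23.1), (8, 31.9), (16, 44.3)]

def corrosion_rate(params, t):
    # Power-law expression C = A * t**n; the GA evolves A and n.
    A, n = params
    return A * t ** n

def error(params):
    # Fitness: sum of squared residuals against the measured data.
    return sum((corrosion_rate(params, t) - c) ** 2 for t, c in DATA)

def ga_fit(pop_size=50, generations=200):
    pop = [[random.uniform(0, 20), random.uniform(0, 1)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        elite = pop[: pop_size // 5]              # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # blend crossover
            child = [max(1e-6, g + random.gauss(0, 0.05)) for g in child]  # mutate
            children.append(child)
        pop = elite + children
    return min(pop, key=error)

best = ga_fit()
print("A=%.3f n=%.3f sse=%.3f" % (best[0], best[1], error(best)))
```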

  11. Coevolution of Artificial Agents Using Evolutionary Computation in Bargaining Game

    Directory of Open Access Journals (Sweden)

    Sangwook Lee

    2015-01-01

    The analysis of bargaining games using evolutionary computation is an essential issue in the field of game theory. This paper investigates the interaction and coevolutionary process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. In particular, the game performance with regard to payoff through the interaction and coevolution of agents is studied. We present three kinds of EC-based agents (EC-agents) participating in the bargaining game: genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE). The agents’ performance with regard to changing conditions is compared. From the simulation results it is found that the PSO agent is superior to the other agents.

  12. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as …

  13. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm in random generation from multivariate probability distributions, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
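
    Of the statistical applications listed, variable selection in regression maps especially naturally onto a genetic algorithm: a chromosome is a bit mask over candidate predictors and the fitness is an information criterion. The Python sketch below is a hypothetical illustration along those lines (synthetic data, an AIC-like cost, one-point crossover); it is not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(size=n)   # only columns 0 and 3 matter

def cost(mask):
    # AIC-like criterion: n*log(RSS/n) plus a penalty per selected variable.
    if mask.any():
        beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
        rss = np.sum((y - X[:, mask] @ beta) ** 2)
    else:
        rss = np.sum(y ** 2)
    return n * np.log(rss / n) + 2 * mask.sum()

def ga_select(pop_size=40, generations=60, p_mut=0.1):
    pop = rng.random((pop_size, p)) < 0.5            # random initial bit masks
    for _ in range(generations):
        order = np.argsort([cost(ind) for ind in pop])
        parents = pop[order[: pop_size // 2]]        # truncation selection
        cuts = rng.integers(1, p, size=pop_size // 2)
        kids = np.array([np.concatenate((parents[i][:c],
                                         parents[(i + 1) % len(parents)][c:]))
                         for i, c in enumerate(cuts)])  # one-point crossover
        kids ^= rng.random(kids.shape) < p_mut       # bit-flip mutation
        pop = np.vstack((parents, kids))
    return min(pop, key=cost)

print(np.flatnonzero(ga_select()))                   # ideally prints [0 3]
```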

  14. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the …

  15. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  16. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design

    International Nuclear Information System (INIS)

    Menges, Achim

    2012-01-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies. (paper)

  17. Regulatory RNA design through evolutionary computation and strand displacement.

    Science.gov (United States)

    Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso

    2015-01-01

    The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.

  18. Electronics and computer

    International Nuclear Information System (INIS)

    Asano, Yuzo

    1980-01-01

    The requirements for the data collection and handling system of TRISTAN are discussed. In April 1979, the first general meeting was held at KEK to organize the workshop on the future electronics for large-scale, high-energy experiments. Three sub-groups were formed: Group 1 for the study of fast logics, Group 2 for the pre-processing and temporary storage of data, and Group 3 for the data acquisition system. The general trends of the future system are the reduction of data size and the reduction of trigger rate. The important points for processing the fast data are fast block transfer, parallel processing, and pre-processing. The U.S. Fast System Design Group has proposed some features for the future system, called Fastbus. The Time Projection Chamber proposed for a PEP facility gives a typical example of the future detectors for colliding beam machines. It is a large drift chamber in a solenoidal magnetic field. The method of data processing is interesting. By extrapolating from past experience, the requirements for the host computer of the data acquisition system can be projected. (Kato, T.)

  19. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT), which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algorithms…

  20. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro; Genetic and Evolutionary Computing

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Sciences in Taiwan, Fujian University of Technology in China, and VSB-Technical University of Ostrava. ICGEC 2015 was held from 26-28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city engulfed by its rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning Shwedagon Pagoda is the centerpiece of Yangon city, which itself is famous for the best British colonial-era architecture. Of particular interest in many shops of Bogyoke Aung San Market, …

  1. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Sciences in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang is the capital of Jiangxi Province in southeastern China, located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains and on the east by Poyang Lake, it is famous for its scenery, rich history, and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in southern China. The conference is intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  2. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Waseda University in Japan, Kaohsiung University of Applied Sciences in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world, whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the researchers and professionals in all areas of genetic and evolutionary computing.

  3. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications, with an emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, the bat algorithm, discrete cuckoo search, the firefly algorithm, particle swarm optimization, and harmony search, as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single-row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs, and others. As a timely review, this book can serve as an ideal reference for…

  4. Introduction to electronic analogue computers

    CERN Document Server

    Wass, C A A

    1965-01-01

    Introduction to Electronic Analogue Computers, Second Revised Edition is based on the ideas and experience of a group of workers at the Royal Aircraft Establishment, Farnborough, Hants. This edition is almost entirely the work of Mr. K. C. Garner, of the College of Aeronautics, Cranfield. As various advances have been made in the technology involving electronic analogue computers, this book presents discussions on the said progress, including some acquaintance with the capabilities of electronic circuits and equipment. This text also provides a mathematical background including simple differential equations…

  5. Protein 3D structure computed from evolutionary sequence variation.

    Directory of Open Access Journals (Sweden)

    Debora S Marks

    The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7-4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of …
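
    The shape of this computation can be illustrated with a much simpler statistic than the paper's maximum entropy model: mutual information between alignment columns, which captures raw co-variation but, unlike the paper's inferred couplings, does not remove transitive correlations. The toy alignment below is invented; the sketch only shows the input (an MSA) and output (ranked candidate residue pairs).

```python
import math
from collections import Counter

# Toy multiple sequence alignment (rows = homologous sequences); invented.
MSA = [
    "MKVLA",
    "MKVLG",
    "MRVIA",
    "MRVIG",
    "MKVLA",
]

def column(i):
    return [seq[i] for seq in MSA]

def mutual_information(i, j):
    # MI between alignment columns i and j; high MI suggests the two
    # positions co-vary and may be in contact. (The paper instead fits a
    # maximum entropy model to separate direct from transitive couplings.)
    n = len(MSA)
    pi, pj = Counter(column(i)), Counter(column(j))
    pij = Counter(zip(column(i), column(j)))
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

L = len(MSA[0])
pairs = sorted(((mutual_information(i, j), i, j)
                for i in range(L) for j in range(i + 1, L)), reverse=True)
for mi, i, j in pairs[:3]:
    print(f"columns {i}-{j}: MI = {mi:.3f}")   # top pairs = contact candidates
```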

  6. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzzy…

  7. Optimization and Assessment of Wavelet Packet Decompositions with Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Schell Thomas

    2003-01-01

    In image compression, the wavelet transformation is a state-of-the-art component. Recently, wavelet packet decomposition has received considerable interest. A popular approach for wavelet packet decomposition is the near-best-basis algorithm using nonadditive cost functions. In contrast to the case of additive cost functions, the wavelet packet decomposition of the near-best-basis algorithm is only suboptimal. We apply methods from the field of evolutionary computation (EC) to test the quality of the near-best-basis results. We observe a phenomenon: the results of the near-best-basis algorithm are inferior in terms of cost-function optimization but are superior in terms of rate/distortion performance compared to EC methods.

  8. Computer electronics made simple computerbooks

    CERN Document Server

    Bourdillon, J F B

    1975-01-01

    Computer Electronics: Made Simple Computerbooks presents the basics of computer electronics and explains how a microprocessor works. Various types of PROMs, static RAMs, dynamic RAMs, floppy disks, and hard disks are considered, along with microprocessor support devices made by Intel, Motorola, and Zilog. Bit-slice logic and some AMD bit-slice products are also described. Comprised of 14 chapters, this book begins with an introduction to the fundamentals of hardware design, followed by a discussion on the basic building blocks of hardware (NAND, NOR, AND, OR, NOT, XOR); tools and equipment that…

  9. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on principal-agent theory, in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point at which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
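
    As a loose illustration of this setup, the Python sketch below simulates a master who audits with an adaptive probability and workers who reinforce whichever choice (compute honestly versus cheat) paid off. All payoffs, the learning rule, and the tolerance mechanism are invented here and are much cruder than the Markov chain model analyzed in the paper.

```python
import random

# Toy master-worker dynamics: each worker reinforces the action that paid
# off; the master adapts how often it verifies answers. All constants are
# hypothetical.
ROUNDS, N_WORKERS = 2000, 9
COST_COMPUTE, REWARD, FINE = 0.1, 1.0, 2.0
LR = 0.05
TOLERANCE = N_WORKERS // 4          # defectors the master will tolerate

p_honest = [0.5] * N_WORKERS        # each worker's P(compute honestly)
p_verify = 0.5                      # master's P(auditing a round)

for _ in range(ROUNDS):
    honest = [random.random() < p for p in p_honest]
    audited = random.random() < p_verify
    for i, h in enumerate(honest):
        if h:
            payoff = REWARD - COST_COMPUTE         # paid for honest work
        else:
            payoff = -FINE if audited else REWARD  # cheating caught vs. not
        # Reinforcement: nudge the probability of the action actually taken.
        q = p_honest[i] if h else 1 - p_honest[i]
        q = min(1.0, max(0.0, q + LR * payoff))
        p_honest[i] = q if h else 1 - q
    if audited:
        cheats = honest.count(False)
        p_verify += LR if cheats > TOLERANCE else -LR / 2
        p_verify = min(1.0, max(0.05, p_verify))

print("mean P(honest) = %.2f, P(verify) = %.2f"
      % (sum(p_honest) / N_WORKERS, p_verify))
```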

  10. Electronic Computer Originated Mail Service

    Science.gov (United States)

    Seto, Takao

    Electronic mail originated by computer is a new communication medium, a product of combining traditional mail with electrical communication. An experimental service of this type of mail started on June 10, 1985 at the Ministry of Posts and Telecommunications. Its place among various communication media, its comparison with facsimile-type electronic mail, and the status quo of electronic mail in foreign countries are described. The service is then outlined, centering on the system organization and the services offered. Additional services to be introduced in the near future are also mentioned.

  11. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab Roy

    2016-04-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore, functional relationships between an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in …

  12. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and Meacham. The quartet distance between two unrooted evolutionary trees is the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. In this paper we present an algorithm for computing the quartet distance between two unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n²).
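    For reference, the quantity being computed can be defined in a few lines. The brute-force O(n⁴) sketch below assumes each tree is given by its matrix of leaf-to-leaf path lengths and uses the four-point condition to read off each quartet's topology; the paper's O(n log n) algorithm is, of course, far more involved.

```python
# Brute-force reference for the quartet distance, assuming each tree is given
# as a matrix d of path lengths between species. For a tree metric on an
# unrooted, degree-3 tree, quartet {a,b,c,e} has topology ab|ce exactly when
# d(a,b) + d(c,e) is the strict minimum of the three pairings (four-point condition).
from itertools import combinations

def quartet_topology(d, a, b, c, e):
    sums = {'ab|ce': d[a][b] + d[c][e],
            'ac|be': d[a][c] + d[b][e],
            'ae|bc': d[a][e] + d[b][c]}
    return min(sums, key=sums.get)

def quartet_distance(d1, d2):
    """Number of 4-species subsets whose induced topology differs between trees."""
    n = len(d1)
    return sum(quartet_topology(d1, *q) != quartet_topology(d2, *q)
               for q in combinations(range(n), 4))
```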

  13. Simplified Drift Analysis for Proving Lower Bounds in Evolutionary Computation

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2011-01-01

    Drift analysis is a powerful tool used to bound the optimization time of evolutionary algorithms (EAs). Various previous works apply a drift theorem going back to Hajek in order to show exponential lower bounds on the optimization time of EAs. However, this drift theorem is tedious to read and to apply, since it requires two bounds on the moment-generating (exponential) function of the drift. A recent work identifies a specialization of this drift theorem that is much easier to apply. Nevertheless, it is not as simple and not as general as possible. The present paper picks up Hajek’s line...
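    For orientation, the simplest member of this family of results, the additive drift theorem (a standard statement, not the specialized Hajek-style theorem the paper refines), can be written as follows:

```latex
% Additive drift, upper-bound form. Let (X_t)_{t \ge 0} be a non-negative
% process with hitting time T := \min\{ t : X_t = 0 \}.
\text{If }\; \mathbb{E}\!\left[ X_t - X_{t+1} \mid X_t = x \right] \ge \delta > 0
\;\text{ for all } x > 0,
\qquad\text{then}\qquad
\mathbb{E}[T \mid X_0] \;\le\; \frac{X_0}{\delta}.
% Reversing the drift condition (expected progress at most \delta) yields the
% matching lower bound \mathbb{E}[T \mid X_0] \ge X_0 / \delta; the theorems
% discussed in the paper additionally control large jumps to obtain
% exponential lower bounds.
```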

  14. Practical Applications of Evolutionary Computation to Financial Engineering Robust Techniques for Forecasting, Trading and Hedging

    CERN Document Server

    Iba, Hitoshi

    2012-01-01

    “Practical Applications of Evolutionary Computation to Financial Engineering” presents state-of-the-art techniques in financial engineering using recent results in machine learning and evolutionary computation. This book bridges the gap between academics in computer science and traders, and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge of either field. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field, describing each problem in detail and focusing on solutions based on evolutio...

  15. Have Computer, Will Not Travel: Meeting Electronically.

    Science.gov (United States)

    Kurland, Norman D.

    1983-01-01

    Beginning with two different scenarios depicting a face-to-face conference on the one hand and, on the other, a computer or electronic conference, the author argues the advantages of electronic conferencing and describes some of its uses. (JBM)

  16. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  17. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network that requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked, because they are accessed through wireless networks and are widely geographically distributed. In this study, a credible third party is introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate positively in application tasks.
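    A minimal replicator-dynamics sketch shows the effect such a supervising third party can have on a cooperate/defect game; the payoff values and penalty below are illustrative, not taken from the paper.

```python
# Replicator dynamics for a 2-strategy game (0 = cooperate, 1 = defect).
# Without supervision defection dominates; a third-party penalty on defectors
# shifts the payoffs so cooperation becomes the stable outcome.
import numpy as np

payoff = np.array([[3.0, 0.0],
                   [4.0, 1.0]])        # defection dominates without supervision
penalty = 2.5                          # hypothetical penalty charged to defectors

payoff_supervised = payoff.copy()
payoff_supervised[1] -= penalty        # punishing defection flips the equilibrium

def evolve(A, x0=0.5, steps=200, dt=0.1):
    x = x0                             # fraction of cooperators
    for _ in range(steps):
        pop = np.array([x, 1.0 - x])
        f = A @ pop                    # fitness of each strategy
        x += dt * x * (f[0] - pop @ f) # replicator equation
        x = min(max(x, 0.0), 1.0)
    return x

print(f"cooperators without supervision: {evolve(payoff):.2f}")            # -> ~0
print(f"cooperators with supervision:    {evolve(payoff_supervised):.2f}") # -> ~1
```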

  18. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network that requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked, because they are accessed through wireless networks and are widely geographically distributed. In this study, a credible third party is introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate positively in application tasks.

  19. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then

  20. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a multiobjective evolutionary algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in the science classroom developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the multiobjective evolutionary algorithm shows greater success in solving the Piagetian science tasks post cognitive retraining with the multiobjective evolutionary algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.

  1. An Analog Computer for Electronic Engineering Education

    Science.gov (United States)

    Fitch, A. L.; Iu, H. H. C.; Lu, D. D. C.

    2011-01-01

    This paper describes a compact analog computer and proposes its use in electronic engineering teaching laboratories to develop student understanding of applications in analog electronics, electronic components, engineering mathematics, control engineering, safe laboratory and workshop practices, circuit construction, testing, and maintenance. The…

  2. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse-engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former problem and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, adopting the mechanisms of cloud computing is a promising solution; the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework for inferring large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with desired behaviors and that the computation time can be greatly reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high
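    The parallelization idea can be sketched in miniature: fitness evaluations of candidate parameterizations are "mapped" across processes and the results "reduced" to select survivors. In the sketch below, Python's multiprocessing stands in for Hadoop MapReduce, and evaluate() is a placeholder for the costly gene-network simulation; the target vector is hypothetical.

```python
# Map/reduce-style parallel fitness evaluation inside a simple evolutionary loop.
import random
from multiprocessing import Pool

TARGET = [0.2, 0.8, 0.5, 0.1]           # hypothetical desired network behavior

def evaluate(params):                    # "map" step: costly simulation per candidate
    error = sum((p - t) ** 2 for p, t in zip(params, TARGET))
    return error, params

def step(population, pool):
    scored = pool.map(evaluate, population)      # parallel fitness evaluation
    scored.sort(key=lambda s: s[0])              # "reduce" step: rank candidates
    parents = [p for _, p in scored[: len(scored) // 2]]
    children = [[g + random.gauss(0, 0.05) for g in random.choice(parents)]
                for _ in range(len(population) - len(parents))]
    return parents + children

if __name__ == "__main__":
    pop = [[random.random() for _ in range(4)] for _ in range(40)]
    with Pool() as pool:
        for _ in range(50):
            pop = step(pop, pool)
    print("best error:", evaluate(pop[0])[0])
```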

  3. Investigating the Multi-memetic Mind Evolutionary Computation Algorithm Efficiency

    Directory of Open Access Journals (Sweden)

    M. K. Sakharov

    2017-01-01

    Full Text Available In solving practically significant global optimization problems, the objective function is often of high dimensionality and computational complexity, and of a nontrivial landscape as well. Studies show that a single optimization method is often not enough to solve such problems efficiently; hybridization of several optimization methods is necessary. One of the most promising contemporary trends in this field is memetic algorithms (MAs), which can be viewed as a combination of population-based search for a global optimum with procedures for local refinement of solutions (memes), provided by a synergy. Since there are relatively few theoretical studies concerning which MA configuration is advisable for solving black-box optimization problems, many researchers tend toward adaptive algorithms, which select the most efficient local optimization methods for particular domains of the search space. The article proposes a multi-memetic modification of the simple SMEC algorithm, using random hyper-heuristics. It presents the software implementation of the algorithm and the memes used (the Nelder-Mead method, the method of random hyper-sphere surface search, and the Hooke-Jeeves method), and conducts a comparative study of the efficiency of the proposed algorithm depending on the set and number of memes. The study has been carried out using the Rastrigin, Rosenbrock, and Zakharov multidimensional test functions. Computational experiments have been carried out for all possible combinations of memes and for each meme individually. According to the results of the study, conducted by the multi-start method, the combinations of memes comprising the Hooke-Jeeves method were successful. These results demonstrate the rapid convergence of that method to a local optimum in comparison with the other memes, since all methods perform at most a fixed number of iterations. The analysis of the average number of iterations shows that using the most efficient sets of memes allows us to find the optimal
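    A toy version of the multi-memetic scheme with a random hyper-heuristic is sketched below on the Rastrigin test function mentioned in the abstract. SciPy's Nelder-Mead and Powell methods stand in for the memes (SciPy has no built-in Hooke-Jeeves), and the population step is a crude placeholder for SMEC's operators.

```python
# Multi-memetic sketch: after each population step, a randomly chosen meme
# (local-search method) refines each candidate. Memes and operators are
# stand-ins, not the article's implementation.
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    return 10 * len(x) + sum(xi ** 2 - 10 * np.cos(2 * np.pi * xi) for xi in x)

MEMES = ["Nelder-Mead", "Powell"]
rng = np.random.default_rng(1)

pop = rng.uniform(-5.12, 5.12, size=(20, 4))
for generation in range(10):
    # crude global step: jitter everyone (placeholder for SMEC's operators)
    pop = pop + rng.normal(0, 0.2, pop.shape)
    # random hyper-heuristic: each individual gets a randomly selected meme
    pop = np.array([
        minimize(rastrigin, ind, method=rng.choice(MEMES),
                 options={"maxiter": 30}).x
        for ind in pop
    ])
best = min(pop, key=rastrigin)
print("best value found:", rastrigin(best))
```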

  4. Computational Nanotechnology Molecular Electronics, Materials and Machines

    Science.gov (United States)

    Srivastava, Deepak; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    This presentation covers research being performed on computational nanotechnology, carbon nanotubes, and fullerenes at the NASA Ames Research Center. Topics covered include: nanomechanics of nanomaterials, nanotubes and composite materials, molecular electronics with nanotube junctions, kinky chemistry, and nanotechnology for solid-state quantum computers using fullerenes.

  5. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable-speed wind turbine using a bio-inspired scheme. The methodology exploits two proficient swarm-intelligence-based evolutionary soft computational procedures: the particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller for the small damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are discussed in this paper. The controller designs for the DFIG-based WECS using PSO and BFO are described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active and reactive power, and DC-link voltage are slightly improved with the evolutionary soft computational procedures. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
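    For readers unfamiliar with PSO, the sketch below shows the bare algorithm applied to tuning two controller gains against a hypothetical scalar cost; in the paper, the cost would come from simulating the DFIG's damping response rather than this placeholder.

```python
# Bare-bones particle swarm optimization over PI gains (Kp, Ki).
import numpy as np

def cost(gains):                        # placeholder for a DFIG simulation run
    kp, ki = gains
    return (kp - 1.2) ** 2 + (ki - 0.4) ** 2 + 0.1 * abs(kp * ki)

rng = np.random.default_rng(7)
n, w, c1, c2 = 25, 0.7, 1.5, 1.5        # swarm size and standard PSO coefficients
x = rng.uniform(0, 3, (n, 2))           # particle positions (candidate gains)
v = np.zeros_like(x)
pbest = x.copy()                        # each particle's best known position
gbest = min(x, key=cost)                # swarm's best known position

for _ in range(100):
    r1, r2 = rng.random((2, n, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    better = np.array([cost(a) < cost(b) for a, b in zip(x, pbest)])
    pbest[better] = x[better]
    gbest = min([gbest, *x], key=cost)

print("tuned gains:", gbest, "cost:", cost(gbest))
```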

  6. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence

    OpenAIRE

    Kiwia, D; Dehghantanha, A; Choo, K-KR; Slaughter, J

    2017-01-01

    Malware such as banking Trojans are popular with financially-motivated cybercriminals. Detection of banking Trojans remains a challenging task, due to the constant evolution of techniques used to obfuscate and circumvent existing detection and security solutions. Having a malware taxonomy can facilitate the design of mitigation strategies such as those based on evolutionary computational intelligence. Specifically, in this paper, we propose a cyber kill chain based taxonomy of banking Trojans...

  7. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

    The volume is a collection of high-quality peer-reviewed research papers presented at the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016), held at SRM University, Chennai, Tamil Nadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements, and explore the future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the fields of communication, computing and power technologies.

  8. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES 2015), held at Velammal Engineering College (VEC), Chennai, India during 22 – 23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the fields of communication, computing and power technologies.

  9. Computational methods of electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.

    1983-01-01

    A review of computational methods simulating the non-plasma transport of electrons and their attendant cascades is presented. Remarks are mainly restricted to linearized formalisms at electron energies above 1 keV. The effectiveness of various methods is discussed, including moments, point-kernel, invariant imbedding, discrete-ordinates, and Monte Carlo. Future research directions and the potential impact on various aspects of science and engineering are indicated.

  10. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas, such as artificial intelligence, computability, heuristics and pattern formation. Nowadays, in the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of the Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow in the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in the areas of artificial intelligence, evolutionary computation and metaheuristics, all of which can be traced back to Turing's pioneering work. Topics include the Turing test, the Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, and nature-inspired algorithms such as bat algorithm and cuckoo sear...

  11. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  12. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Directory of Open Access Journals (Sweden)

    Maciej J Mrowinski

    Full Text Available With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  13. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    Science.gov (United States)

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrates, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Multi-objective optimization of HVAC system with an evolutionary computation algorithm

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Tang, Fan; Xu, Guanglin

    2011-01-01

    A data-mining approach for the optimization of an HVAC (heating, ventilation, and air conditioning) system is presented. A predictive model of the HVAC system is derived by data-mining algorithms, using a dataset collected from an experiment conducted at a research facility. To minimize energy use while maintaining the corresponding IAQ (indoor air quality) within a user-defined range, a multi-objective optimization model is developed. The solutions of this model are set points of the control system derived with an evolutionary computation algorithm. The controllable input variables - supply air temperature and supply air duct static pressure set points - are generated to reduce the energy use. The results produced by the evolutionary computation algorithm show that the control strategy saves energy by optimizing the operation of the HVAC system. Highlights: A data-mining approach for the optimization of a heating, ventilation, and air conditioning (HVAC) system is presented. The data used in the project have been collected from an experiment conducted at an energy research facility. The approach presented in the paper leads to significant energy savings without compromising the indoor air quality. The energy savings are accomplished by computing set points for the supply air temperature and the supply air duct static pressure.
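    The core of such a multi-objective search is the non-dominated (Pareto) filter applied each generation. A minimal sketch follows, with hypothetical set points scored on two objectives, energy use and an IAQ-violation measure, both to be minimized.

```python
# Pareto (non-dominated) filter of the kind a multi-objective evolutionary
# algorithm applies each generation. Candidates and scores are illustrative.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """scored: list of (objectives, candidate) pairs; returns non-dominated ones."""
    return [(obj, cand) for obj, cand in scored
            if not any(dominates(other, obj) for other, _ in scored)]

candidates = [((55.2, 0.10), "T=13C, P=280Pa"),
              ((48.7, 0.35), "T=14C, P=250Pa"),
              ((60.1, 0.40), "T=12C, P=300Pa")]   # dominated by both others
print(pareto_front(candidates))                    # first two survive
```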

  15. Solution of Fractional Order System of Bagley-Torvik Equation Using Evolutionary Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Zahoor Raja

    2011-01-01

    Full Text Available A stochastic technique has been developed for the solution of the fractional-order system represented by the Bagley-Torvik equation. The mathematical model of the equation was developed with the help of feed-forward artificial neural networks. The training of the networks was performed with evolutionary computational intelligence based on a genetic algorithm hybridized with the pattern search technique. The designed scheme was successfully applied to different forms of the equation. Results are compared with standard approximate analytic solutions, stochastic numerical solvers, and exact solutions.
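    For reference, the Bagley-Torvik equation in its usual form combines a second-order derivative, a fractional derivative of order 3/2, and a linear term:

```latex
% Bagley-Torvik equation with constant coefficients A, B, C and forcing f(t);
% D_t^{3/2} denotes the fractional derivative of order 3/2. The stochastic
% scheme trains a neural approximation of y(t) so that this residual is
% minimized by the GA/pattern-search hybrid.
A\, y''(t) \;+\; B\, D_t^{3/2} y(t) \;+\; C\, y(t) \;=\; f(t),
\qquad y(0) = y_0, \quad y'(0) = y_1 .
```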

  16. An evolutionary computing framework toward object extraction from satellite images

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Image interpretation domains have witnessed the application of many intelligent methodologies over the past decade; however, the effective use of evolutionary computing techniques for feature detection has been less explored. In this paper, we critically analyze the possibility of using cellular neural networks for accurate feature detection. Contextual knowledge has been effectively represented by incorporating spectral and spatial aspects using an adaptive kernel strategy. The developed methodology has been compared with traditional approaches in an object-based context, and investigations revealed that considerable success has been achieved with the procedure. Intelligent interpretation, automatic interpolation, and effective contextual representations are the features of the system.

  17. Neuro-Inspired Computing with Stochastic Electronics

    KAUST Repository

    Naous, Rawan

    2016-01-06

    The extensive scaling and integration within electronic systems have set the standards for what is referred to as stochastic electronics. The individual components increasingly diverge from their reliable behavior and produce non-deterministic outputs. This stochastic operation closely mimics the biological medium of the brain. Hence, building on the inherent variability, particularly within novel non-volatile memory technologies, paves the way for unconventional neuromorphic designs. Neuro-inspired networks with brain-like structures of neurons and synapses allow for computation and levels of learning for diverse recognition tasks and applications.

  18. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different points of view: direct current, alternating current and transient-state analysis, optimisation of the open-loop gain, and study of the reliability. (author) [fr

  19. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.

  20. Discovering Unique, Low-Energy Transition States Using Evolutionary Molecular Memetic Computing

    DEFF Research Database (Denmark)

    Ellabaan, Mostafa M Hashim; Ong, Y.S.; Handoko, S.D.

    2013-01-01

    In the last few decades, identification of transition states has experienced significant growth in research interest from various scientific communities. As per transition state theory, reaction paths and landscape analysis, as well as many thermodynamic properties of biochemical systems, can be accurately identified through the transition states. Transition states describe the paths of molecular systems in transiting across stable states. In this article, we present the discovery of unique, low-energy transition states and showcase the efficacy of their identification using the memetic computing paradigm under a Molecular Memetic Computing (MMC) framework. In essence, the MMC is equipped with a tree-based representation of non-cyclic molecules and covalent-bond-driven evolutionary operators, in addition to the typical backbone of memetic algorithms. Herein, we employ genetic algorithm...

  1. A computer simulation of auger electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Ragheb, M. S.; Bakr, M. H. S. [Dept. of Accelerators and Ion Sources, Division of Basic Nuclear Sciences, NRC, Atomic Energy Authority, (Egypt)

    1997-12-31

    A simulation study of Auger electron spectroscopy was performed to reveal how far the dependencies between the different parameters governing the experimental behavior affect the peaks. The experimental procedure followed by the AC modulation technique was reproduced by means of a computer program. The program generates the assumed output Auger electron peaks, exposes them to a retarding AC-modulated field, and collects the resulting modulated signals. It then performs the lock-in treatment in order to demodulate the signals, revealing the Auger peaks, and analyzes the spectrum obtained, giving the peak positions and energies. Comparison between the simulation results and the experimental data showed good agreement. The peaks of the spectrum obtained depend upon the amplitude, frequency and resolution of the applied modulating signal. The peak shape is affected by the rise time, the slope and the starting potential of the retarding field. 4 figs.

  2. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  3. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program, and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for larger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or in an Exabyte tape file for large amounts of data. HEP tools also help circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)

  4. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has the potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  5. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    Science.gov (United States)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on experience or an ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well-accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility, that is, whether or not emulating natural population dynamics improves the EC performance; (ii) comparatively study the underlying mechanisms, that is, why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve this general goal [27][30]-[32]. In the following, I start with an extremely brief introduction to the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three

  6. Evolutionary developments in x ray and electron energy loss microanalysis instrumentation for the analytical electron microscope

    Science.gov (United States)

    Zaluzec, Nester J.

    Developments in instrumentation for both X-ray Energy Dispersive and Electron Energy Loss Spectroscopy (XEDS/EELS) over the last ten years have given the experimentalist a greatly enhanced set of analytical tools for characterization. Microanalysts have waited for nearly two decades now in the hope of getting a true analytical microscope, and the development of 300 to 400 kV instruments should have allowed us to attain this goal. Unfortunately, this has not generally been the case. While there have been some major improvements in the techniques, there has also been some devolution in the modern AEM (Analytical Electron Microscope). In XEDS, the majority of today's instruments are still plagued by the hole-count effect, which was first described in detail over fifteen years ago. The magnitude of this problem can still reach the 20 percent level for medium atomic number species in a conventional off-the-shelf intermediate-voltage AEM. This is an absurd situation, and the manufacturers should be severely criticized. Part of the blame, however, also rests on the AEM community for not having come up with a universally agreed-upon standard test procedure. Fortunately, such a test procedure is in the early stages of refinement. The proposed test specimen consists of an evaporated Cr film approximately 500 to 1000 Å thick, supported upon a 3 mm diameter, 200 micron Molybdenum aperture.

  7. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set-oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which the theoretical advancements may echo in different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 make a contribution to this goal.

  8. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    Science.gov (United States)

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  9. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mills) is the heart of coal-fired power plants. The complex nature of a milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, the vertical spindle coal mill (bowl mill), widely used in coal-fired power plants, is considered for the model development, and its pulverised fuel flow rate is computed using the model. For the steady-state coal mill model development, plant measurements such as air flow rate, differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from an analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady-state coal mill dynamics. This coal mill model is being implemented on-line in a 210 MW thermal power plant, and the results obtained are compared with plant data. The model is found to be accurate and robust and will work better in power plants for system monitoring. Therefore, the model can be used for online monitoring, fault detection, and control to improve the efficiency of combustion.
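    The parameter-identification step can be illustrated with a toy (mu + lambda) evolution strategy fitting a stand-in model to plant measurements; the linear model, the data, and the strategy parameters below are all hypothetical, not the paper's balance equations.

```python
# Evolutionary parameter identification: search model parameters minimizing
# the squared error between model output and plant measurements.
import random

plant_inputs = [(80.0, 2.1), (95.0, 2.6), (110.0, 3.2)]   # (air flow, diff. pressure)
plant_outputs = [41.0, 49.5, 58.8]                        # measured fuel flow

def model(theta, u):
    a, b, c = theta
    return a * u[0] + b * u[1] + c      # toy linear stand-in for the mill model

def sse(theta):
    return sum((model(theta, u) - y) ** 2
               for u, y in zip(plant_inputs, plant_outputs))

mu, lam, sigma = 5, 20, 0.5             # (mu + lambda) evolution strategy
parents = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(mu)]
for _ in range(200):
    children = [[g + random.gauss(0, sigma) for g in random.choice(parents)]
                for _ in range(lam)]
    parents = sorted(parents + children, key=sse)[:mu]
    sigma *= 0.99                       # simple step-size annealing
print("identified parameters:", [round(g, 3) for g in parents[0]],
      "SSE:", round(sse(parents[0]), 4))
```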

  10. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279

  11. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices, to industry control systems. Evolutionary computation combined with other computational intelligence will play an important role in cybersecurity, such as ...

  12. A computer code package for electron transport Monte Carlo simulation

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    1999-01-01

    A computer code package was developed for solving various electron transport problems by Monte Carlo simulation. It is based on the condensed-history Monte Carlo algorithm. In order to get reliable results over wide ranges of electron energies and target atomic numbers, specific electron transport techniques were implemented, such as Moliere multiple-scattering angular distributions, the Blunck-Leisegang multiple-scattering energy distribution, and the sampling of individual electron-electron and Bremsstrahlung interactions. Path-length and lateral displacement correction algorithms and a module for computing collision, radiative and total restricted stopping powers and ranges of electrons are also included. Comparisons of simulation results with experimental measurements are finally presented. (author)
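    To give the flavor of such a simulation, the toy sketch below samples exponential free paths and applies continuous-slowing-down energy losses; it omits the angular sampling (e.g. Moliere distributions) that the package implements, and the constants are illustrative, not physical data from the code.

```python
# Toy condensed-history flavor: exponential free-path sampling plus
# continuous-slowing-down (CSDA) energy loss. No angular deflection modeled.
import math
import random

MEAN_FREE_PATH = 0.05       # cm, illustrative
STOPPING_POWER = 2.0        # MeV/cm, illustrative restricted stopping power

def track(energy_mev, cutoff=0.01):
    """Follow one electron until its energy falls below the cutoff."""
    depth = 0.0
    while energy_mev > cutoff:
        # free path length sampled from an exponential distribution
        step = -MEAN_FREE_PATH * math.log(1.0 - random.random())
        depth += step
        energy_mev -= STOPPING_POWER * step     # CSDA energy loss along the step
    return depth

depths = [track(1.0) for _ in range(10000)]
print("mean penetration depth: %.3f cm" % (sum(depths) / len(depths)))
```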

  13. Electronic Mail for Personal Computers: Development Issues.

    Science.gov (United States)

    Tomer, Christinger

    1994-01-01

    Examines competing, commercially developed electronic mail programs and how these technologies will affect the functionality and quality of electronic mail. How new standards for client-server mail systems are likely to enhance messaging capabilities and the use of electronic mail for information retrieval are considered. (Contains eight…

  14. An Artificial Immune System-Inspired Multiobjective Evolutionary Algorithm with Application to the Detection of Distributed Computer Network Intrusions

    Science.gov (United States)

    2007-03-01

    Coello, Van Veldhuizen, and Lamont define global optimization as "the process of finding the global minimum within some search space S" [CVL02].

  15. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.; Martinez B, M. R.; Gallego, E.

    2009-10-01

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique that is highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the spectrometric system of Bonner spheres, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction, which requires an expert user in system management and the use of a reconstruction code such as BUNKIE, SAND, etc. These codes are based on an iterative reconstruction algorithm whose greatest inconvenience is that, for the spectrum reconstruction, the system must be provided with an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have mentioned the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures have been reported based on artificial intelligence techniques such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted in such a way as to solve these disadvantages. Because they are emerging technologies, there are no tools for the analysis of results, so in this paper we first present the design of a computational tool that allows analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a user graphical environment that is friendly, intuitive, and easy to operate. The speed of program operation is high, executing the analysis in a few seconds, so it may store and/or print the obtained information for

  16. Computer conferencing: the "nurse" in the "Electronic School District".

    Science.gov (United States)

    Billings, D M; Phillips, A

    1991-01-01

    As computer-based instructional technologies become increasingly available, they offer new mechanisms for health educators to provide health instruction. This article describes a pilot project in which nurses established a computer conference to provide health instruction to high school students participating in an electronic link of high schools. The article discusses computer conferencing, the "Electronic School District," the design of the nursing conference, and the role of the nurse in distributed health education.

  17. A hybrid finite element analysis and evolutionary computation method for the design of lightweight lattice components with optimized strut diameter

    DEFF Research Database (Denmark)

    Salonitis, Konstantinos; Chantzis, Dimitrios; Kappatos, Vasileios

    2017-01-01

    approaches or with the use of topology optimization methodologies. An optimization approach utilizing multipurpose optimization algorithms has not yet been proposed. This paper presents a novel user-friendly method for the design optimization of lattice components towards weight minimization, which combines finite element analysis and evolutionary computation. The proposed method utilizes the cell homogenization technique in order to reduce the computational cost of the finite element analysis, and a genetic algorithm in order to search for the most lightweight lattice configuration. A bracket consisting...

  18. Noninvasive coronary angioscopy using electron beam computed tomography and multidetector computed tomography

    NARCIS (Netherlands)

    van Ooijen, PMA; Nieman, K; de Feyter, PJ; Oudkerk, M

    2002-01-01

    With the advent of noninvasive coronary imaging techniques like multidetector computed tomography and electron beam computed tomography, new representation methods such as intracoronary visualization have been introduced. We explore the possibilities of these novel visualization techniques and

  19. Electronic digital computers their use in science and engineering

    CERN Document Server

    Alt, Franz L

    1958-01-01

    Electronic Digital Computers: Their Use in Science and Engineering describes the principles underlying computer design and operation. This book describes the various applications of computers, the stages involved in using them, and their limitations. The machine is composed of the hardware which is run by a program. This text describes the use of magnetic drum for storage of data and some computing. The functions and components of the computer include automatic control, memory, input of instructions by using punched cards, and output from resulting information. Computers operate by using numbe

  20. National electronic medical records integration on cloud computing system.

    Science.gov (United States)

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a newly emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been fairly utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating the Electronic Health Record (EHR). The proposed system applies cloud computing technology to the EHR system to present a comprehensive, integrated EHR environment.

  1. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
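
    As a concrete illustration of the algebraic case (a minimal SciPy sketch, not the precise integration method of the paper), the stationary response covariance X of a single-degree-of-freedom oscillator driven by white noise satisfies A X + X A^T + Q = 0; the oscillator parameters and the input scaling in Q are illustrative assumptions:

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        # Oscillator x'' + 2*zeta*w*x' + w^2*x = white noise of intensity S0
        w, zeta, S0 = 2.0, 0.05, 1.0
        A = np.array([[0.0, 1.0],
                      [-w ** 2, -2.0 * zeta * w]])
        Q = np.array([[0.0, 0.0],
                      [0.0, 2.0 * np.pi * S0]])    # input covariance (assumed scaling)

        X = solve_continuous_lyapunov(A, -Q)       # solves A X + X A^T = -Q
        # X[0, 0] is the stationary displacement variance, X[1, 1] the velocity variance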

  2. Quantum Computing with an Electron Spin Ensemble

    DEFF Research Database (Denmark)

    Wesenberg, Janus; Ardavan, A.; Briggs, G.A.D.

    2009-01-01

    We propose to encode a register of quantum bits in different collective electron spin wave excitations in a solid medium. Coupling to spins is enabled by locating them in the vicinity of a superconducting transmission line cavity, and making use of their strong collective coupling to the quantized...

  3. Resolution Versus Error for Computational Electron Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Luzi, Lorenzo; Stevens, Andrew; Yang, Hao; Browning, Nigel D.

    2017-07-01

    Images that are collected via scanning transmission electron microscopy (STEM) can be undersampled to avoid damage to the specimen while maintaining resolution [1, 2]. We have used BPFA to impute missing data and reduce noise [3]. The reconstruction is typically evaluated using the peak signal-to-noise ratio (PSNR). This measure is too conservative for STEM images and we propose that the Fourier ring correlation (FRC) is used instead to evaluate the reconstruction. We are not concerned with exact reconstruction of the truth image, and therefore PSNR is a conservative estimation of the quality of the reconstruction. Instead, we are concerned with the visual resolution of the image and whether atoms can be distinguished. We have evaluated the reconstruction of a simulated STEM image using the FRC and compared the results with the PSNR measurements. The FRC captures the resolution of the image and is not affected by a large MSE if the atom peaks are still distinguishable. The noisy and reconstructed images are shown in Figure 1. The simulated STEM image was sampled at 100%, 80%, 40%, and 20% of the original pixels to simulate an undersampled scan. The reconstruction was done using BPFA with a patch size of 10 x 10 and no overlapping patches. Not having overlapping patches produces inferior results but they are still acceptable. The dictionary size is 64 and 30 iterations were completed during each reconstruction. The 100% image was denoised instead of reconstructed. Poisson noise was applied to the simulated image with λ values of 500, 50, and 5 to simulate lower imaging dose. The original simulated STEM image was also included in our calculations and was generated using a dose of 1000. The simulated STEM image is 100 by 100 pixels and has essentially no high frequency components. The image reconstruction tends to smooth the data, also resulting in no high frequency components. This causes the FRC of the two images to be large at higher resolutions and may be
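
    A minimal numpy sketch of the two metrics being compared (assuming square images of equal size; the ring binning and normalization follow the common FRC definition rather than any specific implementation from the paper):

        import numpy as np

        def psnr(ref, img):
            mse = np.mean((ref - img) ** 2)
            return 10.0 * np.log10(ref.max() ** 2 / mse)

        def frc(img1, img2, n_rings=50):
            """Fourier ring correlation between two square images of equal size."""
            F1 = np.fft.fftshift(np.fft.fft2(img1))
            F2 = np.fft.fftshift(np.fft.fft2(img2))
            n = img1.shape[0]
            y, x = np.indices((n, n)) - n // 2
            r = np.hypot(x, y)
            edges = np.linspace(0.0, n // 2, n_rings + 1)
            curve = []
            for lo, hi in zip(edges[:-1], edges[1:]):
                ring = (r >= lo) & (r < hi)
                num = np.abs((F1[ring] * np.conj(F2[ring])).sum())
                den = np.sqrt((np.abs(F1[ring]) ** 2).sum() *
                              (np.abs(F2[ring]) ** 2).sum())
                curve.append(num / den if den > 0 else 0.0)
            return np.array(curve)        # correlation per spatial-frequency ring

    Unlike the single-number PSNR, the FRC curve stays high out to the spatial frequency of the atomic peaks even when the pixel-wise MSE is large, which is the behaviour the abstract exploits.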

  4. Fault tolerant embedded computers and power electronics for nuclear robotics

    International Nuclear Information System (INIS)

    Giraud, A.; Robiolle, M.

    1995-01-01

    For the requirements of nuclear industries, it is necessary to use embedded rad-tolerant electronics with a high level of safety. In this paper, we first describe a computer architecture called MICADO designed for the French nuclear industry. We then present ongoing projects in our industry. A special point is made on power electronics for remote-operated and legged robots. (authors). 7 refs., 2 figs

  5. Electron Gun for Computer-controlled Welding of Small Components

    Czech Academy of Sciences Publication Activity Database

    Dupák, Jan; Vlček, Ivan; Zobač, Martin

    2001-01-01

    Roč. 62, 2-3 (2001), s. 159-164 ISSN 0042-207X R&D Projects: GA AV ČR IBS2065015 Institutional research plan: CEZ:AV0Z2065902 Keywords : Electron beam-welding machine * Electron gun * Computer-controlled beam Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.541, year: 2001

  6. Computation of the average energy for LXY electrons

    International Nuclear Information System (INIS)

    Grau Carles, A.; Grau, A.

    1996-01-01

    The application of an atomic rearrangement model, in which we consider only the three shells K, L and M, to compute the counting efficiency for electron-capture nuclides requires a fine averaged energy value for LMN electrons. In this report we illustrate the procedure with two examples, 125I and 109Cd. (Author) 4 refs

  7. Fault tolerant embedded computers and power electronics for nuclear robotics

    Energy Technology Data Exchange (ETDEWEB)

    Giraud, A.; Robiolle, M.

    1995-12-31

    For the requirements of nuclear industries, it is necessary to use embedded rad-tolerant electronics with a high level of safety. In this paper, we first describe a computer architecture called MICADO designed for the French nuclear industry. We then present ongoing projects in our industry. A special point is made on power electronics for remote-operated and legged robots. (authors). 7 refs., 2 figs.

  8. At the crossroads of evolutionary computation and music: self-programming synthesizers, swarm orchestras and the origins of melody.

    Science.gov (United States)

    Miranda, Eduardo Reck

    2004-01-01

    This paper introduces three approaches to using Evolutionary Computation (EC) in Music (namely, engineering, creative and musicological approaches) and discusses examples of representative systems that have been developed within the last decade, with emphasis on more recent and innovative works. We begin by reviewing engineering applications of EC in Music Technology such as Genetic Algorithms and Cellular Automata sound synthesis, followed by an introduction to applications where EC has been used to generate musical compositions. Next, we introduce ongoing research into EC models to study the origins of music and detail our own research work on modelling the evolution of melody. Copyright 2004 Massachusetts Institute of Technology

  9. A Multi Agent System for Flow-Based Intrusion Detection Using Reputation and Evolutionary Computation

    Science.gov (United States)

    2011-03-01

    A pertinent example of the application of Evolutionary Algorithms to pattern recognition comes from Radtke et al. [130], who apply a multi-objective memetic algorithm for intelligent feature extraction (Radtke, P.V.W., T. Wong, and R. Sabourin, http://hal.inria.fr/inria-00104200/en/).

  10. Evolutionary optimization of neural networks with heterogeneous computation: study and implementation

    OpenAIRE

    FE, JORGE DEOLINDO; Aliaga Varea, Ramón José; Gadea Gironés, Rafael

    2015-01-01

    In the optimization of artificial neural networks (ANNs) via evolutionary algorithms and the implementation of the necessary training for the objective function, there is often a trade-off between efficiency and flexibility. Pure software solutions on general-purpose processors tend to be slow because they do not take advantage of the inherent parallelism, whereas hardware realizations usually rely on optimizations that reduce the range of applicable network topologies, or they...

  11. Calculation and construction of electron-diffraction photographs using computer

    International Nuclear Information System (INIS)

    Khayurov, S.S.; Notkin, A.B.

    1981-01-01

    A method for the computer construction and indexing of theoretical electron-diffraction patterns is presented, for single-phase structures with an arbitrary type of crystal lattice and for polyphase structures with known orientational correlations between the phases. An electron-diffraction photograph obtained from a foil area of the two-phase VT22 alloy in the β-phase orientation is compared with theoretical electron-diffraction patterns built by computer, with the [100] β-phase zone axis and three variants of the α-phase orientation relative to the β phase. It is shown that three orientations of the α phase are simultaneously present on the experimental electron-diffraction photograph, and their reflections can be indexed correctly [ru]

  12. A computer-controlled conformal radiotherapy system. IV: Electronic chart

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Matrone, Gwynne M.; Weaver, Tamar A.; Lewis, James D.; Kessler, Marc L.

    1995-01-01

    Purpose: The design and implementation of a system for electronically tracking relevant plan, prescription, and treatment data for computer-controlled conformal radiation therapy is described. Methods and Materials: The electronic charting system is implemented on a computer cluster coupled by high-speed networks to computer-controlled therapy machines. A methodical approach to the specification and design of an integrated solution has been used in developing the system. The electronic chart system is designed to allow identification and access of patient-specific data including treatment-planning data, treatment prescription information, and charting of doses. An in-house developed database system is used to provide an integrated approach to the database requirements of the design. A hierarchy of databases is used for both centralization and distribution of the treatment data for specific treatment machines. Results: The basic electronic database system has been implemented and has been in use since July 1993. The system has been used to download and manage treatment data on all patients treated on our first fully computer-controlled treatment machine. To date, electronic dose charting functions have not been fully implemented clinically, requiring the continued use of paper charting for dose tracking. Conclusions: The routine clinical application of complex computer-controlled conformal treatment procedures requires the management of large quantities of information for describing and tracking treatments. An integrated and comprehensive approach to this problem has led to a full electronic chart for conformal radiation therapy treatments

  13. ELECTRONIC ANALOG COMPUTER FOR DETERMINING RADIOACTIVE DISINTEGRATION

    Science.gov (United States)

    Robinson, H.P.

    1959-07-14

    A computer is presented for determining growth and decay curves for elements in a radioactive disintegration series wherein one unstable element decays to form a second unstable element or isotope, which in turn forms a third element, etc. The growth and decay curves of radioactive elements are simulated by the charge and discharge curves of a resistance-capacitance network. Several such networks having readily adjustable values are connected in series with an amplifier between each successive pair. The time constant of each of the various networks is set proportional to the half-life of a corresponding element in the series represented and the charge and discharge curves of each of the networks simulates the element growth and decay curve.
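
    The analogy can be reproduced numerically: a minimal sketch (with arbitrary half-lives and step size) integrating the same growth-and-decay equations that the cascaded RC stages solve, each stage's time constant playing the role of the reciprocal decay constant:

        import numpy as np

        # Two-member chain A -> B -> (stable); the electronic analogue is two
        # cascaded RC stages with time constants proportional to the half-lives.
        half_life_A, half_life_B = 8.0, 2.3           # days (illustrative)
        lam_A = np.log(2) / half_life_A
        lam_B = np.log(2) / half_life_B

        t = np.linspace(0.0, 30.0, 3001)
        dt = t[1] - t[0]
        NA = np.empty_like(t); NB = np.empty_like(t)
        NA[0], NB[0] = 1.0, 0.0
        for i in range(t.size - 1):                   # forward-Euler charge/discharge
            NA[i + 1] = NA[i] - lam_A * NA[i] * dt
            NB[i + 1] = NB[i] + (lam_A * NA[i] - lam_B * NB[i]) * dt
        # NB traces the growth-and-decay curve displayed by the analog computer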

  14. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  15. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    Science.gov (United States)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using the mind evolutionary algorithm to find optimal network weights and thresholds for a BP neural network can overcome the BP network's tendency to become trapped in local minima. The optimized network is used for time-series prediction and for same-month forecasting, yielding two predicted values; these two predicted values are then fed into a further neural network to obtain the final forecast. The effectiveness of the method was verified experimentally with energy data from three buildings in Hefei.
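
    A minimal sketch of the weight-initialization idea, assuming a toy dataset and a tiny 3-4-1 numpy network in place of the paper's mind-evolutionary-computation implementation: a population search selects promising initial weights and thresholds, after which ordinary BP training would fine-tune the winner:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(-1.0, 1.0, (200, 3))        # toy input features
        y = np.sin(X).sum(axis=1)                   # stand-in energy target

        def unpack(w):                              # 3-4-1 network parameters
            W1, b1 = w[:12].reshape(3, 4), w[12:16]
            W2, b2 = w[16:20].reshape(4, 1), w[20]
            return W1, b1, W2, b2

        def mse(w):
            W1, b1, W2, b2 = unpack(w)
            h = np.tanh(X @ W1 + b1)
            return np.mean((h @ W2 + b2 - y[:, None]) ** 2)

        # Evolutionary search over initial weights: keep an elite, spawn
        # perturbed copies, so training does not start in a poor local minimum.
        pop = rng.normal(0.0, 1.0, (50, 21))
        for _ in range(100):
            elite = pop[np.argsort([mse(w) for w in pop])[:10]]
            mutants = elite[rng.integers(0, 10, 40)] + rng.normal(0.0, 0.2, (40, 21))
            pop = np.concatenate([elite, mutants])
        best_init = pop[np.argmin([mse(w) for w in pop])]   # hand to BP training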

  16. Electricity demand and spot price forecasting using evolutionary computation combined with chaotic nonlinear dynamic model

    International Nuclear Information System (INIS)

    Unsihuay-Vila, C.; Zambroni de Souza, A.C.; Marangon-Lima, J.W.; Balestrassi, P.P.

    2010-01-01

    This paper proposes a new hybrid approach based on nonlinear chaotic dynamics and evolutionary strategy to forecast electricity loads and prices. The main idea is to develop a new training or identification stage in a nonlinear chaotic dynamics-based predictor. In the training stage, five optimal parameters for the chaos-based predictor are searched through an optimization model based on an evolutionary strategy. The objective function of the optimization model is the minimization of the mismatch between the multi-step-ahead forecasts of the predictor and the observed data, as is done in identification problems. The first contribution of this paper is that the proposed approach is capable of capturing the complex dynamics of the demand and price time series considered, resulting in more accurate forecasting. The second contribution is that the proposed approach runs in an on-line manner, i.e. the search for the optimal set of parameters and the prediction are executed automatically, so it can be used for prediction in real time; this is an advantage over other models, where the choice of input parameters is carried out off-line, following qualitative/experience-based recipes. A case study of load and price forecasting is presented using data from New England, Alberta, and Spain. A comparison with other methods such as the autoregressive integrated moving average (ARIMA) and artificial neural networks (ANN) is shown. The results show that the proposed approach provides more accurate and effective forecasting than the ARIMA and ANN methods. (author)
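
    A minimal sketch of such a training stage, assuming a toy five-parameter predictor and a synthetic series in place of the paper's chaos-based model: a simple (1+lambda) evolution strategy searches the parameter vector by minimizing the multi-step-ahead mismatch, exactly the objective described above:

        import numpy as np

        rng = np.random.default_rng(1)
        series = np.sin(np.arange(400) * 0.3) + 0.1 * rng.standard_normal(400)

        def forecast(params, history, steps=7):
            """Toy five-parameter nonlinear predictor iterated multi-step ahead."""
            a, b, c, d, e = params
            window = list(history[-3:])
            out = []
            for _ in range(steps):
                x1, x2, x3 = window[-3:]
                window.append(a * x1 + b * x2 + c * x3 + d * np.tanh(e * x1))
                out.append(window[-1])
            return np.array(out)

        def mismatch(params):                        # multi-step-ahead training error
            return sum(np.sum((forecast(params, series[:t]) - series[t:t + 7]) ** 2)
                       for t in range(300, 390, 10))

        parent, step = rng.normal(0.0, 0.5, 5), 0.2
        for _ in range(200):                         # (1+lambda) evolution strategy
            children = parent + step * rng.standard_normal((10, 5))
            best = min(children, key=mismatch)
            if mismatch(best) < mismatch(parent):
                parent = best
        # 'parent' now holds the identified parameters used for on-line prediction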

  17. ELECTRONIC EVIDENCE IN THE JUDICIAL PROCEEDINGS AND COMPUTER FORENSIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2017-01-01

    Full Text Available Today's information society is characterized by the terminology of modern dictionaries of globalization, including terms such as convergence, digitization (of media, technology and/or telecommunications) and the mobility of people and technology, each a word of progress and development and a positive sign of the rise of the information society. On the other hand, in a virtual environment the traditional evidence used in judicial proceedings, the document on a paper substrate, is becoming electronic evidence, and the processes for its management and the criteria for its admissibility are changing with respect to traditional evidence. The rapid growth of computer data has created new opportunities and new forms of computing and cybercrime, but also new ways of proof in court cases that were unavailable just a few decades ago. The authors of this paper describe new trends in the development of the information society and the emergence of electronic evidence, with emphasis on the impact of the development of computer crime on electronic evidence; the concept, legal regulation and probative value of electronic evidence, and in particular of electronic documents; and the issue of the expert examination of electronic evidence and electronic documents in court proceedings.

  18. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launching of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully integrated computer simulation of e+e- annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative-correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle interfaces between different sectors of physics well, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton-shower processes in high-energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
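
    The VEGAS step referenced above can be illustrated compactly (a one-dimensional toy, not the thesis's multi-channel algorithm): the importance-sampling grid is repeatedly rebinned so that every bin contributes roughly equally to the integral, concentrating samples where the integrand peaks:

        import numpy as np

        rng = np.random.default_rng(4)
        N_BINS, N_SAMPLES = 50, 20000

        def f(x):                                    # sharply peaked toy integrand
            return np.exp(-((x - 0.7) / 0.05) ** 2)

        edges = np.linspace(0.0, 1.0, N_BINS + 1)    # equal-probability bin edges
        for _ in range(8):
            bins = rng.integers(0, N_BINS, N_SAMPLES)
            widths = np.diff(edges)
            x = edges[bins] + rng.random(N_SAMPLES) * widths[bins]
            w = f(x) * widths[bins] * N_BINS         # importance weights f/p
            integral = w.mean()
            # Rebin so each bin carries an equal share of the total weight
            contrib = np.array([w[bins == b].sum() for b in range(N_BINS)])
            contrib = contrib + contrib.sum() * 1e-3 + 1e-12   # avoid empty bins
            cdf = np.concatenate([[0.0], np.cumsum(contrib) / contrib.sum()])
            edges = np.interp(np.linspace(0.0, 1.0, N_BINS + 1), cdf, edges)

    After a few iterations the bin edges crowd around the peak and the variance of the estimate drops, which is what makes efficient weight-1 event selection possible.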

  19. The 3d International Workshop on Computational Electronics

    Science.gov (United States)

    Goodnick, Stephen M.

    1994-09-01

    The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research and the Army Research Office, as well as local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of which more than one quarter represented research groups outside of the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. There were a total of 81 papers presented at the workshop: 9 invited talks, 26 oral presentations and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the chemistry, computer science, mathematics, engineering, and physics communities participating in the workshop.

  20. Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.

    Science.gov (United States)

    Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai

    2005-10-01

    A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
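
    A minimal sketch of the hybrid coding (assuming a stand-in secondary-electron signal model, not the detector physics of the paper): candidate surfaces are height vectors at fixed vertices, a Delaunay triangulation fixes the connectivity, and offspring produced by genetic crossover and mutation are accepted through a simulated-annealing Metropolis test:

        import numpy as np
        from scipy.spatial import Delaunay

        rng = np.random.default_rng(2)
        xy = rng.uniform(0.0, 1.0, (40, 2))          # fixed vertex positions
        tri = Delaunay(xy)                           # connectivity never changes
        true_z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1])

        def se_signal(z):
            # Stand-in detector model: SE yield grows with local relief,
            # here the height spread over each Delaunay triangle.
            return np.array([np.ptp(z[s]) for s in tri.simplices])

        target = se_signal(true_z)                   # "measured" signals

        def energy(z):                               # reconstruction error
            return np.sum((se_signal(z) - target) ** 2)

        pop = [rng.normal(0.0, 1.0, 40) for _ in range(20)]
        T = 1.0
        for _ in range(300):
            a, b = rng.choice(20, 2, replace=False)
            cut = int(rng.integers(1, 39))
            child = np.concatenate([pop[a][:cut], pop[b][cut:]])   # crossover
            child[rng.integers(40)] += rng.normal(0.0, 0.1)        # mutation
            worst = max(range(20), key=lambda i: energy(pop[i]))
            dE = energy(child) - energy(pop[worst])
            if dE < 0 or rng.random() < np.exp(-dE / T):           # Metropolis test
                pop[worst] = child
            T *= 0.99                                              # cooling schedule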

  1. Unscented Sampling Techniques For Evolutionary Computation With Applications To Astrodynamic Optimization

    Science.gov (United States)

    2016-09-01

    factors that can cause the variations in trajectory computation time. First of all, these cases are initially computed using the guess-free mode of DIDO... Goldberg [91]. This concept essentially states that fundamental building blocks, or lower-order schemata, are pieced together by the genetic algorithm in... Section 3.13.2. While this idea is very straightforward and logical, Goldberg also later points out that there are deceptive problems where these...

  2. Management and Valorization of Electronic and Computer Wastes in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    So far, little is known about the extent of the problem and there is little research available to serve as a basis for persuading decision-makers to address it. This project will examine the issue of electronic and computer waste and its management, and endeavor to identify feasible and sustainable strategies for valorizing such ...

  3. Regional Platform on Personal Computer Electronic Waste in Latin ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Regional Platform on Personal Computer Electronic Waste in Latin America and the Caribbean. Donation of ... This project aims to identify environmentally responsible and sustainable solutions to the problem of e-waste.

  4. Management and Valorization of Electronic and Computer Wastes in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine the issue of electronic and computer waste and its management, and endeavor to identify feasible and sustainable strategies for ...

  5. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for biological activity owing to the inherent interactions among their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIPs is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines the weighted extreme learning machine (WELM) classifier with the Local Average Group (LAG) representation to predict SIPs from protein sequences. The major improvement of our method lies in an effective feature-extraction scheme used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), followed by a reliable and robust WELM classifier to carry out the classification. In addition, the principal component analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94% and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .

  6. Application of computational fluid dynamics and surrogate-coupled evolutionary computing to enhance centrifugal-pump performance

    Directory of Open Access Journals (Sweden)

    Sayed Ahmed Imran Bellary

    2016-01-01

    Full Text Available To reduce the total design and optimization time, numerical analysis with surrogate-based approaches is being used in turbomachinery optimization. In this work, multiple surrogates are coupled with an evolutionary genetic algorithm to find the Pareto optimal fronts (PoFs) of two centrifugal pumps with different specifications in order to enhance their performance. The two pumps used were a centrifugal pump commonly used in industry (Case I) and an electrical submersible pump used in the petroleum industry (Case II). The objectives are to enhance the head and efficiency of the pumps at specific flow rates. Surrogates such as response surface approximation (RSA), Kriging (KRG), neural networks and weighted-average surrogates (WASs) were used to determine the PoFs. To obtain the objective-function values and to understand the flow physics, the Reynolds-averaged Navier–Stokes equations were solved. It is found that the WAS performs better for both objectives than any individual surrogate. The best individual surrogate, i.e. the best predicted-error-sum-of-squares (PRESS) surrogate (BPS) obtained from cross-validation (CV) error estimation, produced better PoFs but was still unable to compete with the WAS. The surrogate producing a high CV error produced the worst PoFs. The performance improvement in this study is due to the change in the flow pattern in the impeller passage of the pumps.
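
    A minimal scikit-learn sketch of the weighted-average-surrogate construction (with a toy objective standing in for the RANS-computed head and efficiency, and inverse cross-validation error as a simple stand-in for the paper's PRESS-based weighting):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(3)
        X = rng.uniform(0.0, 1.0, (60, 2))               # design variables
        y = np.sin(6 * X[:, 0]) + 0.5 * X[:, 1] ** 2     # toy objective

        models = {
            'RSA': make_pipeline(PolynomialFeatures(2), LinearRegression()),
            'KRG': GaussianProcessRegressor(),
            'ANN': MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                                random_state=0),
        }
        # Cross-validation error of each surrogate (PRESS-like estimate)
        cv_err = {name: -cross_val_score(m, X, y, cv=5,
                                         scoring='neg_mean_squared_error').mean()
                  for name, m in models.items()}
        weights = {name: 1.0 / e for name, e in cv_err.items()}
        total = sum(weights.values())
        for m in models.values():
            m.fit(X, y)

        def was_predict(x):
            """Weighted-average surrogate: CV-weighted blend of all models."""
            x = np.atleast_2d(x)
            return sum(weights[n] / total * models[n].predict(x) for n in models)

    The genetic algorithm then evaluates was_predict instead of the CFD solver when searching for the Pareto front.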

  7. Genetic characterization and evolutionary inference of TNF-α through computational analysis

    Directory of Open Access Journals (Sweden)

    Gauri Awasthi

    Full Text Available TNF-α is an important human cytokine that imparts dualism in malaria pathogenicity. At high dosages, TNF-α is believed to provoke pathogenicity in cerebral malaria, while at lower dosages TNF-α is protective against severe human malaria. In order to understand the human TNF-α gene and to ascertain the evolutionary aspects of its dualistic nature in malaria pathogenicity, we characterized this gene in detail in six different mammalian taxa. The avian taxon Gallus gallus was included in our study; as TNF-α is not present in birds, a tandemly placed duplicate of TNF-α (LT-α or TNF-β) was included. A comparative study was made of nucleotide length variations, intron and exon size and number variations, differential compositions of coding to non-coding bases, etc., to look for similarities/dissimilarities in the TNF-α gene across all seven taxa. A phylogenetic analysis revealed the pattern found in other genes: humans, chimpanzees and rhesus monkeys were placed in a single clade, and rats and mice in another, with the chicken on a clearly separate branch. We further focused on these three taxa and aligned the amino acid sequences; there were small differences between humans and chimpanzees, and both were more different from the rhesus monkey. Further, comparison of coding and non-coding nucleotide length variations and the coding to non-coding nucleotide ratio between TNF-α and TNF-β among these three mammalian taxa provided a first-hand indication of the role of the TNF-α gene, but not of TNF-β, in the dualistic nature of TNF-α in malaria pathogenicity.

  8. Computer predictions on Rh-based double perovskites with unusual electronic and magnetic properties

    Science.gov (United States)

    Halder, Anita; Nafday, Dhani; Sanyal, Prabuddha; Saha-Dasgupta, Tanusri

    2018-03-01

    In the search for new magnetic materials, we make computer predictions of the structural, electronic and magnetic properties of yet-to-be-synthesized Rh-based double perovskite compounds, Sr(Ca)2BRhO6 (B = Cr, Mn, Fe). We use a combination of an evolutionary algorithm, density functional theory, and statistical-mechanical tools for this purpose. We find that the unusual valence of Rh5+ may be stabilized in these compounds through the formation of an oxygen ligand hole. Interestingly, while the Cr-Rh and Mn-Rh compounds are predicted to be ferromagnetic half-metals, the Fe-Rh compounds are found to be rare examples of an antiferromagnetic and metallic transition-metal oxide with a three-dimensional electronic structure. The computed magnetic transition temperatures of the predicted compounds, obtained from a finite-temperature Monte Carlo study of the first-principles-derived model Hamiltonian, are found to be reasonably high. The prediction of favorable growth conditions for the compounds, obtained through extensive thermodynamic analysis and reported in our study, should be useful for the future synthesis of this interesting class of materials with intriguing properties.

  9. Composition of the mitochondrial electron transport chain in acanthamoeba castellanii: structural and evolutionary insights.

    Science.gov (United States)

    Gawryluk, Ryan M R; Chisholm, Kenneth A; Pinto, Devanand M; Gray, Michael W

    2012-11-01

    The mitochondrion, derived in evolution from an α-proteobacterial progenitor, plays a key metabolic role in eukaryotes. Mitochondria house the electron transport chain (ETC) that couples oxidation of organic substrates and electron transfer to proton pumping and synthesis of ATP. The ETC comprises several multiprotein enzyme complexes, all of which have counterparts in bacteria. However, mitochondrial ETC assemblies from animals, plants and fungi are generally more complex than their bacterial counterparts, with a number of 'supernumerary' subunits appearing early in eukaryotic evolution. Little is known, however, about the ETC of unicellular eukaryotes (protists), which are key to understanding the evolution of mitochondria and the ETC. We present an analysis of the ETC proteome from Acanthamoeba castellanii, an ecologically, medically and evolutionarily important member of Amoebozoa (sister to Opisthokonta). Data obtained from tandem mass spectrometric (MS/MS) analyses of purified mitochondria as well as ETC complexes isolated via blue native polyacrylamide gel electrophoresis are combined with the results of bioinformatic queries of sequence databases. Our bioinformatic analyses have identified most of the ETC subunits found in other eukaryotes, confirming and extending previous observations. The assignment of proteins as ETC subunits by MS/MS provides important insights into the primary structures of ETC proteins and makes possible, through the use of sensitive profile-based similarity searches, the identification of novel constituents of the ETC along with the annotation of highly divergent but phylogenetically conserved ETC subunits. © 2012 Elsevier B.V. All rights reserved.

  10. Study of natural circulation for the design of a research reactor using computational fluid dynamics and evolutionary computation techniques

    International Nuclear Information System (INIS)

    Oliveira, Andre Felipe da Silva de

    2012-01-01

    Safety is one of the most important and desirable characteristics of a nuclear plant. Natural circulation cooling systems are noted for providing passive safety. These systems can be used as a mechanism for removing residual heat from the reactor, or even as the main cooling system for heated sections such as the core. In this work, a computational fluid dynamics (CFD) code called CFX is used to simulate the process of natural circulation in a research reactor pool after its shutdown. The physical model studied is similar to the Open Pool Australian Light water reactor (OPAL) and contains the core, cooling pool, reflector tank, circulation pipes and chimney. For best computing performance, the core region was modeled as a porous medium, whose parameters were obtained from a separate, detailed CFD analysis. This work also aims to study the viability of applying the Differential Evolution algorithm to optimize the physical and operational parameters that, obeying the laws of similarity, lead to a test section at a reduced scale of the reactor pool.
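
    A minimal sketch of the optimization step (with illustrative similarity groups and bounds; SciPy's differential_evolution stands in for the authors' Differential Evolution implementation): the algorithm searches scale-model parameters so that the chosen dimensionless numbers of the reduced-scale test section match those of the full-scale pool:

        import numpy as np
        from scipy.optimize import differential_evolution

        RI_FULL, RE_FULL = 1.2, 1.0e4     # full-scale similarity numbers (illustrative)

        def mismatch(params):
            height, power, diameter = params
            # Toy similarity groups for the reduced-scale test section
            ri = 9.81 * height / np.sqrt(power)
            re = 1.0e4 * diameter * power ** 0.25
            return (ri / RI_FULL - 1.0) ** 2 + (re / RE_FULL - 1.0) ** 2

        result = differential_evolution(
            mismatch,
            bounds=[(0.5, 3.0),           # section height (m)
                    (1.0, 50.0),          # heater power (kW)
                    (0.05, 0.5)],         # pipe diameter (m)
            seed=0)
        # result.x holds the reduced-scale parameters best preserving similarity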

  11. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    Science.gov (United States)

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  12. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended commerce or ubiquitous commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use NFC capabilities of their smartphones in order to handle e-coupons produced, suggested, and consumed by the abovesaid environment. The main function of the intelligent environment is to help customers define shopping plans, which minimize the overall shopping cost by looking for best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions for the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer is dependent on his previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimentation results showing the overall performance of the genetic algorithm.

  13. [Computer-aided Diagnosis and New Electronic Stethoscope].

    Science.gov (United States)

    Huang, Mei; Liu, Hongying; Pi, Xitian; Ao, Yilu; Wang, Zi

    2017-05-30

    Auscultation is an important method in the early diagnosis of cardiovascular and respiratory-system disease. This paper presents a new electronic auscultation system for computer-aided diagnosis. An electronic stethoscope based on a condenser microphone has been developed, together with the corresponding intelligent analysis software. Combining Bluetooth, OLED and SD-card storage technologies, it implements many functions, such as real-time heart- and lung-sound auscultation in three modes, recording and playback, auscultation volume control and wireless transmission. The PC-based intelligent analysis software is written in C# and uses SQL Server as the back-end database. It provides playback and waveform display of the auscultation sound. By calculating the heart rate and extracting the characteristic parameters T1, T2, T12 and T11, it can analyze whether the heart sound is normal and then generate a diagnosis report. Finally, the auscultation sound and diagnosis report can be sent to other doctors' mailboxes, enabling remote diagnosis. The whole system is fully functional, highly portable and offers a good user experience; it should promote the use of electronic stethoscopes in hospitals, and it can also be applied to auscultation teaching and other settings.

  14. Electron beam treatment planning: A review of dose computation methods

    International Nuclear Information System (INIS)

    Mohan, R.; Riley, R.; Laughlin, J.S.

    1983-01-01

    Various methods of dose computations are reviewed. The equivalent path length methods used to account for body curvature and internal structure are not adequate because they ignore the lateral diffusion of electrons. The Monte Carlo method for the broad field three-dimensional situation in treatment planning is impractical because of the enormous computer time required. The pencil beam technique may represent a suitable compromise. The behavior of a pencil beam may be described by the multiple scattering theory or, alternatively, generated using the Monte Carlo method. Although nearly two orders of magnitude slower than the equivalent path length technique, the pencil beam method improves accuracy sufficiently to justify its use. It applies very well when accounting for the effect of surface irregularities; the formulation for handling inhomogeneous internal structure is yet to be developed

  15. New Computational Approach to Electron Transport in Irregular Graphene Nanostructures

    Science.gov (United States)

    Mason, Douglas; Heller, Eric; Prendergast, David; Neaton, Jeffrey

    2009-03-01

    For novel graphene devices of nanoscale-to-macroscopic size, many aspects of their transport properties are not easily understood, due to difficulties in fabricating devices with regular edges. Here we develop a framework to efficiently calculate, and potentially screen, the electronic transport properties of arbitrary nanoscale graphene device structures. A generalization of the established recursive Green's function method is presented, providing access to arbitrary device and lead geometries with substantial computer-time savings. Using single-orbital nearest-neighbor tight-binding models and the Green's function-Landauer scattering formalism, we explore the transmission function of irregular two-dimensional graphene-based nanostructures with arbitrary lead orientation. Prepared by LBNL under contract DE-AC02-05CH11231 and supported by the U.S. Dept. of Energy Computer Science Graduate Fellowship under grant DE-FG02-97ER25308.
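
    The recursion is easiest to see in its scalar form: a minimal sketch for a one-dimensional single-orbital tight-binding chain between two identical semi-infinite leads (the graphene case replaces the scalars with slice matrices); energies are kept inside the lead band, where the analytic surface Green's function below is valid:

        import numpy as np

        t = 1.0                                       # nearest-neighbour hopping
        onsite = np.array([0.0, 0.3, -0.2, 0.1])      # irregular device site energies

        def surface_g(E):
            """Retarded surface Green's function of a semi-infinite 1-D lead."""
            return (E - 1j * np.sqrt(4 * t ** 2 - E ** 2 + 0j)) / (2 * t ** 2)

        def transmission(E):
            sigma = t ** 2 * surface_g(E)             # identical left/right leads
            gamma = -2.0 * sigma.imag                 # lead broadening
            # Recursive sweep: left-connected Green's functions, site by site
            gL = 1.0 / (E - onsite[0] - sigma)
            prop = gL * t                             # accumulates the G_1N prefactor
            for eps in onsite[1:-1]:
                gL = 1.0 / (E - eps - t ** 2 * gL)
                prop *= gL * t
            G_NN = 1.0 / (E - onsite[-1] - t ** 2 * gL - sigma)
            return gamma * gamma * abs(prop * G_NN) ** 2   # Landauer T(E)

        energies = np.linspace(-1.9, 1.9, 200)
        T = [transmission(E) for E in energies]

    Each added site costs one scalar update, which is the source of the computer-time savings that the generalized matrix version retains for wide, irregular devices.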

  16. Design of Carborane Molecular Architectures via Electronic Structure Computations

    International Nuclear Information System (INIS)

    Oliva, J.M.; Serrano-Andres, L.; Klein, D.J.; Schleyer, P.V.R.; Mich, J.

    2009-01-01

    Quantum-mechanical electronic structure computations were employed to explore initial steps towards a comprehensive design of polycarborane architectures through the assembly of molecular units. Aspects considered were (i) the striking modification of geometrical parameters through substitution, (ii) endohedral carboranes and proposed ejection mechanisms for ion/atom/energy storage and transport, (iii) the excited-state character of single and dimeric molecular units, and (iv) higher architectural constructs. A goal of this work is to find optimal architectures where atom/ion/energy/spin transport within carborane superclusters is feasible, in order to modernize and improve future photoenergy processes.

  17. Search of computers for discovery of electronic evidence

    Directory of Open Access Journals (Sweden)

    Pisarić Milana M.

    2015-01-01

    Full Text Available In order to address the specific nature of criminal activities committed using computer networks and systems, the efforts of states to adapt or complement existing criminal law with purposeful provisions are understandable. To create an appropriate legal framework for suppressing cybercrime, besides rules of substantive criminal law that define certain behaviour as criminal offences against the confidentiality, integrity and availability of computer data, computer systems and networks, it is essential that the provisions of criminal procedure law contain adequate powers for the competent authorities to detect the sources of illegal activities and to collect data on the committed criminal offence and the offender that can be used as evidence in criminal proceedings, taking into account the specificities of cybercrime and of the environment within which the illegal activity is undertaken. Accordingly, the provisions of criminal procedural law should be designed so as to be able to overcome certain challenges in discovering and proving high-technology crime, and the provisions governing the search of computers for the discovery of electronic evidence are of special importance.

  18. Comparing two iteration algorithms of Broyden electron density mixing through an atomic electronic structure computation

    International Nuclear Information System (INIS)

    Zhang Man-Hong

    2016-01-01

    By performing an electronic structure computation on a Si atom, we compare two iteration algorithms for Broyden electron density mixing from the literature. One was proposed by Johnson and implemented in the well-known VASP code. The other was given by Eyert. We solve the Kohn-Sham equation by using a conventional outward/inward integration of the differential equation and then connect the two parts of the solution at the classical turning points, which differs from the matrix-eigenvalue method used in the VASP code. Compared to Johnson's algorithm, the one proposed by Eyert needs fewer total iterations. (paper)
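
    The difference between the two mixing philosophies is easy to demonstrate on a toy fixed-point problem (a sketch only: the map below stands in for the Kohn-Sham density update, and SciPy's broyden1 stands in for the Johnson/Eyert update formulas compared in the paper):

        import numpy as np
        from scipy.optimize import broyden1

        def scf_map(rho):
            """Toy self-consistency map rho -> G(rho)."""
            return np.tanh(1.5 * rho + 0.3) * np.exp(-0.1 * np.arange(rho.size))

        rho0 = np.zeros(8)

        # Linear mixing: rho <- (1 - a) rho + a G(rho), simple but slow
        rho, alpha, n_linear = rho0.copy(), 0.3, 0
        while np.linalg.norm(scf_map(rho) - rho) > 1e-10:
            rho = (1.0 - alpha) * rho + alpha * scf_map(rho)
            n_linear += 1

        # Broyden mixing: quasi-Newton iteration on the residual G(rho) - rho,
        # typically converging in far fewer iterations
        rho_broyden = broyden1(lambda r: scf_map(r) - r, rho0, f_tol=1e-10)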

  19. Inductive reasoning and forecasting of population dynamics of Cylindrospermopsis raciborskii in three sub-tropical reservoirs by evolutionary computation.

    Science.gov (United States)

    Recknagel, Friedrich; Orr, Philip T; Cao, Hongqing

    2014-01-01

    Seven-day-ahead forecasting models of Cylindrospermopsis raciborskii in three warm-monomictic and mesotrophic reservoirs in south-east Queensland have been developed by means of water-quality data from 1999 to 2010 and the hybrid evolutionary algorithm HEA. The resulting models, using all measured variables as inputs as well as using only electronically measurable variables as inputs, accurately forecasted the timing of overgrowth of C. raciborskii and matched well the high and low magnitudes of observed bloom events, with r² between 0.45 and 0.61 and between 0.4 and 0.57, respectively. The models also revealed relationships and thresholds triggering bloom events that provide valuable information on the synergism between water-quality conditions and the population dynamics of C. raciborskii. The best-performing models based on all measured variables indicated electrical conductivity (EC) within the range 206-280 mS/m as the threshold above which fast growth and high abundances of C. raciborskii have been observed in the three lakes. The best models based on electronically measurable variables for Lakes Wivenhoe and Somerset indicated a water-temperature (WT) range of 25.5-32.7°C within which fast growth and high abundances of C. raciborskii can be expected. By contrast, the model for Lake Samsonvale highlighted a turbidity (TURB) level of 4.8 NTU as an indicator of mass developments of C. raciborskii. Experiments with online-measured water-quality data from Lake Wivenhoe from 2007 to 2010 resulted in predictive models with r² between 0.61 and 0.65, whereby similar levels of EC and WT were again discovered as thresholds for outgrowth of C. raciborskii. The highest validity of r² = 0.75 for an in situ data-based model was achieved after considering time lags of 7 days for EC and 1 day for dissolved oxygen. These time lags were discovered by a systematic screening of all possible combinations of time lags between 0 and 10 days for all electronically measurable variables. The so

  20. Collaborative Computational Project for Electron cryo-Microscopy

    International Nuclear Information System (INIS)

    Wood, Chris; Burnley, Tom; Patwardhan, Ardan; Scheres, Sjors; Topf, Maya; Roseman, Alan; Winn, Martyn

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed

  1. Collaborative Computational Project for Electron cryo-Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Chris; Burnley, Tom [Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom); Patwardhan, Ardan [European Molecular Biology Laboratory, Wellcome Trust Genome Campus, Hinxton, Cambridge CB10 1SD (United Kingdom); Scheres, Sjors [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH (United Kingdom); Topf, Maya [University of London, Malet Street, London WC1E 7HX (United Kingdom); Roseman, Alan [University of Manchester, Oxford Road, Manchester M13 9PT (United Kingdom); Winn, Martyn, E-mail: martyn.winn@stfc.ac.uk [Science and Technology Facilities Council, Daresbury Laboratory, Warrington WA4 4AD (United Kingdom); Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom)

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed.

  2. Computer Conferencing and Electronic Messaging. Conference Proceedings (Guelph, Ontario, Canada, January 22-23, 1985).

    Science.gov (United States)

    Guelph Univ. (Ontario).

    This 21-paper collection examines various issues in electronic networking and conferencing with computers, including design issues, conferencing in education, electronic messaging, computer conferencing applications, social issues of computer conferencing, and distributed computer conferencing. In addition to a keynote address, "Computer…

  3. Simulation of the behaviour of electron-optical systems using a parallel computer

    International Nuclear Information System (INIS)

    Balladore, J.L.; Hawkes, P.W.

    1990-01-01

    The advantage of using a multiprocessor computer for the calculation of electron-optical properties is investigated. A considerable reduction of computing time is obtained by reorganising the finite-element field computation. (orig.)

  4. Current algorithms for computed electron beam dose planning

    International Nuclear Information System (INIS)

    Brahme, A.

    1985-01-01

    Two- and sometimes three-dimensional computer algorithms for electron beam irradiation are capable of taking all irregularities of the body cross-section and the properties of the various tissues into account. This is achieved by dividing the incoming broad beams into a number of narrow pencil beams, whose penetration can be described by essentially one-dimensional formalisms. The constituent pencil beams are most often described by Gaussian, experimentally or theoretically derived distributions. The accuracy of different dose-planning algorithms is discussed in some detail, based on their ability to take the different physical interaction processes of high-energy electrons into account. It is shown that those programs that take the deviations from the simple Gaussian model into account give the best agreement with experimental results. With such programs a dosimetric relative accuracy of about 5% is generally achieved, except in the most complex inhomogeneity configurations. Finally, the present limitations and possible future developments of electron dose planning are discussed. (orig.)
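
    A minimal sketch of the pencil-beam decomposition (with illustrative Gaussian spread and central-axis depth-dose functions; real algorithms derive both from multiple-scattering theory or Monte Carlo pencil kernels): a broad field is divided into pencil beams whose lateral Gaussian kernels are superposed at each depth:

        import numpy as np

        x = np.arange(-50.0, 50.0, 1.0)               # lateral axis (mm)
        depths = np.arange(0.0, 40.0, 1.0)            # depth axis (mm)
        aperture = (x > -20) & (x < 20)               # 4-cm broad field

        def sigma(z):                                 # lateral spread vs depth
            return 1.0 + 0.15 * z                     # (illustrative)

        def cax_dose(z):                              # central-axis depth dose
            return np.exp(-((z - 12.0) / 14.0) ** 2)  # (illustrative)

        dose = np.zeros((depths.size, x.size))
        for i, z in enumerate(depths):
            s = sigma(z)
            for x0 in x[aperture]:                    # one pencil beam per ray
                dose[i] += (cax_dose(z) / (np.sqrt(2 * np.pi) * s)
                            * np.exp(-(x - x0) ** 2 / (2 * s ** 2)))
        # dose[i, j] approximates the broad-beam dose at depth i, lateral offset j

    Because each pencil kernel widens with depth, the superposition naturally reproduces the lateral smearing at field edges that equivalent-path-length methods miss.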

  5. Statistical analysis and definition of blockages-prediction formulae for the wastewater network of Oslo by evolutionary computing.

    Science.gov (United States)

    Ugarelli, Rita; Kristensen, Stig Morten; Røstum, Jon; Saegrov, Sveinung; Di Federico, Vittorio

    2009-01-01

    Oslo Vann og Avløpsetaten (Oslo VAV), the water/wastewater utility in the Norwegian capital city of Oslo, is assessing future strategies for the selection of the most reliable materials for wastewater networks, taking into account not only the technical performance of the materials but also their performance under the operational conditions of the system. The research project undertaken by the SINTEF Group (the largest research organisation in Scandinavia), NTNU (Norges Teknisk-Naturvitenskapelige Universitet) and Oslo VAV adopts several approaches to understanding the reasons for failures that may impact flow capacity, by analysing historical data for blockages in Oslo. The aim of the study was to understand whether there is a relationship between the performance of a pipeline and a number of specific attributes such as age, material and diameter, to name a few. This paper presents the characteristics of the available data set and discusses the results obtained with two different approaches: a traditional statistical analysis, segregating the pipes into classes each with the same explanatory variables, and an Evolutionary Polynomial Regression (EPR) model, developed by the Technical University of Bari and the University of Exeter, to identify the possible influence of a pipe's attributes on the total number of predicted blockages in a period of time. Starting from a detailed analysis of the available data on blockage events, the most important variables are identified and a classification scheme is adopted. From the statistical analysis, it can be stated that age, size and function do seem to have a marked influence on the proneness of a pipeline to blockages, but, for the reduced sample available, it is difficult to say which variable is the most influential. Looking at the total number of blockages, the oldest class seems to be the most prone to blockages; looking at blockage rates (number of blockages per km per year), it is the youngest class that shows the highest blockage rate

  6. A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles

    Science.gov (United States)

    Kafai, Yasmin B.; Lee, Eunkyoung; Searle, Kristin; Fields, Deborah; Kaplan, Eliot; Lui, Debora

    2014-01-01

    In this article, we examine the use of electronic textiles (e-textiles) for introducing key computational concepts and practices while broadening perceptions about computing. The starting point of our work was the design and implementation of a curriculum module using the LilyPad Arduino in a pre-AP high school computer science class. To…

  7. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  8. Trapped electron decay by the thermally-assisted tunnelling to electron acceptors in glassy matrices. A computer simulation study

    International Nuclear Information System (INIS)

    Feret, B.; Bartczak, W.M.; Kroh, J.

    1991-01-01

    The Redi-Hopfield quantum-mechanical model of thermally assisted electron transfer has been applied to simulate the decay of trapped electrons by tunnelling to electron-acceptor molecules added to a glassy matrix. It was assumed that the electron energy levels in donors and acceptors are statistically distributed and that the electron's excess energy after transfer is dissipated in the medium by electron-phonon coupling. The electron decay curves were obtained by computer simulation. It was found that for a given medium there exists a certain preferred value of the electronic excess energy which can be effectively converted into matrix vibrations. If the mismatch of the electron states on the donor and acceptor coincides with this "resonance" energy, the overall kinetics of electron transfer is accelerated. (author)

  9. Computer simulation of electronic excitation in atomic collision cascades

    International Nuclear Information System (INIS)

    Duvenbeck, A.

    2007-01-01

    The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface - a process usually called sputtering. In modern surface analysis, the so-called SIMS technology uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Because the ionized fraction of the total flux of sputtered atoms often amounts to only a few percent or even less, detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elementary species, but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample. In the literature, this phenomenon is known as the Matrix Effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation
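
The diffusive transport step can be illustrated with a simple superposition of diffusion Green's functions; the 2D geometry, the source list and the diffusivity below are illustrative stand-ins for the thesis' model.

```python
# Sketch: diffusive spread of electronic excitation from discrete sources.
import numpy as np

D = 1.0  # electronic diffusivity (arbitrary units, assumed)

# Hypothetical excitation events: (time, x, y, deposited energy).
events = [(0.0, 0.0, 0.0, 1.0), (0.5, 1.0, 0.5, 0.7), (1.0, 1.5, 1.2, 0.4)]

def excitation_density(x, y, t):
    """Superpose 2D diffusion Green's functions of all earlier events."""
    rho = 0.0
    for t0, x0, y0, e0 in events:
        dt = t - t0
        if dt <= 0:
            continue
        rho += e0 / (4 * np.pi * D * dt) * np.exp(
            -((x - x0) ** 2 + (y - y0) ** 2) / (4 * D * dt))
    return rho

print(excitation_density(0.5, 0.5, 2.0))  # excitation density at one point
```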

  10. Computer simulation of electronic excitation in atomic collision cascades

    Energy Technology Data Exchange (ETDEWEB)

    Duvenbeck, A.

    2007-04-05

    The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface - a process usually called sputtering. In modern surface analysis, the so-called SIMS technology uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Because the ionized fraction of the total flux of sputtered atoms often amounts to only a few percent or even less, detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elementary species, but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample. In the literature, this phenomenon is known as the Matrix Effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation

  11. Computational Nanotechnology of Molecular Materials, Electronics, and Actuators with Carbon Nanotubes and Fullerenes

    Science.gov (United States)

    Srivastava, Deepak; Menon, Madhu; Cho, Kyeongjae; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The role of computational nanotechnology in developing the next generation of multifunctional materials, molecular-scale electronic and computing devices, sensors, actuators, and machines is described through a brief review of enabling computational techniques and a few recent examples derived from computer simulations of carbon-nanotube-based molecular nanotechnology.

  12. Evolutionary Nephrology.

    Science.gov (United States)

    Chevalier, Robert L

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as "maladaptive." In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ~40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (that provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  13. Optimum topology for radial networks by using evolutionary computer programming

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Joao Luis [Instituto de Engenharia de Sistemas e Computadores (INESC), Porto (Portugal). E-mail: jpinto@duque.inescn.pt; Proenca, Luis Miguel [Instituto Superior de Linguas e Administracao (ISLA), Gaia (Portugal). E-mail: lproenca@inescn.pt

    1999-07-01

    This paper describes the use of Evolutionary Programming techniques to determine the topology of a radial electric network, considering investment costs and losses. The work aims to demonstrate the particular ease of coding and implementation, as well as the parallelism implicit in the method, which yields outstanding performance levels. As a test example, a network with 43 bars and 75 alternative lines has been used, and an implementation of the algorithm on an object-oriented platform is described.
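
A minimal sketch of this kind of evolutionary search is shown below: candidates are spanning trees (radial topologies) over a set of candidate lines, fitness is total line cost, and mutation swaps one line while preserving radiality. The 8-bus complete graph and random costs are invented; the paper's 43-bar, 75-line test network and its cost model are not reproduced here.

```python
# Evolutionary-programming sketch for radial (tree) topology selection.
import random

random.seed(0)
N = 8  # buses
LINES = {(a, b): random.uniform(1.0, 10.0)     # cost = investment + losses (toy)
         for a in range(N) for b in range(a + 1, N)}

def random_tree():
    """Randomized Kruskal: returns a random spanning tree as a set of lines."""
    parent = list(range(N))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = set()
    for (a, b) in sorted(LINES, key=lambda _: random.random()):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            tree.add((a, b))
    return tree

def mutate(tree):
    """Add one unused line, then drop another line on the created cycle."""
    extra = random.choice([e for e in LINES if e not in tree])
    adj = {v: set() for v in range(N)}
    for a, b in tree:
        adj[a].add(b); adj[b].add(a)
    stack, prev = [extra[0]], {extra[0]: None}   # DFS to find the tree path
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in prev:
                prev[w] = v
                stack.append(w)
    path, v = [], extra[1]
    while prev[v] is not None:
        path.append((min(v, prev[v]), max(v, prev[v])))
        v = prev[v]
    child = set(tree)
    child.remove(random.choice(path))            # break the cycle
    child.add(extra)
    return child

cost = lambda tree: sum(LINES[e] for e in tree)
pop = [random_tree() for _ in range(20)]
for _ in range(200):            # EP-style loop: mutate all, keep the best half
    pop += [mutate(t) for t in pop]
    pop = sorted(pop, key=cost)[:20]
print("best radial topology cost:", round(cost(pop[0]), 2))
```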

  14. Two dimensional electron systems for solid state quantum computation

    Science.gov (United States)

    Mondal, Sumit

    Two dimensional electron systems based on GaAs/AlGaAs heterostructures are extremely useful in various scientific investigations of recent times including the search for quantum computational schemes. Although significant strides have been made over the past few years to realize solid state qubits on GaAs/AlGaAs 2DEGs, there are numerous factors limiting the progress. We attempt to identify factors that have material and design-specific origin and develop ways to overcome them. The thesis is divided in two broad segments. In the first segment we describe the realization of a new field-effect induced two dimensional electron system on GaAs/AlGaAs heterostructure where the novel device-design is expected to suppress the level of charge noise present in the device. Modulation-doped GaAs/AlGaAs heterostructures are utilized extensively in the study of quantum transport in nanostructures, but charge fluctuations associated with remote ionized dopants often produce deleterious effects. Electric field-induced carrier systems offer an attractive alternative if certain challenges can be overcome. We demonstrate a field-effect transistor in which the active channel is locally devoid of modulation-doping, but silicon dopant atoms are retained in the ohmic contact region to facilitate low-resistance contacts. A high quality two-dimensional electron gas is induced by a field-effect that is tunable over a density range of 6.5×10^10 cm^-2 to 2.6×10^11 cm^-2. Device design, fabrication, and low temperature (T=0.3K) characterization results are discussed. The demonstrated device-design overcomes several existing limitations in the fabrication of field-induced 2DEGs and might find utility in hosting nanostructures required for making spin qubits. The second broad segment describes our effort to correlate transport parameters measured at T=0.3K to the strength of the fractional quantum Hall state observed at ν=5/2 in the second Landau level of high-mobility GaAs/AlGaAs two dimensional

  15. Characterization of electronics devices for computed tomography dosimetry

    International Nuclear Information System (INIS)

    Paschoal, Cinthia Marques Magalhaes

    2012-01-01

    Computed tomography (CT) is an examination of high diagnostic capability that delivers high doses of radiation compared with other diagnostic radiological examinations. Current CT dosimetry is mainly performed using a 100 mm long ionization chamber. However, it was verified that this length, which is intended to collect all scattered radiation of the single-slice dose profile in CT, is not sufficient. An alternative dosimetry based on translating smaller detectors has been suggested. In this work, commercial electronic devices of small dimensions were characterized for CT dosimetry. The project can be divided into five parts: a) pre-selection of devices; b) electrical characterization of selected devices; c) dosimetric characterization in the laboratory, using radiation qualities specific to CT, and in a tomograph; d) evaluation of the dose profile in a CT scanner (free in air and in head and body dosimetric phantoms); e) evaluation of the new MSAD detector in a tomograph. The selected devices were OP520 and OP521 phototransistors and the BPW34FS photodiode. Before the dosimetric characterization, three configurations of detectors, with 4, 2 and 1 OP520 phototransistors working as a single detector, were evaluated, and the configuration with only one device was the most adequate. Hence, the following tests, for all devices, were made using the configuration with only one device. The tests of dosimetric characterization in the laboratory and in a tomograph were: energy dependence, response as a function of air kerma (laboratory) and CTDI100 (scanner), sensitivity variation and angular dependence. In both characterizations, the devices showed some energy dependence, indicating the need for correction factors depending on the beam energy; their response was linear with the air kerma and the CTDI100; the OP520 phototransistor showed the largest variation in sensitivity with the irradiation and the photodiode was the most stable; the angular dependence was significant in the laboratory and

  16. Computer-assisted expert case definition in electronic health records.

    Science.gov (United States)

    Walker, Alexander M; Zhou, Xiaofeng; Ananthakrishnan, Ashwin N; Weiss, Lisa S; Shen, Rongjun; Sobel, Rachel E; Bate, Andrew; Reynolds, Robert F

    2016-02-01

    To describe how computer-assisted presentation of case data can lead experts to infer machine-implementable rules for case definition in electronic health records. As an illustration the technique has been applied to obtain a definition of acute liver dysfunction (ALD) in persons with inflammatory bowel disease (IBD). The technique consists of repeatedly sampling new batches of case candidates from an enriched pool of persons meeting presumed minimal inclusion criteria, classifying the candidates by a machine-implementable candidate rule and by a human expert, and then updating the rule so that it captures new distinctions introduced by the expert. Iteration continues until an update results in an acceptably small number of changes to form a final case definition. The technique was applied to structured data and terms derived by natural language processing from text records in 29,336 adults with IBD. Over three rounds the technique led to rules with increasing predictive value, as the experts identified exceptions, and increasing sensitivity, as the experts identified missing inclusion criteria. In the final rule inclusion and exclusion terms were often keyed to an ALD onset date. When compared against clinical review in an independent test round, the derived final case definition had a sensitivity of 92% and a positive predictive value of 79%. An iterative technique of machine-supported expert review can yield a case definition that accommodates available data, incorporates pre-existing medical knowledge, is transparent and is open to continuous improvement. The expert updates to rules may be informative in themselves. In this limited setting, the final case definition for ALD performed better than previous, published attempts using expert definitions.
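
The iterative loop lends itself to a toy sketch: sample a batch, compare the machine rule with an expert, and update the rule's exclusion terms until disagreements become acceptably few. Everything below (the terms, the records, and the simulated "expert") is fabricated purely to show the control flow, not the study's actual rules.

```python
# Toy sketch of machine-supported expert refinement of a case-definition rule.
import random

random.seed(2)
TERMS = ["alt_high", "bili_high", "viral_hepatitis", "statin_use", "ald_onset"]

# Fake structured records (sets of terms) and a hidden "expert" judgment that
# the loop tries to approximate with an explicit, machine-usable rule.
records = [{t for t in TERMS if random.random() < 0.3} for _ in range(500)]
expert = lambda r: "alt_high" in r and "viral_hepatitis" not in r

include, exclude = {"alt_high"}, set()           # initial candidate rule
rule = lambda r: include <= r and not (exclude & r)

for round_no in range(30):                       # bounded number of review rounds
    batch = random.sample(records, 50)           # new batch of case candidates
    disagreements = [r for r in batch if rule(r) != expert(r)]
    if len(disagreements) <= 2:                  # acceptably few changes: done
        break
    for r in disagreements:
        if expert(r):                            # rule too strict: drop offending exclusions
            exclude -= r
        else:                                    # rule too loose: add candidate exclusions
            exclude |= r - include
print("rounds used:", round_no + 1, "| final exclusions:", exclude)
```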

  17. Pictorial review: Electron beam computed tomography and multislice spiral computed tomography for cardiac imaging

    International Nuclear Information System (INIS)

    Lembcke, Alexander; Hein, Patrick A.; Dohmen, Pascal M.; Klessen, Christian; Wiese, Till H.; Hoffmann, Udo; Hamm, Bernd; Enzweiler, Christian N.H.

    2006-01-01

    Electron beam computed tomography (EBCT) revolutionized cardiac imaging by combining a constant high temporal resolution with prospective ECG triggering. For years, EBCT was the primary technique for some non-invasive diagnostic cardiac procedures such as calcium scoring and non-invasive angiography of the coronary arteries. Multislice spiral computed tomography (MSCT), on the other hand, significantly advanced cardiac imaging through high volume coverage, improved spatial resolution and retrospective ECG gating. This pictorial review will illustrate the basic differences between both modalities with special emphasis on their image quality. Several experimental and clinical examples demonstrate the strengths and limitations of both imaging modalities in an intraindividual comparison for a broad range of diagnostic applications such as coronary artery calcium scoring, coronary angiography including stent visualization, as well as functional assessment of the cardiac ventricles and valves. In general, our examples indicate that EBCT suffers from a number of shortcomings such as limited spatial resolution and a low contrast-to-noise ratio. Thus, EBCT should now only be used in selected cases where a constant high temporal resolution is a crucial issue, such as dynamic (cine) imaging. Due to isotropic submillimeter spatial resolution and retrospective data selection, MSCT seems to be the non-invasive method of choice for cardiac imaging in general, and for assessment of the coronary arteries in particular. However, technical developments are still needed to further improve the temporal resolution in MSCT and to reduce the substantial radiation exposure.

  18. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    ... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...

  19. 3D computational mechanics elucidate the evolutionary implications of orbit position and size diversity of early amphibians.

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    Full Text Available For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls, is used as a case study because orbit position and size vary hugely among the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models, producing new FEA results and allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check whether orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades such as lydekkerinids, trematosaurs, capitosaurs or metoposaurs.

  20. 3D Computational Mechanics Elucidate the Evolutionary Implications of Orbit Position and Size Diversity of Early Amphibians

    Science.gov (United States)

    Marcé-Nogué, Jordi; Fortuny, Josep; De Esteban-Trivigno, Soledad; Sánchez, Montserrat; Gil, Lluís; Galobart, Àngel

    2015-01-01

    For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls, is used as a case study because orbit position and size vary hugely among the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models, producing new FEA results and allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check whether orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades such as lydekkerinids, trematosaurs, capitosaurs or metoposaurs. PMID:26107295

  1. Evolutionary Nephrology

    Directory of Open Access Journals (Sweden)

    Robert L. Chevalier

    2017-05-01

    Full Text Available Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as “maladaptive.” In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or from evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ∼40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (which provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout the life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming, and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  2. Application of advanced electronics to a future spacecraft computer design

    Science.gov (United States)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  3. Using Computer Conferencing and Electronic Mail to Facilitate Group Projects.

    Science.gov (United States)

    Anderson, Margaret D.

    1996-01-01

    Reports on the use of electronic mail and an electronic conferencing system to conduct group projects in three educational psychology courses at the State University of New York College at Cortland. Course design is explained and group project design is described, including assignments and oral presentations during regular class sessions.…

  4. Evolutionary thinking

    Science.gov (United States)

    Hunt, Tam

    2014-01-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution—both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place—has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps’ book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions, and Phipps' book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging “integral” or “evolutionary” cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps. PMID:26478766

  5. Evolutionary Demography

    DEFF Research Database (Denmark)

    Levitis, Daniel

    2015-01-01

    Demography is the quantitative study of population processes, while evolution is a population process that influences all aspects of biological organisms, including their demography. Demographic traits common to all human populations are the products of biological evolution or the interaction of biological and cultural evolution. Demographic variation within and among human populations is influenced by our biology, and therefore by natural selection and our evolutionary background. Demographic methods are necessary for studying populations of other species, and for quantifying evolutionary fitness.
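
As a small worked example of the demographic methods referred to here, evolutionary fitness is classically quantified by solving the Euler-Lotka equation Σ_x l(x) m(x) e^(-rx) = 1 for the intrinsic rate of increase r; the survivorship and fertility schedule below is invented for illustration.

```python
# Solve the Euler-Lotka equation for the intrinsic rate of increase r.
import math
from scipy.optimize import brentq

ages = [1, 2, 3, 4, 5]           # age classes (years)
l = [0.9, 0.7, 0.5, 0.3, 0.1]    # survivorship to age x (toy schedule)
m = [0.0, 1.2, 1.5, 1.0, 0.4]    # expected offspring at age x (toy schedule)

def euler_lotka(r):
    return sum(lx * mx * math.exp(-r * x) for x, lx, mx in zip(ages, l, m)) - 1.0

r = brentq(euler_lotka, -1.0, 1.0)   # root: the intrinsic rate of increase
print(f"intrinsic rate of increase r = {r:.4f}")
```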

  6. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  7. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  8. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing, in parallel, a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window, and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
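
A schematic of the two-objective search might look like the following, where a toy closed-form efficiency stands in for the Monte Carlo code, and the parameters, reference FEPEs and Pareto bookkeeping are all illustrative.

```python
# Toy two-objective evolutionary search for detector-model parameters.
import random

random.seed(3)

REF_SMALL, REF_LARGE = 0.12, 0.05   # reference FEPEs for the two geometries

def simulated_eff(radius, dead_layer, geometry):
    """Toy stand-in for a Monte Carlo efficiency calculation."""
    base = radius**2 / (radius**2 + 4.0) * (1.0 - 5.0 * dead_layer)
    return base * (1.0 if geometry == "small" else 0.45)

def objectives(p):
    r, d = p
    return (abs(simulated_eff(r, d, "small") - REF_SMALL),
            abs(simulated_eff(r, d, "large") - REF_LARGE))

def dominates(a, b):
    fa, fb = objectives(a), objectives(b)
    return all(x <= y for x, y in zip(fa, fb)) and fa != fb

pop = [(random.uniform(0.5, 4.0), random.uniform(0.0, 0.1)) for _ in range(30)]
for _ in range(100):
    children = [(max(0.1, r + random.gauss(0, 0.1)),
                 min(0.1, max(0.0, d + random.gauss(0, 0.005))))
                for r, d in pop]
    merged = pop + children
    # Keep the nondominated front, topped up with random survivors.
    front = [p for p in merged if not any(dominates(q, p) for q in merged)]
    pop = (front + random.sample(merged, len(merged)))[:30]
print("example Pareto point:", min(pop, key=lambda p: sum(objectives(p))))
```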

  9. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing, in parallel, a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window, and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  10. Computational Nanotechnology of Molecular Materials, Electronics and Machines

    Science.gov (United States)

    Srivastava, D.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    This viewgraph presentation covers carbon nanotubes, their characteristics, and their potential future applications. The presentation includes predictions on the development of nanostructures and their applications, the thermal characteristics of carbon nanotubes, mechano-chemical effects upon carbon nanotubes, molecular electronics, and models for possible future nanostructure devices. The presentation also proposes a neural model for signal processing.

  11. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained.

  12. [Evolutionary medicine].

    Science.gov (United States)

    Wjst, M

    2013-12-01

    Evolutionary medicine allows new insights into long-standing medical problems. Are we really "stone-agers in the fast lane"? This insight might have enormous consequences and will allow new answers that could never be provided by traditional anthropology. Only now is this made possible, using data from molecular medicine and systems biology. Evolutionary medicine thereby takes a leap from a merely theoretical discipline to practical fields - reproductive, nutritional and preventive medicine, as well as microbiology, immunology and psychiatry. Evolutionary medicine is not another "just so story" but a serious candidate for the medical curriculum, providing a universal understanding of health and disease based on our biological origin.

  13. Evolutionary Awareness

    Directory of Open Access Journals (Sweden)

    Gregory Gorelik

    2014-10-01

    Full Text Available In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.

  14. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  15. International Conference on Emerging Research in Electronics, Computer Science and Technology

    CERN Document Server

    Sheshadri, Holalu; Padma, M

    2014-01-01

    PES College of Engineering is organizing the International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT-12) in Mandya, merging the event with the Golden Jubilee of the Institute. The Proceedings of the Conference present high-quality, peer-reviewed articles from the fields of Electronics, Computer Science and Technology. The book is a compilation of research papers on cutting-edge technologies, and it is targeted towards the scientific community actively involved in research activities.

  16. Computational analyses of an evolutionary arms race between mammalian immunity mediated by immunoglobulin A and its subversion by bacterial pathogens.

    Directory of Open Access Journals (Sweden)

    Ana Pinheiro

    Full Text Available IgA is the predominant immunoglobulin isotype in mucosal tissues and external secretions, playing important roles both in defense against pathogens and in maintenance of commensal microbiota. Considering the complexity of its interactions with the surrounding environment, IgA is a likely target for diversifying or positive selection. To investigate this possibility, the action of natural selection on IgA was examined in depth with six different methods: CODEML from the PAML package and the SLAC, FEL, REL, MEME and FUBAR methods implemented in the Datamonkey webserver. In considering just primate IgA, these analyses show that diversifying selection targeted five positions of the Cα1 and Cα2 domains of IgA. Extending the analysis to include other mammals identified 18 positively selected sites: ten in Cα1, five in Cα2 and three in Cα3. All but one of these positions display variation in polarity and charge. Their structural locations suggest they indirectly influence the conformation of sites on IgA that are critical for interaction with host IgA receptors and also with proteins produced by mucosal pathogens that prevent their elimination by IgA-mediated effector mechanisms. Demonstrating the plasticity of IgA in the evolution of different groups of mammals, only two of the eighteen selected positions in all mammals are included in the five selected positions in primates. That IgA residues subject to positive selection impact sites targeted both by host receptors and subversive pathogen ligands highlights the evolutionary arms race playing out between mammals and pathogens, and further emphasizes the importance of IgA in protection against mucosal pathogens.
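
For readers wanting to reproduce a CODEML-style site-model test, Biopython ships a wrapper around PAML's codeml; the sketch below assumes the codeml binary is installed on the PATH and that the hypothetical files iga_codons.phy and iga.nwk hold the codon alignment and tree. The Datamonkey methods (SLAC, FEL, FUBAR, etc.) are instead run through its web service.

```python
# Hedged sketch: site-model positive-selection test via Biopython's PAML wrapper.
from Bio.Phylo.PAML import codeml

# Configure codeml for site models M1a (nearly neutral) vs M2a (selection);
# input file names are hypothetical placeholders.
cml = codeml.Codeml(alignment="iga_codons.phy", tree="iga.nwk",
                    out_file="iga_codeml.out", working_dir=".")
cml.set_options(seqtype=1,       # codon data
                model=0,         # single branch class
                NSsites=[1, 2],  # compare M1a vs M2a by likelihood-ratio test
                fix_omega=0, omega=0.5)

results = cml.run()              # invokes the external codeml binary
for model, fit in results.get("NSsites", {}).items():
    print(model, fit.get("lnL"))  # log-likelihoods for the LRT
```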

  17. Computer Mediated Communication and the Emergence of "Electronic Opportunism"

    OpenAIRE

    Rocco, Elena; Warglien, Massimo

    1996-01-01

    An experiment on how communication affects cooperation in a social dilemma shows that computer mediated communication (CMC) and face to face communication have markedly different effects on patterns of collective behavior. While face to face communication sustains stable cooperation, CMC makes cooperative agreements in groups extremely fragile, giving rise to waves of opportunistic behavior. Further analysis of communication protocols highlights that the breakdown of ordinary communication ru...

  18. A FORTRAN program for an IBM PC compatible computer for calculating kinematical electron diffraction patterns

    International Nuclear Information System (INIS)

    Skjerpe, P.

    1989-01-01

    This report describes a computer program which is useful in transmission electron microscopy. The program is written in FORTRAN and calculates kinematical electron diffraction patterns in any zone axis from a given crystal structure. Quite large unit cells, containing up to 2250 atoms, can be handled by the program. The program runs on both the Hercules graphics card and the standard IBM CGA card.
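
The core of such a kinematical calculation is the structure factor F(hkl) = Σ_j f_j exp(2πi(hx_j + ky_j + lz_j)), evaluated for reflections lying in the chosen zone axis. A minimal sketch for an fcc cell with a constant toy scattering factor follows; a real program would use tabulated atomic scattering factors and the full unit cell of interest.

```python
# Kinematical structure factors for reflections in the [001] zone of an fcc cell.
import numpy as np

# Fractional atomic coordinates of an fcc unit cell (4 atoms).
positions = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
f_j = 1.0  # toy atomic scattering factor (tabulated values in a real program)

def structure_factor(h, k, l):
    phase = 2j * np.pi * (positions @ np.array([h, k, l]))
    return f_j * np.exp(phase).sum()

# Reflections in the [001] zone axis satisfy h*0 + k*0 + l*1 = 0, i.e. l = 0.
for h in range(-2, 3):
    for k in range(-2, 3):
        I = abs(structure_factor(h, k, 0)) ** 2
        if I > 1e-6:
            print(f"({h} {k} 0)  intensity {I:.1f}")
```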

  19. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    Science.gov (United States)

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  20. Information Technology in project-organized electronic and computer technology engineering education

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1999-01-01

    This paper describes the integration of IT in the education of electronic and computer technology engineers at the Institute of Electronic Systems, Aalborg University, Denmark. At the Institute, Information Technology is an important tool in all aspects of the education as well as for communication

  1. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... the United States after importation of certain electronic devices, including mobile phones and tablet... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  2. Effectiveness of an Electronic Performance Support System on Computer Ethics and Ethical Decision-Making Education

    Science.gov (United States)

    Kert, Serhat Bahadir; Uz, Cigdem; Gecu, Zeynep

    2014-01-01

    This study examined the effectiveness of an electronic performance support system (EPSS) on computer ethics education and the ethical decision-making processes. There were five different phases to this ten month study: (1) Writing computer ethics scenarios, (2) Designing a decision-making framework (3) Developing EPSS software (4) Using EPSS in a…

  3. Evolutionary robotics

    Indian Academy of Sciences (India)

    In evolutionary robotics, a suitable robot control system is developed automatically through evolution due to the interactions between the robot and its environment. It is a complicated task, as the robot and the environment constitute a highly dynamical system. Several methods have been tried by various investigators to ...

  4. Computer experiments on the imaging of point defects with the conventional transmission electron microscope

    Energy Technology Data Exchange (ETDEWEB)

    Krakow, W [Xerox Corp., Rochester, N.Y. (USA)

    1978-02-01

    To aid in the interpretation of high resolution electron micrographs of defect structures in crystals, computer-simulated dark-field electron micrographs have been obtained for a variety of point defects in metals. Interpretation of these images in terms of atomic positions and atom correlations becomes straightforward, and it is a simple matter to distinguish between real structural information and image artifacts produced by the phase contrast mechanism in the electron optical imaging process.

  5. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing, in parallel, a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window, and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.

  6. Computer-based Role Playing Game Environment for Analogue Electronics

    Directory of Open Access Journals (Sweden)

    Lachlan M MacKinnon

    2009-02-01

    Full Text Available An implementation of a design for a game-based virtual learning environment is described. The game is developed for a course in analogue electronics, and the topic is the design of a power supply. This task can be solved in a number of different ways, with certain constraints, giving the students a certain amount of freedom, although the game is designed not to facilitate a trial-and-error approach. The use of storytelling and a virtual gaming environment provides the student with the learning material in an MMORPG environment.

  7. Two-parametric model of electron beam in computational dosimetry for radiation processing

    International Nuclear Information System (INIS)

    Lazurik, V.M.; Lazurik, V.T.; Popov, G.; Zimek, Z.

    2016-01-01

    Computer simulation of the irradiation of various materials with an electron beam (EB) can be applied to control and correct the performance of radiation processing installations. Electron beam energy measurement methods are described in the international standards. The results of such measurements can be extended by implementing computational dosimetry. The authors have developed a computational method for determining EB energy based on two-parametric fitting of a semi-empirical model of the depth-dose distribution initiated by a mono-energetic electron beam. Analysis of a number of experiments shows that the described method can effectively account for random displacements arising from the use of an aluminum wedge with a continuous strip of dosimetric film, and can minimize the uncertainty of the electron energy evaluated from the experimental data. The two-parametric fitting method is proposed for determining the electron beam model parameters. These model parameters are as follows: E0, the energy of a mono-energetic and mono-directional electron source, and X0, the thickness of the aluminum layer located in front of the irradiated object. This yields baseline data on the characteristics of the electron beam, which can later be applied to computer modeling of the irradiation process. Model parameters defined in the international standards (such as Ep, the most probable energy, and Rp, the practical range) can be linked with the characteristics of the two-parametric model (E0, X0), which allows the electron irradiation process to be simulated. The data obtained from the semi-empirical model were checked against a set of experimental results. The proposed two-parametric model for electron beam energy evaluation, and the estimation of accuracy for computational dosimetry methods based on the developed model, are discussed. - Highlights: • Experimental and computational methods of electron energy evaluation. • Development
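
The two-parametric fit can be sketched as an ordinary least-squares problem in (E0, X0); the Gaussian-like depth-dose shape and the linear energy-range relation below are illustrative stand-ins for the semi-empirical model of the paper, and the data are synthetic.

```python
# Two-parameter fit of a toy depth-dose model D(z; E0, X0) to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def depth_dose(z, E0, X0):
    """Toy depth-dose of a mono-energetic beam behind an Al layer of thickness X0."""
    r = 0.5 * E0            # assumed linear energy-range relation (g/cm^2 per MeV)
    zeff = z + X0           # the Al layer shifts the effective depth
    return np.exp(-((zeff - 0.6 * r) / (0.45 * r)) ** 2)

z = np.linspace(0, 4, 25)   # depth in aluminum (g/cm^2)
measured = depth_dose(z, 10.0, 0.3) * (
    1 + np.random.default_rng(4).normal(0, 0.02, z.size))  # synthetic "wedge" data

(E0_fit, X0_fit), _ = curve_fit(depth_dose, z, measured, p0=(8.0, 0.1))
print(f"E0 = {E0_fit:.2f} MeV, X0 = {X0_fit:.3f} g/cm^2")
```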

  8. Incorporating electronic-based and computer-based strategies: graduate nursing courses in administration.

    Science.gov (United States)

    Graveley, E; Fullerton, J T

    1998-04-01

    The use of electronic technology allows faculty to improve their course offerings. Four graduate courses in nursing administration were contemporized to incorporate fundamental computer-based skills that would be expected of graduates in the work setting. Principles of adult learning offered a philosophical foundation that guided course development and revision. Course delivery strategies included computer-assisted instructional modules, e-mail interactive discussion groups, and use of the electronic classroom. Classroom seminar discussions and two-way interactive video conferencing focused on group resolution of problems derived from employment settings and assigned readings. Using these electronic technologies, a variety of courses can be revised to accommodate the learners' needs.

  9. Computer simulation of high resolution transmission electron micrographs: theory and analysis

    International Nuclear Information System (INIS)

    Kilaas, R.

    1985-03-01

    Computer simulation of electron micrographs is an invaluable aid in their proper interpretation and in defining optimum conditions for obtaining images experimentally. Since modern instruments are capable of atomic resolution, simulation techniques employing high precision are required. This thesis makes contributions to four specific areas of this field. First, the validity of a new method for simulating high resolution electron microscope images has been critically examined. Second, three different methods for computing scattering amplitudes in High Resolution Transmission Electron Microscopy (HRTEM) have been investigated as to their ability to include upper Laue layer (ULL) interaction. Third, a new method for computing scattering amplitudes in high resolution transmission electron microscopy has been examined. Fourth, the effect of a surface layer of amorphous silicon dioxide on images of crystalline silicon has been investigated for a range of crystal thicknesses varying from zero to 2 1/2 times that of the surface layer

  10. Electron Scattering in Solid Matter A Theoretical and Computational Treatise

    CERN Document Server

    Zabloudil, Jan; Szunyogh, Laszlo

    2005-01-01

    Addressing graduate students and researchers, this book gives a very detailed theoretical and computational description of multiple scattering in solid matter. Particular emphasis is placed on solids with reduced dimensions, on full potential approaches and on relativistic treatments. For the first time approaches such as the Screened Korringa-Kohn-Rostoker method that have emerged during the last 5 – 10 years are reviewed, considering all formal steps such as single-site scattering, structure constants and screening transformations, and also the numerical point of view. Furthermore, a very general approach is presented for solving the Poisson equation, needed within density functional theory in order to achieve self-consistency. Going beyond ordered matter and translationally invariant systems, special chapters are devoted to the Coherent Potential Approximation and to the Embedded Cluster Method, used, for example, for describing nanostructured matter in real space. In a final chapter, physical properties...

  11. Development of superconductor electronics technology for high-end computing

    Energy Technology Data Exchange (ETDEWEB)

    Silver, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kleinsasser, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kerber, G [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Herr, Q [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Dorojevets, M [Department of Electrical and Computer Engineering, SUNY-Stony Brook, NY 11794-2350 (United States); Bunyk, P [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Abelson, L [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm^-2, 1.25 μm junction process, incorporated new CAD tools into our methodology, demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s^-1, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  12. Development of superconductor electronics technology for high-end computing

    International Nuclear Information System (INIS)

    Silver, A; Kleinsasser, A; Kerber, G; Herr, Q; Dorojevets, M; Bunyk, P; Abelson, L

    2003-01-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm^-2, 1.25 μm junction process, incorporated new CAD tools into our methodology, demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s^-1, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density

  13. A computer code package for Monte Carlo photon-electron transport simulation Comparisons with experimental benchmarks

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    2000-01-01

    A computer code package (PTSIM) for particle transport Monte Carlo simulation was developed using object oriented techniques of design and programming. A flexible system for simulation of coupled photon-electron transport, facilitating development of efficient simulation applications, was obtained. For photons: Compton and photo-electric effects, pair production and Rayleigh interactions are simulated, while for electrons, a class II condensed history scheme was considered, in which catastrophic interactions (Moeller electron-electron interaction, bremsstrahlung, etc.) are treated in detail and all other interactions with reduced individual effect on electron history are grouped together using continuous slowing down approximation and energy straggling theories. Electron angular straggling is simulated using Moliere theory or a mixed model in which scatters at large angles are treated as distinct events. Comparisons with experimental benchmarks for electron transmission and bremsstrahlung emission energy and angular spectra, and for dose calculations are presented

  14. Computational modeling of Repeat1 region of INI1/hSNF5: An evolutionary link with ubiquitin

    Science.gov (United States)

    Bhutoria, Savita

    2016-01-01

    Abstract The structure of a protein can be very informative of its function. However, determining protein structures experimentally can often be very challenging. Computational methods have been used successfully in modeling structures with sufficient accuracy. Here we have used computational tools to predict the structure of an evolutionarily conserved and functionally significant domain of Integrase interactor (INI)1/hSNF5 protein. INI1 is a component of the chromatin remodeling SWI/SNF complex, a tumor suppressor and is involved in many protein‐protein interactions. It belongs to SNF5 family of proteins that contain two conserved repeat (Rpt) domains. Rpt1 domain of INI1 binds to HIV‐1 Integrase, and acts as a dominant negative mutant to inhibit viral replication. Rpt1 domain also interacts with oncogene c‐MYC and modulates its transcriptional activity. We carried out an ab initio modeling of a segment of INI1 protein containing the Rpt1 domain. The structural model suggested the presence of a compact and well defined ββαα topology as core structure in the Rpt1 domain of INI1. This topology in Rpt1 was similar to PFU domain of Phospholipase A2 Activating Protein, PLAA. Interestingly, PFU domain shares similarity with Ubiquitin and has ubiquitin binding activity. Because of the structural similarity between Rpt1 domain of INI1 and PFU domain of PLAA, we propose that Rpt1 domain of INI1 may participate in ubiquitin recognition or binding with ubiquitin or ubiquitin related proteins. This modeling study may shed light on the mode of interactions of Rpt1 domain of INI1 and is likely to facilitate future functional studies of INI1. PMID:27261671

  15. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm optimization.

  16. Computer Tomography from Micro-Electronics to Assembled Products

    Directory of Open Access Journals (Sweden)

    Keith Bryant

    2017-06-01

    Traditional CT in our industry has been limited to business-card-sized samples, due to the cone-beam X-ray systems used by electronics manufacturing companies. Inclined or partial CT provides a slightly different solution, showing layers or slices in 2D very well, but due to the partial nature of the scans it does not produce very accurate 3D reconstructions. This seminar will look at more sophisticated X-ray systems, including dual tube units, which can image at sub-micron level and have the ability to build an accurate and detailed 3D image of a tablet or smart phone without any stitching or joining of images. With high quality reconstruction software, these images can easily be manipulated to allow key features or failure sites to be easily seen. These systems are being used in failure analysis but also in NPI and in the design and development process, as CAD data can be overlaid and metrology is also possible with some systems.

  17. Evolutionary institutionalism.

    Science.gov (United States)

    Fürstenberg, Dr Kai

    Institutions are hard to define and hard to study. Long prominent in political science have been two theories: Rational Choice Institutionalism (RCI) and Historical Institutionalism (HI). Arising from the life sciences is now a third: Evolutionary Institutionalism (EI). Comparative strengths and weaknesses of these three theories warrant review, and the value-to-be-added by expanding the third beyond Darwinian evolutionary theory deserves consideration. Should evolutionary institutionalism expand to accommodate new understanding in ecology, such as might apply to the emergence of stability, and in genetics, such as might apply to political behavior? Core arguments are reviewed for each theory with more detailed exposition of the third, EI. Particular attention is paid to EI's gene-institution analogy; to variation, selection, and retention of institutional traits; to endogeneity and exogeneity; to agency and structure; and to ecosystem effects, institutional stability, and empirical limitations in behavioral genetics. RCI, HI, and EI are distinct but complementary. Institutional change, while amenable to rational-choice analysis and, retrospectively, to critical-juncture and path-dependency analysis, is also, and importantly, ecological. Stability, like change, is an emergent property of institutions, which tend to stabilize after change in a manner analogous to allopatric speciation. EI is more than metaphorically biological in that institutional behaviors are driven by human behaviors whose evolution long preceded the appearance of institutions themselves.

  18. Open Issues in Evolutionary Robotics.

    Science.gov (United States)

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  19. Quality-of-service sensitivity to bio-inspired/evolutionary computational methods for intrusion detection in wireless ad hoc multimedia sensor networks

    Science.gov (United States)

    Hortos, William S.

    2012-06-01

    In the author's previous work, a cross-layer protocol approach to wireless sensor network (WSN) intrusion detection and identification is created with multiple bio-inspired/evolutionary computational methods applied to the functions of the protocol layers, a single method to each layer, to improve the intrusion-detection performance of the protocol over that of one method applied to only a single layer's functions. The WSN cross-layer protocol design embeds GAs, anti-phase synchronization, ACO, and a trust model based on quantized data reputation at the physical, MAC, network, and application layer, respectively. The construct neglects to assess the net effect of the combined bio-inspired methods on the quality-of-service (QoS) performance for "normal" data streams, that is, streams without intrusions. Analytic expressions of throughput, delay, and jitter, coupled with simulation results for WSNs free of intrusion attacks, are the basis for sensitivity analyses of QoS metrics for normal traffic to the bio-inspired methods.

  20. Studies of electron collisions with polyatomic molecules using distributed-memory parallel computers

    International Nuclear Information System (INIS)

    Winstead, C.; Hipes, P.G.; Lima, M.A.P.; McKoy, V.

    1991-01-01

    Elastic electron scattering cross sections from 5–30 eV are reported for the molecules C₂H₄, C₂H₆, C₃H₈, Si₂H₆, and GeH₄, obtained using an implementation of the Schwinger multichannel method for distributed-memory parallel computer architectures. These results, obtained within the static-exchange approximation, are in generally good agreement with the available experimental data. These calculations demonstrate the potential of highly parallel computation in the study of collisions between low-energy electrons and polyatomic gases. The computational methodology discussed is also directly applicable to the calculation of elastic cross sections at higher levels of approximation (target polarization) and of electronic excitation cross sections

  1. Quantum computation in semiconductor quantum dots of electron-spin asymmetric anisotropic exchange

    International Nuclear Information System (INIS)

    Hao Xiang; Zhu Shiqun

    2007-01-01

    Universal quantum computation is obtained when there exists an asymmetric anisotropic exchange between electron spins in coupled semiconductor quantum dots. The asymmetric Heisenberg model can be transformed into the isotropic model through the control of two local unitary rotations for the realization of essential quantum gates. The rotations on each qubit are symmetrical and depend on the strength and orientation of the asymmetric exchange. The implementation of axially symmetric local magnetic fields can assist the construction of quantum logic gates in anisotropic coupled quantum dots. This proposal can efficiently use each physical electron spin as a logical qubit in universal quantum computation

  2. Gersch-Rodriguez-Smith computation of deep inelastic electron scattering on 4He

    International Nuclear Information System (INIS)

    Viviani, M.; Kievsky, A.; Rinat, A.S.

    2003-01-01

    We compute cross sections for inclusive scattering of high-energy electrons on ⁴He, based on the two lowest orders of the Gersch-Rodriguez-Smith series. The required one- and two-particle density matrices are obtained from nonrelativistic ⁴He wave functions using realistic models for the nucleon-nucleon and three-nucleon interaction. The computed results for E=3.6 GeV agree well with the NE3 SLAC-Virginia data

  3. The utilization of electronic computers for bone density measurements with iodine 125 profile scanner

    International Nuclear Information System (INIS)

    Reiners, C.

    1974-01-01

    The utilization of electronic computers in the determination of the mineral content in bone with the ¹²⁵I profile scanner offers many advantages. The computer considerably lessens the intensive work of routine evaluation. It enables the direct calculation of the attenuation coefficients. This means a greater accuracy and correctness of the results compared to the former 'graphical' method, as the approximations are eliminated and reference errors are avoided. (orig./LH) [de
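    As a rough illustration of the kind of routine evaluation such a computer takes over, the sketch below derives a linear attenuation coefficient directly from a transmission measurement using the Beer-Lambert law. It is a minimal example, not the original program; the intensities and path length are invented.

        import math

        def attenuation_coefficient(i0, i, path_length_cm):
            # Beer-Lambert: I = I0 * exp(-mu * d)  =>  mu = ln(I0 / I) / d
            return math.log(i0 / i) / path_length_cm

        # Hypothetical counts: 40% transmission through 1.5 cm of bone
        mu = attenuation_coefficient(10000.0, 4000.0, 1.5)
        print(f"mu = {mu:.3f} cm^-1")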

  4. Evaluating Electronic Customer Relationship Management Performance: Case Studies from Persian Automotive and Computer Industry

    OpenAIRE

    Safari, Narges; Safari, Fariba; Olesen, Karin; Shahmehr, Fatemeh

    2016-01-01

    This research paper investigates the influence of industry on electronic customer relationship management (e-CRM) performance. A case study approach with two cases was applied to evaluate the influence of e-CRM on customer behavioral and attitudinal loyalty along with the customer pyramid. The cases covered two industries: computer and automotive. For investigating customer behavioral loyalty and the customer pyramid, the companies' databases were computed, while for examining custome...

  5. Quantum computers based on electron spins controlled by ultrafast off-resonant single optical pulses.

    Science.gov (United States)

    Clark, Susan M; Fu, Kai-Mei C; Ladd, Thaddeus D; Yamamoto, Yoshihisa

    2007-07-27

    We describe a fast quantum computer based on optically controlled electron spins in charged quantum dots that are coupled to microcavities. This scheme uses broadband optical pulses to rotate electron spins and provide the clock signal to the system. Nonlocal two-qubit gates are performed by phase shifts induced by electron spins on laser pulses propagating along a shared waveguide. Numerical simulations of this scheme demonstrate high-fidelity single-qubit and two-qubit gates with operation times comparable to the inverse Zeeman frequency.

  6. Simulation of electronic structure Hamiltonians in a superconducting quantum computer architecture

    Energy Technology Data Exchange (ETDEWEB)

    Kaicher, Michael; Wilhelm, Frank K. [Theoretical Physics, Saarland University, 66123 Saarbruecken (Germany); Love, Peter J. [Department of Physics, Haverford College, Haverford, Pennsylvania 19041 (United States)

    2015-07-01

    Quantum chemistry has become one of the most promising applications within the field of quantum computation. Simulating the electronic structure Hamiltonian (ESH) in the Bravyi-Kitaev (BK) basis to compute the ground state energies of atoms/molecules reduces the number of qubit operations needed to simulate a single fermionic operation to O(log(n)), as compared to O(n) in the Jordan-Wigner transformation. In this work we will present the details of the BK transformation, show an example of implementation in a superconducting quantum computer architecture, and compare it to the most recent quantum chemistry algorithms, suggesting a constant overhead.
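    For readers unfamiliar with the O(n) baseline mentioned here, the following sketch spells out the Jordan-Wigner image of a single fermionic creation operator as weighted Pauli strings; the linearly growing Z prefix is exactly what the Bravyi-Kitaev basis shortens to O(log(n)). This is an illustrative fragment, not the implementation discussed in the talk.

        def jordan_wigner_creation(j, n_qubits):
            # a_j^dagger -> (Z_0 ... Z_{j-1}) (X_j - i Y_j)/2, written as
            # weighted Pauli strings such as (0.5, 'ZZXII')
            z_prefix = "Z" * j
            suffix = "I" * (n_qubits - j - 1)
            return [(0.5, z_prefix + "X" + suffix),
                    (-0.5j, z_prefix + "Y" + suffix)]

        for coeff, pauli in jordan_wigner_creation(2, 5):
            print(coeff, pauli)   # the Z chain grows linearly with j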

  7. The Need for Optical Means as an Alternative for Electronic Computing

    Science.gov (United States)

    Adbeldayem, Hossin; Frazier, Donald; Witherow, William; Paley, Steve; Penn, Benjamin; Bank, Curtis; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Demand for faster computers is growing rapidly to keep pace with the growth of the Internet, space communication, and the robotics industry. Unfortunately, Very Large Scale Integration technology is approaching its fundamental limits, beyond which devices become unreliable. Optical interconnections and optical integrated circuits are strongly believed to provide the way out of the extreme limitations imposed by conventional electronics on the growth in speed and complexity of present-day computing. This paper demonstrates two ultra-fast, all-optical logic gates and a high-density storage medium, which are essential components in building the future optical computer.

  8. Computation of electron cloud diagnostics and mitigation in the main injector

    International Nuclear Information System (INIS)

    Veitzer, S A; Cary, J R; Stoltz, P H; LeBrun, P; Spentzouris, P; Amundson, J F

    2009-01-01

    High-performance computations on Blue Gene/P at Argonne's Leadership Computing Facility have been used to determine phase shifts induced in injected RF diagnostics as a function of electron cloud density in the Main Injector. Inversion of the relationship between electron cloud parameters and induced phase shifts allows us to predict electron cloud density and evolution over many bunch periods. Long time-scale simulations using Blue Gene have allowed us to measure cloud evolution patterns under the influence of beam propagation with realistic physical parameterizations, such as elliptical beam pipe geometry, self-consistent electromagnetic fields, space charge, secondary electron emission, and the application of arbitrary external magnetic fields. Simultaneously, we are able to simulate the use of injected microwave diagnostic signals to measure electron cloud density, and the effectiveness of various mitigation techniques such as surface coating and the application of confining magnetic fields. These simulations provide a baseline for both RF electron cloud diagnostic design and accelerator fabrication in order to measure electron clouds and mitigate the adverse effects of such clouds on beam propagation.
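    The inversion from measured phase shift to cloud density alluded to above can be illustrated with the standard dilute-plasma dispersion relation: for a carrier frequency well above the plasma frequency, the accumulated shift is dphi = (omega_p^2 / (2 c omega)) * L. The sketch below inverts that textbook relation for the mean electron density; the numbers are invented for illustration, not Main Injector parameters.

        import math

        E = 1.602176634e-19; ME = 9.1093837015e-31
        EPS0 = 8.8541878128e-12; C = 2.99792458e8

        def cloud_density_from_phase_shift(dphi_rad, f_hz, length_m):
            # dphi = (omega_p^2 / (2 c omega)) L, with
            # omega_p^2 = n_e e^2 / (eps0 m_e); solve for n_e
            omega = 2.0 * math.pi * f_hz
            omega_p_sq = 2.0 * C * omega * dphi_rad / length_m
            return omega_p_sq * EPS0 * ME / E**2

        # Invented example: 0.05 rad shift on a 1.5 GHz carrier over 10 m
        print(cloud_density_from_phase_shift(0.05, 1.5e9, 10.0), "m^-3")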

  9. Computational methods for constructing protein structure models from 3D electron microscopy maps.

    Science.gov (United States)

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2013-10-01

    Protein structure determination by cryo-electron microscopy (EM) has made significant progress in the past decades. Resolutions of EM maps have been improving as evidenced by recently reported structures that are solved at high resolutions close to 3 Å. Computational methods play a key role in interpreting EM data. Among many computational procedures applied to an EM map to obtain protein structure information, in this article we focus on reviewing computational methods that model protein three-dimensional (3D) structures from a 3D EM density map that is constructed from two-dimensional (2D) maps. The computational methods we discuss range from de novo methods, which identify structural elements in an EM map, to structure fitting methods, where known high resolution structures are fit into a low-resolution EM map. A list of available computational tools is also provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Neuromorphic computing enabled by physics of electron spins: Prospects and perspectives

    Science.gov (United States)

    Sengupta, Abhronil; Roy, Kaushik

    2018-03-01

    “Spintronics” refers to the understanding of the physics of electron spin-related phenomena. While most of the significant advancements in this field have been driven primarily by memory, recent research has demonstrated that various facets of the underlying physics of spin transport and manipulation can directly mimic the functionalities of the computational primitives in neuromorphic computation, i.e., the neurons and synapses. Given the potential of these spintronic devices to implement bio-mimetic computations at very low terminal voltages, several spin-device structures have been proposed as the core building blocks of neuromorphic circuits and systems to implement brain-inspired computing. Such an approach is expected to play a key role in circumventing the problems of ever-increasing power dissipation and hardware requirements for implementing neuro-inspired algorithms in conventional digital CMOS technology. Perspectives on spin-enabled neuromorphic computing, its status, and challenges and future prospects are outlined in this review article.

  11. 78 FR 63492 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2013-10-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...

  12. Assessment of coronary artery bypass graft patency by multidetector computed tomography and electron-beam tomography

    NARCIS (Netherlands)

    Piers, LH; Dorgelo, J; Tio, RA; Jessurun, GAJ; Oudkerk, M; Zijlstra, F

    This case report describes the use of retrospectively ECG-gated 16-slice multidetector computed tomography (MDCT) and electron-beam tomography (EBT) for assessing bypass graft patency in two patients with recurrent angina after coronary artery bypass graft surgery. The results of each tomographic

  13. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The problems of the attainable polarization value of electron-ion rings in the regime of acceleration, and of the separation of their components at the final stage of acceleration, are studied. The results of computational simulation by use of the macroparticle method and experiments on the ring acceleration and separation are given. The comparison of calculation results with experiment is presented

  14. COMPUTATIONAL ELECTROCHEMISTRY: AQUEOUS ONE-ELECTRON OXIDATION POTENTIALS FOR SUBSTITUTED ANILINES

    Science.gov (United States)

    Semiempirical molecular orbital theory and density functional theory are used to compute one-electron oxidation potentials for aniline and a set of 21 mono- and di-substituted anilines in aqueous solution. Linear relationships between theoretical predictions and experiment are co...

  15. An approach to first principles electronic structure calculation by symbolic-numeric computation

    Directory of Open Access Journals (Sweden)

    Akihito Kikuchi

    2013-04-01

    There is a wide variety of electronic structure calculations cooperating with symbolic computation, where the main purpose of the latter is to play an auxiliary (but not unimportant) role for the former. In the field of quantum physics [1-9], researchers sometimes have to handle complicated mathematical expressions whose derivation seems almost beyond human power, and thus resort to the intensive use of computers, namely, symbolic computation [10-16]. Examples of this can be seen in various topics: atomic energy levels, molecular dynamics, molecular energy and spectra, collision and scattering, lattice spin models and so on [16]. How to obtain molecular integrals analytically, or how to manipulate complex formulas in many-body interactions, is one such problem. In the former case, when one uses a special atomic basis for a specific purpose, expressing the integrals as combinations of already known analytic functions may sometimes be very difficult. In the latter, one must rearrange a number of creation and annihilation operators in a suitable order and calculate the analytical expectation value. It is usual that a quantitative and massive computation follows a symbolic one; for the convenience of the numerical computation, it is necessary to reduce a complicated analytic expression into a tractable and computable form. This is the main motive for the introduction of symbolic computation as a forerunner of the numerical one, and their collaboration has won considerable successes. The present work should be classified as one such trial. Meanwhile, the use of symbolic computation in the present work is not limited to an indirect and auxiliary part of the numerical computation; it is also applicable to a direct and quantitative estimation of the electronic structure, skipping conventional computational methods.
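    The division of labour described here, a symbolic derivation whose result is then compiled into a numerical routine, can be shown in a few lines. The sketch below is a generic toy (a Gaussian integral via sympy), not the basis-set machinery of the paper.

        import sympy as sp

        # Symbolic stage: derive a closed form that would be tedious by hand
        x, a, b = sp.symbols("x a b", positive=True)
        expr = sp.integrate(sp.exp(-a * x**2) * sp.exp(-b * x**2),
                            (x, -sp.oo, sp.oo))
        print(expr)   # sqrt(pi)/sqrt(a + b)

        # Numeric stage: reduce the analytic result to a computable form
        f = sp.lambdify((a, b), expr, "math")
        print(f(0.5, 1.3))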

  16. Psychiatrists' Comfort Using Computers and Other Electronic Devices in Clinical Practice.

    Science.gov (United States)

    Duffy, Farifteh F; Fochtmann, Laura J; Clarke, Diana E; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K; Plovnick, Robert M

    2016-09-01

    This report highlights findings from the Study of Psychiatrists' Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists' comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted.

  17. Psychiatrists’ Comfort Using Computers and Other Electronic Devices in Clinical Practice

    Science.gov (United States)

    Fochtmann, Laura J.; Clarke, Diana E.; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K.; Plovnick, Robert M.

    2015-01-01

    This report highlights findings from the Study of Psychiatrists’ Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists’ comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted. PMID:26667248

  18. Exploring Tradeoffs in Demand-Side and Supply-Side Management of Urban Water Resources Using Agent-Based Modeling and Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Lufthansa Kanta

    2015-11-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger: (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir; and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.
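    To make the two decision types concrete, the sketch below scores one candidate trigger policy, a storage level that starts inter-basin pumping and a lower level that declares a drought stage, against a mock inflow record. All numbers and names are invented; a multi-objective evolutionary algorithm such as the one used in the study would evaluate many such candidates and keep the non-dominated ones.

        def simulate_policy(inflows, demand, capacity, transfer_trigger,
                            stage_trigger, transfer_volume, restriction_factor):
            # One candidate solution: storage triggers for (1) transfers and
            # (2) a drought stage; returns cost proxies an EA would trade off
            storage, pumped, restricted_months = capacity, 0.0, 0
            for inflow in inflows:
                use = demand
                if storage < stage_trigger:        # drought stage: restrict use
                    use *= restriction_factor
                    restricted_months += 1
                if storage < transfer_trigger:     # pump from external reservoir
                    storage += transfer_volume
                    pumped += transfer_volume
                storage = max(0.0, min(capacity, storage + inflow - use))
            return pumped, restricted_months       # cost vs. inconvenience

        # A decade of illustrative monthly inflows (units arbitrary)
        inflows = [8, 6, 4, 3, 2, 2, 3, 5, 7, 9, 6, 4] * 10
        print(simulate_policy(inflows, demand=5, capacity=100,
                              transfer_trigger=40, stage_trigger=25,
                              transfer_volume=6, restriction_factor=0.8))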

  19. Electron correlation in molecules: concurrent computation Many-Body Perturbation Theory (ccMBPT) calculations using macrotasking on the NEC SX-3/44 computer

    International Nuclear Information System (INIS)

    Moncrieff, D.; Wilson, S.

    1992-06-01

    The ab initio determination of the electronic structure of molecules is a many-fermion problem involving the approximate description of the motion of the electrons in the field of fixed nuclei. It is an area of research which demands considerable computational resources but has enormous potential in fields as diverse as interstellar chemistry and drug design, catalysis and solid state chemistry, molecular biology and environmental chemistry. Electronic structure calculations almost invariably divide into two main stages: the approximate solution of an independent electron model, in which each electron moves in the average field created by the other electrons in the system, and then the more computationally demanding determination of a series of corrections to this model, the electron correlation effects. The many-body perturbation theory expansion affords a systematic description of correlation effects, which leads directly to algorithms suitable for concurrent computation. We term this concurrent computation Many-Body Perturbation Theory (ccMBPT). The use of a dynamic load balancing technique on the NEC SX-3/44 computer in electron correlation calculations is investigated for the calculation of the most demanding energy component in the most accurate of contemporary ab initio studies. An application to the ground state of the nitrogen molecule is described. We also briefly discuss the extent to which the calculation of the dominant corrections to such studies can be rendered computationally tractable by exploiting both the vector processing and parallel processor capabilities of the NEC SX-3/44 computer. (author)
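    The structure of such a concurrent correlation calculation, independent energy contributions farmed out to processors and summed, can be sketched in a few lines. The example below distributes MP2-like pair energies over worker processes using mock integrals and orbital energies; it is a schematic stand-in for the macrotasked SX-3/44 code, not a reproduction of it.

        from multiprocessing import Pool
        import numpy as np

        rng = np.random.default_rng(0)
        n_occ, n_virt = 8, 20
        eps_occ = -rng.random(n_occ) - 1.0              # mock orbital energies
        eps_virt = rng.random(n_virt) + 0.5
        v = rng.random((n_occ, n_occ, n_virt, n_virt)) * 0.01  # mock integrals

        def pair_energy(pair):
            # Second-order contribution of one occupied pair (i, j); pairs
            # are the natural unit for distributing work across processors
            i, j = pair
            denom = (eps_occ[i] + eps_occ[j]
                     - eps_virt[:, None] - eps_virt[None, :])
            return np.sum(v[i, j] ** 2 / denom)

        if __name__ == "__main__":
            pairs = [(i, j) for i in range(n_occ) for j in range(n_occ)]
            with Pool(4) as pool:        # chunked map as crude load balancing
                e2 = sum(pool.map(pair_energy, pairs, chunksize=4))
            print("second-order energy (mock):", e2)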

  20. Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.

    Science.gov (United States)

    Ahmadi, Maryam; Aslani, Nasim

    2018-01-01

    With regard to the high cost of the Electronic Health Record (EHR), the use of new technologies, in particular cloud computing, has increased in recent years. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combined keywords. From the 431 articles initially retrieved, 27 articles were selected for review after applying the inclusion and exclusion criteria. Data were gathered with a self-made checklist and analyzed by the content analysis method. The findings of this study showed that cloud computing is a very widespread technology. It covers domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability, and it can be effective for the electronic health record. According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHRs in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHRs, it is recommended to use this technology.

  1. Spore: Spawning Evolutionary Misconceptions?

    Science.gov (United States)

    Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.

    2010-10-01

    The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.

  2. Molecular Computational Investigation of Electron Transfer Kinetics across Cytochrome-Iron Oxide Interfaces

    International Nuclear Information System (INIS)

    Kerisit, Sebastien N.; Rosso, Kevin M.; Dupuis, Michel; Valiev, Marat

    2007-01-01

    The interface between electron transfer proteins such as cytochromes and solid phase mineral oxides is central to the activity of dissimilatory metal-reducing bacteria. A combination of potential-based molecular dynamics simulations and ab initio electronic structure calculations is used in the framework of Marcus' electron transfer theory to compute elementary electron transfer rates from a well-defined cytochrome model, namely the small tetraheme cytochrome (STC) from Shewanella oneidensis, to surfaces of the iron oxide mineral hematite (α-Fe₂O₃). Room temperature molecular dynamics simulations show that an isolated STC molecule favors surface attachment via direct contact of hemes I and IV at the poles of the elongated axis, with electron transfer distances as small as 9 Å. The cytochrome remains attached to the mineral surface in the presence of water and shows limited surface diffusion at the interface. Ab initio electronic coupling matrix element (VAB) calculations of configurations excised from the molecular dynamics simulations reveal VAB values ranging from 1 to 20 cm⁻¹, consistent with nonadiabaticity. Using these results, together with experimental data on the redox potential of hematite and hemes in relevant cytochromes and calculations of the reorganization energy from cluster models, we estimate the rate of electron transfer across this model interface to range from 1 to 1000 s⁻¹ for the most exothermic driving force considered in this work, and from 0.01 to 20 s⁻¹ for the most endothermic. This fairly large range of electron transfer rates highlights the sensitivity of the rate upon the electronic coupling matrix element, which is in turn dependent on the fluctuations of the heme configuration at the interface. We characterize this dependence using an idealized bis-imidazole heme to compute from first principles the VAB variation due to porphyrin ring orientation, electron transfer distance, and mineral surface termination. The electronic
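    The rate estimates quoted above follow from the standard nonadiabatic Marcus expression, which is straightforward to evaluate once the coupling VAB, the reorganization energy, and the driving force are in hand. The sketch below implements that textbook formula with invented inputs; it illustrates the sensitivity to VAB noted in the abstract, not the paper's actual parameter sets.

        import math

        KB = 8.617333262e-5      # Boltzmann constant, eV/K
        HBAR = 6.582119569e-16   # reduced Planck constant, eV*s

        def marcus_rate(v_ab_ev, lam_ev, dg_ev, t=300.0):
            # k = (2 pi / hbar) |V_AB|^2 (4 pi lambda kB T)^(-1/2)
            #     * exp(-(dG + lambda)^2 / (4 lambda kB T))
            kbt = KB * t
            prefactor = (2.0 * math.pi / HBAR) * v_ab_ev**2
            fc = (math.exp(-(dg_ev + lam_ev) ** 2 / (4.0 * lam_ev * kbt))
                  / math.sqrt(4.0 * math.pi * lam_ev * kbt))
            return prefactor * fc

        # Illustrative values: V_AB of a few cm^-1 (1 cm^-1 ~ 1.24e-4 eV)
        # and a reorganization energy near 1 eV
        v_ab = 10 * 1.2398e-4
        print(f"k = {marcus_rate(v_ab, 1.0, -0.1):.3e} s^-1")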

  3. Reconciliation of the cloud computing model with US federal electronic health record regulations.

    Science.gov (United States)

    Schweitzer, Eugene J

    2012-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.

  4. Reconciliation of the cloud computing model with US federal electronic health record regulations

    Science.gov (United States)

    2011-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204

  5. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps.

    Science.gov (United States)

    Plasser, Felix; Ruckenbauer, Matthias; Mai, Sebastian; Oppel, Markus; Marquetand, Philipp; González, Leticia

    2016-03-08

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented.
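    The elementary kernel inside any such overlap algorithm is the overlap of two single Slater determinants, which reduces to a determinant of the occupied-orbital overlap block. The sketch below shows just that kernel on a mock system; the recurring-intermediate bookkeeping that gives the published algorithm its efficiency is omitted.

        import numpy as np

        def determinant_overlap(c_a, c_b, s_ao):
            # <Psi_A|Psi_B> for two single Slater determinants with occupied
            # MO coefficient matrices c_a, c_b (n_ao x n_occ) and AO overlap
            # matrix s_ao: det of the occupied-occupied MO overlap block
            s_mo = c_a.T @ s_ao @ c_b
            return np.linalg.det(s_mo)

        # Tiny mock system: 4 AOs, 2 occupied MOs per determinant
        rng = np.random.default_rng(1)
        s_ao = np.eye(4)             # orthonormal AO basis for simplicity
        c_a = np.linalg.qr(rng.normal(size=(4, 2)))[0]
        c_b = np.linalg.qr(rng.normal(size=(4, 2)))[0]
        print(determinant_overlap(c_a, c_b, s_ao))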

  6. Current-voltage curves for molecular junctions computed using all-electron basis sets

    International Nuclear Information System (INIS)

    Bauschlicher, Charles W.; Lawson, John W.

    2006-01-01

    We present current-voltage (I-V) curves computed using all-electron basis sets on the conducting molecule. The all-electron results are very similar to previous results obtained using effective core potentials (ECP). A hybrid integration scheme is used that keeps the all-electron calculations cost competitive with respect to the ECP calculations. By neglecting the coupling of states to the contacts below a fixed energy cutoff, the density matrix for the core electrons can be evaluated analytically. The full density matrix is formed by adding this core contribution to the valence part that is evaluated numerically. Expanding the definition of the core in the all-electron calculations significantly reduces the computational effort and, up to biases of about 2 V, the results are very similar to those obtained using more rigorous approaches. The convergence of the I-V curves and transmission coefficients with respect to basis set is discussed. The addition of diffuse functions is critical in approaching basis set completeness
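    For orientation, a standard way to turn a computed transmission function into one point of an I-V curve is the Landauer integral I = (2e/h) * int T(E) [f_L(E) - f_R(E)] dE. The sketch below evaluates it for a mock Lorentzian transmission; it illustrates this last step generically and is not this paper's specific procedure.

        import numpy as np

        def landauer_current(transmission, energies, mu_l, mu_r, t_k=300.0):
            # transmission is T(E) sampled on energies (eV); returns amperes
            kbt = 8.617333262e-5 * t_k                        # eV
            f_l = 1.0 / (1.0 + np.exp((energies - mu_l) / kbt))
            f_r = 1.0 / (1.0 + np.exp((energies - mu_r) / kbt))
            y = transmission * (f_l - f_r)
            integral = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(energies))
            two_e_over_h = 7.748091729e-5                     # 2e/h in A/eV
            return two_e_over_h * integral

        energies = np.linspace(-2.0, 2.0, 2001)
        t_of_e = 0.1 / (1.0 + energies**2)    # mock Lorentzian transmission
        bias = 0.5                            # volts, applied symmetrically
        print(landauer_current(t_of_e, energies, +bias / 2, -bias / 2), "A")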

  7. Radiation defects in Te-implanted germanium. Electron microscopy and computer simulation studies

    International Nuclear Information System (INIS)

    Kalitzova, M.G.; Karpuzov, D.S.; Pashov, N.K.

    1985-01-01

    Direct observation of radiation damage induced by heavy ion implantation in crystalline germanium by means of high-resolution electron microscopy is reported. The dark-field lattice imaging mode is used, under conditions suitable for object-like imaging. Conventional TEM is used for estimating the efficiency of creating visibly damaged regions. Heavy ion damage clusters with three types of inner structure are observed: with near-perfect crystalline cores, and with metastable and stable amorphous cores. The MARLOWE computer code is used to simulate the atomic collision cascades and to obtain the lateral spread distributions of point defects created. A comparison of high-resolution electron microscopy (HREM) with computer simulation results shows encouraging agreement for the average cluster dimensions and for the lateral spread of vacancies and interstitials. (author)

  8. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain premature phenomena of the quantum evolutionary algorithm. The proposed algorithm adopts the chaotic initialization method to generate an initial population which will form a pe...... tests. The presented algorithm is applied to urban traffic signal timing optimization and the results are satisfactory.
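    As an illustration of the chaotic initialization step mentioned here, the sketch below uses the logistic map at r = 4 to seed Q-bit amplitude pairs (alpha, beta) with alpha^2 + beta^2 = 1. It follows the common pattern of chaotic quantum-inspired EAs in the literature; the paper's exact update and observation operators are not reproduced.

        import math
        import random

        def chaotic_population(pop_size, n_genes, x0=0.345):
            # Logistic map x <- 4x(1-x) spreads initial Q-bit amplitudes
            # over [0, 1] more evenly than plain uniform sampling tends to
            x = x0
            population = []
            for _ in range(pop_size):
                individual = []
                for _ in range(n_genes):
                    x = 4.0 * x * (1.0 - x)        # logistic map, r = 4
                    alpha = math.sqrt(x)           # alpha^2 + beta^2 = 1
                    beta = math.sqrt(1.0 - x)
                    individual.append((alpha, beta))
                population.append(individual)
            return population

        pop = chaotic_population(pop_size=10, n_genes=8)
        # Observing a classical bit from gene 0 of individual 0:
        alpha, beta = pop[0][0]
        print(1 if random.random() < beta**2 else 0)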

  9. Plant Layout Analysis by Computer Simulation for Electronic Manufacturing Service Plant

    OpenAIRE

    Visuwan D.; Phruksaphanrat B

    2014-01-01

    In this research, computer simulation is used for Electronic Manufacturing Service (EMS) plant layout analysis. The current layout of this manufacturing plant is a process layout, which is not suitable given the high-volume, high-variety environment of an EMS. Moreover, quick response and high flexibility are also needed. Then, a cellular manufacturing layout design was determined for the selected group of products. Systematic layout planning (SLP) was used to analyze and de...

  10. Full surface examination of small spheres with a computer controlled scanning electron microscope

    International Nuclear Information System (INIS)

    Ward, C.M.; Willenborg, D.L.; Montgomery, K.L.

    1979-01-01

    This report discusses a computer automated stage and Scanning Electron Microscopy (SEM) system for detecting defects in glass spheres for inertial confinement laser fusion experiments. This system detects submicron defects and permits inclusion of acceptable spheres in targets after examination. The stage used to examine and manipulate the spheres through 4π steradians is described. Primary image recording is made on a raster scanning video disc. The need for SEM stability and methods of achieving it are discussed

  11. A computer-controlled electronic system for the ultrasonic NDT of components for nuclear power stations

    International Nuclear Information System (INIS)

    Rehrmann, M.; Harbecke, D.

    1987-01-01

    The paper describes an automatic ultrasonic testing system combined with a computer-controlled electronics system, called IMPULS I, for the non-destructive testing of components of nuclear reactors. The system can be used for both in-service inspection and for inspection during the manufacturing process. IMPULS I has more functions and less components than conventional ultrasonic systems, and the system gives good reproducible test results and is easy to operate. (U.K.)

  12. Maximal thickness of the normal human pericardium assessed by electron-beam computed tomography

    International Nuclear Information System (INIS)

    Delille, J.P.; Hernigou, A.; Sene, V.; Chatellier, G.; Boudeville, J.C.; Challande, P.; Plainfosse, M.C.

    1999-01-01

    The purpose of this study was to determine the maximal value of normal pericardial thickness with an electron-beam computed tomography unit allowing fast scan times of 100 ms to reduce cardiac motion artifacts. Electron-beam computed tomography was performed in 260 patients with hypercholesterolemia and/or hypertension, as these pathologies have no effect on pericardial thickness. The pixel size was 0.5 mm. Measurements could be performed in front of the right ventricle, the right atrioventricular groove, the right atrium, the left ventricle, and the interventricular groove. Maximal thickness of normal pericardium was defined at the 95th percentile. Inter-observer and intra-observer reproducibility studies were assessed from additional CT scans by the Bland and Altman method [24]. The maximal thickness of the normal pericardium was 2 mm for 95 % of cases. For the reproducibility studies, there was no significant relationship between the inter-observer and intra-observer measurements, but all pericardial thickness measurements were ≤ 1.6 mm. Using electron-beam computed tomography, which assists in decreasing substantially cardiac motion artifacts, the threshold of detection of thickened pericardium is statistically established as being 2 mm for 95 % of the patients with hypercholesterolemia and/or hypertension. However, the spatial resolution available prevents a reproducible measure of the real thickness of thin pericardium. (orig.)
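    The Bland and Altman reproducibility analysis cited here amounts to computing the mean difference between paired readings and its 95 % limits of agreement. A minimal sketch, with invented thickness readings rather than the study's data:

        import numpy as np

        def bland_altman(measure_1, measure_2):
            # Mean bias and 95% limits of agreement (bias +/- 1.96 sd)
            m1, m2 = np.asarray(measure_1), np.asarray(measure_2)
            diffs = m1 - m2
            bias = diffs.mean()
            sd = diffs.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # Mock repeated pericardial thickness readings (mm), two observers
        obs1 = [1.2, 1.4, 1.1, 1.6, 1.3, 1.5]
        obs2 = [1.3, 1.4, 1.0, 1.5, 1.4, 1.6]
        bias, lo, hi = bland_altman(obs1, obs2)
        print(f"bias {bias:.2f} mm, limits of agreement [{lo:.2f}, {hi:.2f}] mm")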

  13. Maximal thickness of the normal human pericardium assessed by electron-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Delille, J.P.; Hernigou, A.; Sene, V.; Chatellier, G.; Boudeville, J.C.; Challande, P.; Plainfosse, M.C. [Service de Radiologie Centrale, Hopital Broussais, Paris (France)

    1999-08-01

    The purpose of this study was to determine the maximal value of normal pericardial thickness with an electron-beam computed tomography unit allowing fast scan times of 100 ms to reduce cardiac motion artifacts. Electron-beam computed tomography was performed in 260 patients with hypercholesterolemia and/or hypertension, as these pathologies have no effect on pericardial thickness. The pixel size was 0.5 mm. Measurements could be performed in front of the right ventricle, the right atrioventricular groove, the right atrium, the left ventricle, and the interventricular groove. Maximal thickness of normal pericardium was defined at the 95th percentile. Inter-observer and intra-observer reproducibility studies were assessed from additional CT scans by the Bland and Altman method [24]. The maximal thickness of the normal pericardium was 2 mm for 95 % of cases. For the reproducibility studies, there was no significant relationship between the inter-observer and intra-observer measurements, but all pericardial thickness measurements were ≤ 1.6 mm. Using electron-beam computed tomography, which assists in decreasing substantially cardiac motion artifacts, the threshold of detection of thickened pericardium is statistically established as being 2 mm for 95 % of the patients with hypercholesterolemia and/or hypertension. However, the spatial resolution available prevents a reproducible measure of the real thickness of thin pericardium. (orig.) With 6 figs., 1 tab., 31 refs.

  14. Rational design of metal-organic electronic devices: A computational perspective

    Science.gov (United States)

    Chilukuri, Bhaskar

    Organic and organometallic electronic materials continue to attract considerable attention among researchers due to their cost effectiveness, high flexibility, low temperature processing conditions and the continuous emergence of new semiconducting materials with tailored electronic properties. In addition, organic semiconductors can be used in a variety of important technological devices such as solar cells, field-effect transistors (FETs), flash memory, radio frequency identification (RFID) tags, light emitting diodes (LEDs), etc. However, organic materials have thus far not achieved the reliability and carrier mobility obtainable with inorganic silicon-based devices. Hence, there is a need for finding alternative electronic materials other than organic semiconductors to overcome the problems of inferior stability and performance. In this dissertation, I research the development of new transition metal based electronic materials which, due to the presence of metal-metal, metal-π, and π-π interactions, may give rise to superior electronic and chemical properties versus their organic counterparts. Specifically, I performed computational modeling studies on platinum based charge transfer complexes and d¹⁰ cyclo-[M(μ-L)]₃ trimers (M = Ag, Au and L = monoanionic bidentate bridging (C/N~C/N) ligand). The research done is aimed to guide experimental chemists to make rational choices of metals, ligands, and substituents in synthesizing novel organometallic electronic materials. Furthermore, the calculations presented here propose novel ways to tune the geometric, electronic, spectroscopic, and conduction properties of semiconducting materials. In addition to novel material development, electronic device performance can be improved by making a judicious choice of device components. I have studied the interfaces of a p-type metal-organic semiconductor, viz. the cyclo-[Au(μ-Pz)]₃ trimer, with metal electrodes at atomic and surface levels. This work was aimed to guide the device

  15. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  16. Design Principles for the Atomic and Electronic Structure of Halide Perovskite Photovoltaic Materials: Insights from Computation.

    Science.gov (United States)

    Berger, Robert F

    2018-02-09

    In the current decade, perovskite solar cell research has emerged as a remarkably active, promising, and rapidly developing field. Alongside breakthroughs in synthesis and device engineering, halide perovskite photovoltaic materials have been the subject of predictive and explanatory computational work. In this Minireview, we focus on a subset of this computation: density functional theory (DFT)-based work highlighting the ways in which the electronic structure and band gap of this class of materials can be tuned via changes in atomic structure. We distill this body of computational literature into a set of underlying design principles for the band gap engineering of these materials, and rationalize these principles from the viewpoint of band-edge orbital character. We hope that this perspective provides guidance and insight toward the rational design and continued improvement of perovskite photovoltaics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Some algorithms for the solution of the symmetric eigenvalue problem on a multiprocessor electronic computer

    International Nuclear Information System (INIS)

    Molchanov, I.N.; Khimich, A.N.

    1984-01-01

    This article shows how a reflection method can be used to find the eigenvalues of a matrix by transforming the matrix to tridiagonal form. The method of conjugate gradients is used to find the smallest eigenvalue and the corresponding eigenvector of symmetric positive-definite band matrices. Topics considered include the computational scheme of the reflection method, the organization of parallel calculations by the reflection method, the computational scheme of the conjugate gradient method, the organization of parallel calculations by the conjugate gradient method, and the effectiveness of parallel algorithms. It is concluded that the overall effectiveness of multiprocessor electronic computers can be increased either by letting the newly available processors operate on a new problem in multiprocessor mode, or by improving the coefficient of uniform partitioning of the original information
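    The reflection (Householder) reduction to tridiagonal form at the heart of the first algorithm can be written compactly; in the serial sketch below, the block updates in each step are the natural units a multiprocessor version would distribute. The matrix is an invented example.

        import numpy as np

        def householder_tridiagonalize(a):
            # Reduce a symmetric matrix to tridiagonal form with a sequence
            # of Householder reflections H = I - 2 v v^T
            a = a.copy().astype(float)
            n = a.shape[0]
            for k in range(n - 2):
                x = a[k + 1:, k]
                v = x.copy()
                v[0] += np.copysign(np.linalg.norm(x), x[0])
                norm_v = np.linalg.norm(v)
                if norm_v == 0.0:
                    continue
                v /= norm_v
                h = np.eye(n)
                h[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)
                a = h @ a @ h          # similarity transform: same eigenvalues
            return a

        sym = np.array([[4., 1., 2.], [1., 3., 0.], [2., 0., 1.]])
        print(np.round(householder_tridiagonalize(sym), 6))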

  18. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Introduction: A Small Sample of Ideas; The Philosophy of Natural Computing; The Three Branches: A Brief Overview; When to Use Natural Computing Approaches; Conceptualization; General Concepts. Part I - Computing Inspired by Nature: Evolutionary Computing; Problem Solving as a Search Task; Hill Climbing and Simulated Annealing; Evolutionary Biology; Evolutionary Computing; The Other Main Evolutionary Algorithms; From Evolutionary Biology to Computing; Scope of Evolutionary Computing; Neurocomputing; The Nervous System; Artif

  19. Computational Benchmarking for Ultrafast Electron Dynamics: Wave Function Methods vs Density Functional Theory.

    Science.gov (United States)

    Oliveira, Micael J T; Mignolet, Benoit; Kus, Tomasz; Papadopoulos, Theodoros A; Remacle, F; Verstraete, Matthieu J

    2015-05-12

    Attosecond electron dynamics in small- and medium-sized molecules, induced by an ultrashort strong optical pulse, is studied computationally for a frozen nuclear geometry. The importance of exchange and correlation effects on the nonequilibrium electron dynamics induced by the interaction of the molecule with the strong optical pulse is analyzed by comparing the solution of the time-dependent Schrödinger equation based on the correlated field-free stationary electronic states computed with the equation-of-motion coupled cluster singles and doubles and the complete active space multi-configurational self-consistent field methodologies on one hand, and various functionals in real-time time-dependent density functional theory (TDDFT) on the other. We aim to evaluate the performance of the latter approach, which is very widely used for nonlinear absorption processes and whose computational cost has a more favorable scaling with the system size. We focus on LiH as a toy model for a nontrivial molecule and show that our conclusions carry over to larger molecules, exemplified by ABCU (C10H19N). The molecules are probed with IR and UV pulses whose intensities are not strong enough to significantly ionize the system. By comparing the evolution of the time-dependent field-free electronic dipole moment, as well as its Fourier power spectrum, we show that TDDFT performs qualitatively well in most cases. Contrary to previous studies, we find almost no changes in the TDDFT excitation energies when excited states are populated. Transitions between states of different symmetries are induced using pulses polarized in different directions. We observe that the performance of TDDFT does not depend on the symmetry of the states involved in the transition.
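    The comparison metric used here, the Fourier power spectrum of the field-free dipole moment, is straightforward to compute once a method has produced a dipole trace. A minimal sketch on a synthetic signal (a single 1.5 eV line standing in for a real molecule's response):

        import numpy as np

        def dipole_power_spectrum(dipole, dt_fs):
            # Peak positions in the power spectrum approximate the
            # excitation energies probed by the pulse
            signal = dipole - dipole.mean()
            window = np.hanning(len(signal))     # damp truncation artifacts
            power = np.abs(np.fft.rfft(signal * window)) ** 2
            freq_per_fs = np.fft.rfftfreq(len(signal), d=dt_fs)
            energy_ev = freq_per_fs * 4.135667696   # h in eV*fs
            return energy_ev, power

        t = np.arange(0, 500, 0.01)              # time grid in fs
        mock = 0.1 * np.sin(2 * np.pi * t * 1.5 / 4.135667696)
        e, p = dipole_power_spectrum(mock, 0.01)
        print(e[np.argmax(p)])                   # ~1.5 eV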

  20. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions

  1. Automated processing of dynamic properties of intraventricular pressure by computer program and electronic circuit.

    Science.gov (United States)

    Adler, D; Mahler, Y

    1980-04-01

    A procedure for automatic detection and digital processing of the maximum first derivative of the intraventricular pressure (dp/dt_max), the time to dp/dt_max (t-dp/dt), and beat-to-beat intervals has been developed. The procedure integrates simple electronic circuits with a short program using a simple algorithm for the detection of the points of interest. The tasks of differentiating the pressure signal and detecting the onset of contraction were done by electronics, while the tasks of finding the values of dp/dt_max, t-dp/dt, and beat-to-beat intervals, and all computations needed, were done by software. Software/hardware 'trade-off' considerations and the accuracy and reliability of the system are discussed.
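    In the paper the differentiation and onset detection are done in hardware; a pure-software sketch of the quantities computed (dp/dt_max, time to dp/dt_max, and beat-to-beat interval) might look as follows, assuming the contraction-onset indices are already available.

```python
import numpy as np

def beat_metrics(pressure, fs, onsets):
    """Per-beat dp/dt_max, time to dp/dt_max, and beat-to-beat interval.
    pressure: sampled intraventricular pressure (mmHg); fs: sampling rate (Hz);
    onsets: sample indices of contraction onset (detected in hardware in the
    paper; assumed given here)."""
    dpdt = np.gradient(pressure) * fs              # numerical dp/dt, mmHg/s
    out = []
    for b0, b1 in zip(onsets[:-1], onsets[1:]):
        seg = dpdt[b0:b1]
        i = int(np.argmax(seg))                    # location of dp/dt_max
        out.append({"dpdt_max": float(seg[i]),             # mmHg/s
                    "t_dpdt_ms": 1000.0 * i / fs,          # time from onset
                    "interval_ms": 1000.0 * (b1 - b0) / fs})
    return out
```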

  2. Gradient ascent pulse engineering approach to CNOT gates in donor electron spin quantum computing

    International Nuclear Information System (INIS)

    Tsai, D.-B.; Goan, H.-S.

    2008-01-01

    In this paper, we demonstrate how gradient ascent pulse engineering (GRAPE) optimal control methods can be implemented on donor electron spin qubits in semiconductors with an architecture complementary to the original Kane's proposal. We focus on the high-fidelity controlled-NOT (CNOT) gate and explicitly find the digitized control sequences for a controlled-NOT gate by optimizing its fidelity using the effective, reduced donor electron spin Hamiltonian with external controls over the hyperfine A and exchange J interactions. We then simulate the CNOT-gate sequence with the full spin Hamiltonian and find that it has an error of 10^-6, below the error threshold of 10^-4 required for fault-tolerant quantum computation. Moreover, the CNOT gate operation time of 100 ns is about 3 times faster than the 297 ns of the proposed global control scheme.
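    The paper's GRAPE optimization uses analytic gradients on the donor-spin Hamiltonian; as a much-reduced illustration of the same idea, the toy sketch below gradient-ascends the fidelity of a piecewise-constant single-qubit control sequence toward a NOT gate, using finite-difference gradients (all parameters here are arbitrary choices, not the paper's).

```python
import numpy as np
from scipy.linalg import expm

sx = np.array([[0., 1.], [1., 0.]], dtype=complex)
target = sx                       # NOT gate (X), up to global phase
N, dt = 10, 0.1                   # piecewise-constant control segments

def U(u):
    """Time-ordered product of the segment propagators exp(-i u_k sx dt)."""
    P = np.eye(2, dtype=complex)
    for uk in u:
        P = expm(-1j * uk * sx * dt) @ P
    return P

def fidelity(u):
    return abs(np.trace(target.conj().T @ U(u))) / 2.0

u = np.full(N, 0.3)               # initial control amplitudes
eps, lr = 1e-6, 2.0
for _ in range(300):              # gradient ascent on the gate fidelity
    f0 = fidelity(u)
    grad = np.array([(fidelity(u + eps * np.eye(N)[k]) - f0) / eps
                     for k in range(N)])
    u += lr * grad
print(f"final fidelity: {fidelity(u):.6f}")   # approaches 1
```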

  3. A computational study of the electronic properties of one-dimensional armchair phosphorene nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Sheng; Zhu, Hao; Eshun, Kwesi; Arab, Abbas; Badwan, Ahmad; Li, Qiliang [Department of Electrical and Computer Engineering, George Mason University, Fairfax, Virginia 22033 (United States)

    2015-10-28

    We have performed a comprehensive first-principles computational study of the electronic properties of one-dimensional phosphorene nanotubes (PNTs) and the strain effect on the mechanical and electrical properties of PNTs, including the elastic modulus, energy bandstructure, and carrier effective mass. The study has demonstrated that the armchair PNTs have semiconducting properties along the axial direction and that the carrier mobility can be significantly improved by compressive strain. The hole mobility increases from 40.7 cm²/(V·s) to 197.0 cm²/(V·s) as the compressive strain increases to −5% at room temperature. The investigation of the size effect on armchair PNTs indicated that the conductance increases significantly with increasing diameter. Overall, this study indicated that PNTs have very attractive electronic properties for future application in nanomaterials and devices.

  4. Reproducibility of coronary calcification detection with electron-beam computed tomography

    International Nuclear Information System (INIS)

    Hernigou, A.; Challande, P.; Boudeville, J.C.; Sene, V.; Grataloup, C.; Plainfosse, M.

    1996-01-01

    Since coronary calcification scores obtained with electron-beam computed tomography (EBT) have been shown to correlate with coronary atherosclerosis, the reproducibility of the technique had to be assessed before it could be used for patient follow-up. A total of 150 patients, selected as a result of a cholesterol screening programme, were studied by EBT. Twelve contiguous 3-mm-thick transverse slices beginning at the proximal coronary arteries were obtained through the base of the heart. The amount of calcium was evaluated as the calcified area weighted by a coefficient depending on the density peak level. The value was expressed on a logarithmic scale. Intra-observer, inter-observer and inter-examination reproducibilities were calculated. They were 1.9, 1.3 and 7.2%, respectively. These results were good enough to allow the use of EBT for longitudinal studies. The influence of acquisition and calculation conditions on score computation was also analysed. (orig.)

  5. Identifying and ranking key factors influencing the adoption of cloud computing in E-Health

    Directory of Open Access Journals (Sweden)

    Javad Shukuhy

    2015-02-01

    Full Text Available Cloud computing, as a new technology built on Internet infrastructure and new approaches, can bring significant benefits to the electronic provision of medical services. Applying this technology in E-Health requires consideration of various factors. The main objective of this study is to identify and rank the factors influencing the adoption of e-health cloud. Based on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology fit (HOT-fit) model, 16 sub-factors were identified under four major factors. Through a survey of 60 experts, academics, and specialists in health information technology, and with the help of the fuzzy analytic hierarchy process, these factors and sub-factors were ranked. Given the newness of this topic, no previous domestic or international study in the literature has considered this number of criteria. The results show that, when deciding to adopt cloud computing in E-Health, technological, human, organizational, and environmental factors must be considered, in that order of importance.

  6. An electron beam linear scanning mode for industrial limited-angle nano-computed tomography

    Science.gov (United States)

    Wang, Chengxiang; Zeng, Li; Yu, Wei; Zhang, Lingli; Guo, Yumeng; Gong, Changcheng

    2018-01-01

    Nano-computed tomography (nano-CT), which utilizes X-rays to examine the inner structure of small objects, is a high-spatial-resolution, non-destructive research technique that has been widely utilized in biomedical research, electronic technology, geology, material sciences, etc. A traditional nano-CT scanning mode requires very high mechanical precision and stability of the object manipulator, which is difficult to achieve when the scanned object is continuously rotated, as required for high-resolution imaging. To reduce the scanning time and attain stable, high-resolution imaging in industrial non-destructive testing, we study an electron beam linear scanning mode for a nano-CT system that avoids the mechanical vibration and object movement caused by continuously rotating the object. Furthermore, to further save scanning time and to study how small the scanning range can be while retaining acceptable spatial resolution, an alternating iterative algorithm based on ℓ0 minimization is applied to the limited-angle nano-CT reconstruction problem with the electron beam linear scanning mode. The experimental results confirm the feasibility of the electron beam linear scanning mode of the nano-CT system.

  7. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    Energy Technology Data Exchange (ETDEWEB)

    Baart, T. A.; Vandersypen, L. M. K. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Kavli Institute of Nanoscience, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Eendebak, P. T. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Netherlands Organisation for Applied Scientific Research (TNO), P.O. Box 155, 2600 AD Delft (Netherlands); Reichl, C.; Wegscheider, W. [Solid State Physics Laboratory, ETH Zürich, 8093 Zürich (Switzerland)

    2016-05-23

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the double quantum dots into the single-electron regime. The algorithm only requires (1) prior knowledge of the gate design and (2) the pinch-off value of the single gate T that is shared by all the quantum dots. This work significantly alleviates the user effort required to tune multiple quantum dot devices.

  8. The ELPA library: scalable parallel eigenvalue solutions for electronic structure theory and computational science.

    Science.gov (United States)

    Marek, A; Blum, V; Johanni, R; Havu, V; Lang, B; Auckenthaler, T; Heinecke, A; Bungartz, H-J; Lederer, H

    2014-05-28

    Obtaining the eigenvalues and eigenvectors of large matrices is a key problem in electronic structure theory and many other areas of computational science. The computational effort formally scales as O(N^3) with the size of the investigated problem, N (e.g. the electron count in electronic structure theory), and thus often defines the system size limit that practical calculations cannot overcome. In many cases, more than just a small fraction of the possible eigenvalue/eigenvector pairs is needed, so that iterative solution strategies that focus only on a few eigenvalues become ineffective. Likewise, it is not always desirable or practical to circumvent the eigenvalue solution entirely. We here review some current developments regarding dense eigenvalue solvers and then focus on the Eigenvalue soLvers for Petascale Applications (ELPA) library, which facilitates the efficient algebraic solution of symmetric and Hermitian eigenvalue problems for dense matrices that have real-valued and complex-valued matrix entries, respectively, on parallel computer platforms. ELPA addresses standard as well as generalized eigenvalue problems, relying on the well-documented matrix layout of the Scalable Linear Algebra PACKage (ScaLAPACK) library but replacing all actual parallel solution steps with subroutines of its own. For these steps, ELPA significantly outperforms the corresponding ScaLAPACK routines and proprietary libraries that implement the ScaLAPACK interface (e.g. Intel's MKL). The most time-critical step is the reduction of the matrix to tridiagonal form and the corresponding backtransformation of the eigenvectors. ELPA offers both a one-step tridiagonalization (successive Householder transformations) and a two-step transformation that is more efficient especially towards larger matrices and larger numbers of CPU cores. ELPA is based on the MPI standard, with an early hybrid MPI-OpenMP implementation available as well. Scalability beyond 10,000 CPU cores for problem
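    ELPA's most time-critical steps (reduction to tridiagonal form, solution of the tridiagonal problem, and backtransformation of the eigenvectors) can be mimicked serially with SciPy; the sketch below is an illustrative stand-in, not ELPA's parallel implementation.

```python
import numpy as np
from scipy.linalg import hessenberg, eigh_tridiagonal

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
A = (A + A.T) / 2.0                      # symmetric test matrix

# Step 1: reduce to tridiagonal form (the Hessenberg form of a symmetric
# matrix is tridiagonal); Q accumulates the Householder transformations.
T, Q = hessenberg(A, calc_q=True)
d, e = np.diag(T), np.diag(T, 1)

# Step 2: solve the tridiagonal eigenproblem.
w, V = eigh_tridiagonal(d, e)

# Step 3: backtransform the eigenvectors to those of A.
X = Q @ V
assert np.allclose(w, np.linalg.eigvalsh(A))
assert np.allclose(A @ X, X * w, atol=1e-8)
```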

  9. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

    Full Text Available Because of the large impact of computer networks today, their study in related fields such as Telecommunications and Electronics Engineering holds great appeal for students. However, content that lacks a strong practical component can make this interest decrease considerably. This paper proposes the use of teaching strategies based on analogies, media, and interactive applications that enhance the teaching of the networks discipline and encourage its study. It starts from an analysis of how the discipline is currently taught, and then describes each of these strategies together with its contribution to student learning.

  10. Development of Computer-Based Training to Supplement Lessons in Fundamentals of Electronics

    Directory of Open Access Journals (Sweden)

    Ian P. Benitez

    2016-05-01

    Full Text Available Teaching Fundamentals of Electronics allows students to familiarize themselves with basic electronics concepts, acquire skills in the use of the multimeter test instrument, and develop mastery in testing basic electronic components. Observations during actual teaching and practical activities on component pin identification and testing showed that the lack of skills of new students in testing components can lead to incorrect fault diagnosis and wrong pin connection during in-circuit replacement of defective parts. With the aim of giving students a concrete understanding of the concepts of components applied in actual test and measurement, a Computer-Based Training was developed. The proponent developed the learning modules (courseware) utilizing concept mapping and storyboarding instructional design. The primary goal was to develop courseware that is as simulated, activity-based, and interactive as possible, so as to resemble the real-world process. A local area network (LAN)-based learning management system was also developed for administering the learning modules. The Paired Sample T-Test, based on the pretest and post-test results, was used to determine whether the students achieved learning after taking the courseware. The result revealed that there is a significant achievement of the students after studying the learning module. The E-learning content was validated by the instructors in terms of contents, activities, assessment and format, with a grand weighted mean of 4.35, interpreted as Sufficient. Based on the evaluation result, supplementing with the proposed computer-based training can enhance the teaching-learning process in electronics fundamentals.
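    For readers unfamiliar with the statistical test used, a paired-sample t-test on pretest/post-test scores looks like the following sketch; the score arrays are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pretest/post-test scores for the same ten students.
pre  = np.array([12, 15, 11, 14, 13, 10, 16, 12, 11, 13])
post = np.array([18, 19, 15, 20, 17, 14, 21, 16, 15, 18])

t, p = stats.ttest_rel(post, pre)    # paired-sample t-test
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 -> significant learning gain
```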

  11. Investigating the need for clinicians to use tablet computers with a newly envisioned electronic health record.

    Science.gov (United States)

    Saleem, Jason J; Savoy, April; Etherton, Gale; Herout, Jennifer

    2018-02-01

    The Veterans Health Administration (VHA) has deployed a large number of tablet computers in the last several years. However, little is known about how clinicians may use these devices with a newly planned Web-based electronic health record (EHR), as well as other clinical tools. The objective of this study was to understand the types of use that can be expected of tablet computers versus desktops. Semi-structured interviews were conducted with 24 clinicians at a Veterans Health Administration (VHA) Medical Center. An inductive qualitative analysis resulted in findings organized around recurrent themes of: (1) Barriers, (2) Facilitators, (3) Current Use, (4) Anticipated Use, (5) Patient Interaction, and (6) Connection. Our study generated several recommendations for the use of tablet computers with new health information technology tools being developed. Continuous connectivity for the mobile device is essential to avoid interruptions and clinician frustration. Also, making a physical keyboard available as an option for the tablet was a clear desire from the clinicians. Larger tablets (e.g., regular size iPad as compared to an iPad mini) were preferred. Being able to use secure messaging tools with the tablet computer was another consistent finding. Finally, more simplicity is needed for accessing patient data on mobile devices, while balancing the important need for adequate security. Published by Elsevier B.V.

  12. Device controllers using an industrial personal computer of the PF 2.5-GeV Electron Linac at KEK

    International Nuclear Information System (INIS)

    Otake, Yuji; Yokota, Mitsuhiro; Kakihara, Kazuhisa; Ogawa, Yujiro; Ohsawa, Satoshi; Shidara, Tetsuo; Nakahara, Kazuo

    1992-01-01

    Device controllers for electron guns and slits using an industrial personal computer have been designed and installed in the Photon Factory 2.5-GeV Electron Linac at KEK. The design concept of the controllers is to realize a reliable system and good productivity of hardware and software by using an industrial personal computer and a programmable sequence controller. The device controllers have been working reliably for several years. (author)

  13. XVI International symposium on nuclear electronics and VI International school on automation and computing in nuclear physics and astrophysics

    International Nuclear Information System (INIS)

    Churin, I.N.

    1995-01-01

    Reports and papers of the 16th International Symposium on Nuclear Electronics and the 6th International School on Automation and Computing in Nuclear Physics and Astrophysics are presented. The latest achievements in the development of fast-response electronic circuits designed for detection and spectrometric facilities are reviewed. Particular attention is paid to systems for the acquisition, processing and storage of experimental data. Modern equipment designed for data communication in computer networks is also covered.

  14. Attractive evolutionary equilibria

    NARCIS (Netherlands)

    Joosten, Reinoud A.M.G.; Roorda, Berend

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary

  15. Data mining technique for a secure electronic payment transaction using MJk-RSA in mobile computing

    Science.gov (United States)

    G. V., Ramesh Babu; Narayana, G.; Sulaiman, A.; Padmavathamma, M.

    2012-04-01

    Due to the evolution of electronic learning (E-Learning), one can easily get desired information on a computer or mobile system connected through the Internet. Currently, E-Learning materials are easily accessible on desktop computer systems, but in the future most of this information shall also be available on small digital devices like mobiles, PDAs, etc. Most E-Learning materials are paid, and the customer has to pay the entire amount through a credit/debit card system. Therefore, it is very important to study the security of credit/debit card numbers. The present paper is an attempt in this direction, and a security technique is presented to secure the credit/debit card numbers supplied over the Internet to access E-Learning materials or to make any kind of purchase through the Internet. A well-known method, the Data Cube Technique, is used to design the security model of the credit/debit card system. The major objective of this paper is to design a practical electronic payment protocol which is the safest and most secure mode of transaction. This technique may reduce fraudulent transactions, which exceed 20% at the global level.
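    The record does not describe the MJk-RSA variant itself; as a concrete reminder of the underlying mechanism, here is toy textbook RSA protecting a card-number fragment (deliberately tiny and insecure, for illustration only).

```python
# Toy textbook RSA, for illustration only: the paper's MJk-RSA variant is not
# public, and real payment systems require large keys and padding (e.g. OAEP).
p, q = 61, 53                      # tiny demo primes (insecure)
n, phi = p * q, (p - 1) * (q - 1)
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent (Python 3.8+ mod inverse)

card_fragment = 1234               # a card-number fragment, must be < n
cipher = pow(card_fragment, e, n)  # encrypt with the public key (e, n)
plain = pow(cipher, d, n)          # decrypt with the private key (d, n)
assert plain == card_fragment
```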

  16. Clinical application of electron beam computed tomography in intravenous three-dimensional coronary angiography

    International Nuclear Information System (INIS)

    Luo Chufan; Du Zhimin; Hu Chengheng; Li Yi; Zeng Wutao; Ma Hong; Li Xiangmin; Zhou Xuhui

    2002-01-01

    Objective: To investigate the clinical application of intravenous three-dimensional coronary angiography using electron beam computed tomography (EBCT) as compared with selective coronary angiography. Methods: Intravenous EBCT and selective coronary angiography were performed during the same period in 38 patients. The value of EBCT angiography for diagnosing coronary artery disease was evaluated. Results: The number of coronary arteries adequately evaluated by EBCT angiography was 134 out of 152 vessels (88.2%), including 100% of the left main coronary arteries, 94.7% of the left anterior descending arteries, 81.6% of the left circumflex arteries and 76.3% of the right coronary arteries. Significantly more left main and left anterior descending coronary arteries were adequately visualized than the left circumflex and right coronary arteries (P < 0.05). The sensitivity, specificity, accuracy, and positive and negative predictive value of EBCT angiography for diagnosing coronary artery disease were 88.0%, 84.6%, 86.8%, 91.7% and 78.6%, respectively. Of the 38 arteries with ≥ 50% stenosis, EBCT underestimated 8, for a sensitivity of 78.9%. Of the 96 arteries without significant stenosis, EBCT overestimated 7 stenoses, for a specificity of 92.7%. Conclusion: Intravenous electron beam computed tomographic coronary angiography is a promising noninvasive method for diagnosing coronary artery disease
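    The per-artery counts given in the abstract (30 of 38 stenosed arteries detected, 89 of 96 normal arteries correctly called) reproduce the stated 78.9% sensitivity and 92.7% specificity; the headline 88.0%/84.6% figures are computed on a different basis. A small sketch of the standard definitions:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard test-performance figures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy":    (tp + tn) / (tp + fn + tn + fp),
        "ppv":         tp / (tp + fp),   # positive predictive value
        "npv":         tn / (tn + fn),   # negative predictive value
    }

# Per-artery counts from the abstract: 38 stenosed arteries with 8 missed,
# 96 normal arteries with 7 overcalled.
m = diagnostic_metrics(tp=30, fn=8, tn=89, fp=7)
print(f"sensitivity {m['sensitivity']:.1%}, specificity {m['specificity']:.1%}")
# -> sensitivity 78.9%, specificity 92.7%, matching the reported values
```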

  17. A computational perspective of vibrational and electronic analysis of potential photosensitizer 2-chlorothioxanthone

    Science.gov (United States)

    Ali, Narmeen; Mansha, Asim; Asim, Sadia; Zahoor, Ameer Fawad; Ghafoor, Sidra; Akbar, Muhammad Usman

    2018-03-01

    This paper deals with a combined theoretical and experimental study of the geometric, electronic and vibrational properties of the 2-chlorothioxanthone (CTX) molecule, which is a potential photosensitizer. The FT-IR spectrum of CTX in the solid phase was recorded in the 4000–400 cm⁻¹ region. The UV-Vis. absorption spectrum was also recorded in the laboratory as well as computed at the DFT/B3LYP level in five different phases, viz. gas, water, DMSO, acetone and ethanol. The quantum mechanics based theoretical IR and Raman spectra were also calculated for the title compound employing HF and DFT functionals with the 3-21G+, 6-31G+ and 6-311G+, 6-311G++ basis sets, respectively, and the assignment of each vibrational frequency has been made on the basis of the potential energy distribution (PED). A comparison has been made between the theoretical and experimental vibrational spectra as well as the UV-Vis. absorption spectra. The infrared and Raman spectra computed by DFT were compared with the experimental spectra, along with a reliable vibrational assignment based on the PED. The calculated electronic properties, results of natural bonding orbital (NBO) analysis, charge distribution, dipole moment and energies are reported in the paper. Bimolecular quenching of the triplet state of CTX in the presence of triethylamine, 2-propanol/triethylamine and diazabicyclooctane (DABCO) reflects the interactions between them. The bimolecular quenching rate constant is fastest for the interaction of ³CTX with DABCO, reflecting their stronger interaction.

  18. E-commerce, paper and energy use: a case study concerning a Dutch electronic computer retailer

    Energy Technology Data Exchange (ETDEWEB)

    Hoogeveen, M.J.; Reijnders, L. [Open University Netherlands, Heerlen (Netherlands)

    2002-07-01

    The impacts of the application of e-commerce on paper and energy use are analysed in a case study concerning a Dutch electronic retailer (e-tailer) of computers. The estimated use of paper associated with the e-tailer concerned was substantially reduced compared with physical retailing or traditional mail-order retailing. However, the overall effect of e-tailing on paper use strongly depends on customer behaviour. Some characteristics of e-commerce, as practised by the e-tailer concerned, such as diminished floor space requirements, reduced need for personal transport and simplified logistics, improve energy efficiency compared with physical retailing. Substitution of paper information by online information has an energy effect that depends on the duration of online information perusal and the extent to which downloaded information is printed. Increasing distances from producers to consumers, outsourcing, and increased use of computers, associated equipment and electronic networks are characteristics of e-commerce that may have an upward effect on energy use. In this case study, these upward effects on energy use were smaller than the direct energy efficiency gains. However, the indirect effects associated with increased buying power and the rebound effect on transport following from freed-up travel time greatly exceeded the direct energy efficiency gains. (author)

  19. Evolutionary Stable Strategy

    Indian Academy of Sciences (India)

    Evolutionary Stable Strategy: Application of Nash Equilibrium in Biology. General Article, Resonance – Journal of Science Education, Volume 21, Issue 9, September 2016, pp. 803–. Keywords: evolutionary game theory, evolutionary stable state, conflict, cooperation, biological games.

  20. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    Science.gov (United States)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of real space (R³) into m arbitrary regions Ω_1, Ω_2, …, Ω_m (⋃_{i=1}^m Ω_i = R³), the edf program computes all the probabilities P(n_1, n_2, …, n_m) of having exactly n_1 electrons in Ω_1, n_2 electrons in Ω_2, …, and n_m electrons (n_1 + n_2 + ⋯ + n_m = N) in Ω_m. Each Ω_i may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ω_i. The program can manage both single- and multi-determinant wave functions, which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MOs). After the P(n_1, n_2, …, n_m) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n_1, n_2, …, n_m) probabilities into α and β spin components. Program summary: Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer

  1. Algorithms for computing parsimonious evolutionary scenarios for genome evolution, the last universal common ancestor and dominance of horizontal gene transfer in the evolution of prokaryotes

    Directory of Open Access Journals (Sweden)

    Galperin Michael Y

    2003-01-01

    Full Text Available Abstract Background Comparative analysis of sequenced genomes reveals numerous instances of apparent horizontal gene transfer (HGT), at least in prokaryotes, and indicates that lineage-specific gene loss might have been even more common in evolution. This complicates the notion of a species tree, which needs to be re-interpreted as a prevailing evolutionary trend, rather than the full depiction of evolution, and makes reconstruction of ancestral genomes a non-trivial task. Results We addressed the problem of constructing parsimonious scenarios for individual sets of orthologous genes given a species tree. The orthologous sets were taken from the database of Clusters of Orthologous Groups of proteins (COGs). We show that the phyletic patterns (patterns of presence-absence in completely sequenced genomes) of almost 90% of the COGs are inconsistent with the hypothetical species tree. Algorithms were developed to reconcile the phyletic patterns with the species tree by postulating gene loss, COG emergence and HGT (the latter two classes of events were collectively treated as gene gains). We prove that each of these algorithms produces a parsimonious evolutionary scenario, which can be represented as a mapping of loss and gain events on the species tree. The distribution of the evolutionary events among the tree nodes substantially depends on the underlying assumptions of the reconciliation algorithm, e.g. whether or not independent gene gains (gain after loss after gain) are permitted. Biological considerations suggest that, on average, gene loss might be a more likely event than gene gain. Therefore different gain penalties were used and the resulting series of reconstructed gene sets for the last universal common ancestor (LUCA) of the extant life forms were analysed. The number of genes in the reconstructed LUCA gene sets grows as the gain penalty increases. However, qualitative examination of the LUCA versions reconstructed with different gain penalties
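    The reconciliation idea, stripped of the paper's specifics, is a weighted parsimony over the species tree: assign each internal node presence or absence of a gene so that the total penalty of gains and losses is minimal. A minimal Sankoff-style sketch with a hypothetical four-species tree follows (the paper's algorithms add further constraints, e.g. on independent gains):

```python
# Minimal Sankoff-style parsimony for one phyletic (presence/absence) pattern
# on a species tree with a tunable gain penalty. Illustrative only.

def min_events(tree, root, leaf_state, gain=2.0, loss=1.0):
    """cost[v][s] = minimal weighted number of events in the subtree of v
    given that v is in state s (0 = gene absent, 1 = gene present)."""
    cost = {}

    def post(v):
        kids = tree.get(v, [])
        if not kids:                       # leaf: state is observed
            cost[v] = {s: (0.0 if s == leaf_state[v] else float("inf"))
                       for s in (0, 1)}
            return
        for c in kids:
            post(c)
        cost[v] = {}
        for s in (0, 1):
            total = 0.0
            for c in kids:
                # transition penalty on the branch v -> c
                total += min(cost[c][t] +
                             (0.0 if s == t else (gain if t == 1 else loss))
                             for t in (0, 1))
            cost[v][s] = total

    post(root)
    return cost[root]

# Hypothetical 4-species tree ((A,B),(C,D)); gene present in A, B, D only.
tree = {"root": ["ab", "cd"], "ab": ["A", "B"], "cd": ["C", "D"]}
state = {"A": 1, "B": 1, "C": 0, "D": 1}
print(min_events(tree, "root", state, gain=2.0))
# -> {0: 4.0, 1: 1.0}: cheapest scenario has the gene at the root, one loss
```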

  2. Desiderata for computable representations of electronic health records-driven phenotype algorithms.

    Science.gov (United States)

    Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Denny, Joshua C; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A

    2015-11-01

    Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. © The Author 2015. Published by Oxford University Press on behalf of the American Medical

  3. Evolutionary molecular medicine.

    Science.gov (United States)

    Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S

    2012-05-01

    Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.

  4. Evolutionary Sound Synthesis Controlled by Gestural Data

    Directory of Open Access Journals (Sweden)

    Jose Fornari

    2011-05-01

    Full Text Available This article focuses on interdisciplinary research involving Computer Music and Generative Visual Art. We describe the implementation of two interactive artistic systems based on principles of Gestural Data retrieval (WILSON, 2002) and self-organization (MORONI, 2003), used to control an Evolutionary Sound Synthesis method (ESSynth). The first implementation uses, as gestural data, image mapping of handmade drawings. The second one uses gestural data from dynamic body movements of dance. The resulting computer output is generated by an interactive system implemented in Pure Data (PD). This system uses principles of Evolutionary Computation (EC), which yields the generation of a synthetic adaptive population of sound objects. Considering that music can be seen as “organized sound”, the contribution of our study is to develop a system that aims to generate "self-organized sound" – a method that uses evolutionary computation to bridge between gesture, sound and music.
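    ESSynth itself is not reproduced here, but the evolutionary loop it relies on can be suggested by a toy population of (frequency, amplitude) "sound objects" evolving toward a target value that, in the real system, would be derived from the gestural data stream (all numbers below are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for an ESSynth-style loop: each individual is a "sound object"
# (frequency in Hz, amplitude); fitness rewards closeness to a target
# frequency, standing in for a gesture-derived control signal.
def fitness(pop, target_freq):
    return -np.abs(pop[:, 0] - target_freq)

pop = rng.uniform([100.0, 0.0], [2000.0, 1.0], size=(16, 2))
target_freq = 440.0                        # pretend gesture-derived target
for gen in range(60):
    elite = pop[np.argsort(fitness(pop, target_freq))[-8:]]        # selection
    children = elite + rng.normal(0.0, [15.0, 0.05], elite.shape)  # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax(fitness(pop, target_freq))]
print(f"best sound object: freq={best[0]:.1f} Hz, amp={best[1]:.2f}")
```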

  5. Computer programs for unit-cell determination in electron diffraction experiments

    International Nuclear Information System (INIS)

    Li, X.Z.

    2005-01-01

    A set of computer programs for unit-cell determination from an electron diffraction tilt series and for pattern indexing has been developed on the basis of several well-established algorithms. In this approach, a reduced direct primitive cell is first determined from the experimental data; in the meantime, the measurement errors of the tilt angles are checked and minimized. The derived primitive cell is then checked for possible higher lattice symmetry and transformed into a proper conventional cell. Finally, a least-squares refinement procedure is adopted to generate optimum lattice parameters on the basis of the lengths of the basic reflections in each diffraction pattern and the indices of these reflections. Examples are given to show the usage of the programs.

  6. Recent Progress in First-Principles Methods for Computing the Electronic Structure of Correlated Materials

    Directory of Open Access Journals (Sweden)

    Fredrik Nilsson

    2018-03-01

    Full Text Available Substantial progress has been achieved in the last couple of decades in computing the electronic structure of correlated materials from first principles. This progress has been driven by parallel developments in theory and numerical algorithms. Theoretical development in combining ab initio approaches and many-body methods is particularly promising. A crucial role is also played by a systematic method for deriving a low-energy model, which bridges the gap between real and model systems. In this article, an overview is given tracing the development from LDA+U to the latest progress in combining the GW method and (extended) dynamical mean-field theory (GW+EDMFT). The emphasis is on conceptual and theoretical aspects rather than technical ones.

  7. Modeling of temperature profiles in an environmental transmission electron microscope using computational fluid dynamics

    DEFF Research Database (Denmark)

    Mortensen, Peter Mølgaard; Jensen, Anker Degn; Hansen, Thomas Willum

    2015-01-01

    The temperature and velocity field, pressure distribution, and the temperature variation across the sample region inside an environmental transmission electron microscope (ETEM) have been modeled by means of computational fluid dynamics (CFD). Heating the sample area by a furnace type TEM holder...... gives rise to temperature gradients over the sample area. Three major mechanisms have been identified with respect to heat transfer in the sample area: radiation from the grid, conduction in the grid, and conduction in the gas. A parameter sensitivity analysis showed that the sample temperature...... was affected by the conductivity of the gas, the emissivity of the sample grid, and the conductivity of the grid. Ideally the grid should be polished and made from a material with good conductivity, e.g. copper. With hydrogen gas, which has the highest conductivity of the gases studied, the temperature...

  8. Real-time data acquisition and computation for the SSC using optical and electronic technologies

    International Nuclear Information System (INIS)

    Cantrell, C.D.; Fenyves, E.J.; Wallace, B.

    1990-01-01

    The authors discuss combinations of optical and electronic technologies that may be able to address major data-filtering and data-analysis problems at the SSC. Novel scintillation detectors and optical readout may permit the use of optical processing techniques for trigger decisions and particle tracking. Very-high-speed fiberoptic local-area networks will be necessary to pipeline data from the detectors to the triggers and from the triggers to computers. High-speed, few-processor MIMD supercomputers with advanced fiberoptic I/O technology offer a usable, cost-effective alternative to the microprocessor farms currently proposed for event selection and analysis for the SSC. The use of a real-time operating system that provides standard programming tools will facilitate all tasks, from reprogramming the detectors' event-selection criteria to detector simulation and event analysis. 34 refs., 1 fig., 1 tab

  9. Part 2 of the summary for the electronics, DAQ, and computing working group: Technological developments

    International Nuclear Information System (INIS)

    Slaughter, A.J.

    1993-01-01

    The attraction of hadron machines as B factories is the copious production of B particles. However, the interesting physics lies in specific rare final states. The challenge is selecting and recording the interesting ones. Part 1 of the summary for this working group, 'Comparison of Trigger and Data Acquisition Parameters for Future B Physics Experiments', summarizes and compares the different proposals. In parallel with this activity, the working group also looked at a number of the technological developments being proposed to meet the trigger and DAQ requirements. The presentations covered a wide variety of topics, which are grouped into three categories: (1) front-end electronics, (2) level 0 fast triggers, and (3) trigger and vertex processors. The group did not discuss on-line farms or offline data storage and computing due to lack of time

  10. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    International Nuclear Information System (INIS)

    Frankel, R.S.

    1995-01-01

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing, tried and proven protection methods. In addition a set of Guidelines, regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation

  11. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing, tried and proven protection methods. In addition a set of Guidelines, regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  12. COMPUTATIONAL EFFICIENCY OF A MODIFIED SCATTERING KERNEL FOR FULL-COUPLED PHOTON-ELECTRON TRANSPORT PARALLEL COMPUTING WITH UNSTRUCTURED TETRAHEDRAL MESHES

    Directory of Open Access Journals (Sweden)

    JONG WOON KIM

    2014-04-01

    In this paper, we introduce a modified scattering kernel approach to avoid the unnecessarily repeated calculations involved in the scattering source calculation, and use it with parallel computing to effectively reduce the computation time. Its computational efficiency was tested for three-dimensional fully coupled photon-electron transport problems using our computer program, which solves the multi-group discrete ordinates transport equation by using the discontinuous finite element method with unstructured tetrahedral meshes for complicated geometrical problems. The numerical tests show speedups of 17∼42 times in elapsed time per iteration using the modified scattering kernel, not only in single-CPU calculations but also in parallel computing with several CPUs.

  13. Electronics, trigger, data acquisition, and computing working group on future B physics experiments

    International Nuclear Information System (INIS)

    Geer, S.

    1993-01-01

    Electronics, trigger, data acquisition, and computing: this is a very broad list of topics. Nevertheless, in a modern particle physics experiment one thinks in terms of a data pipeline in which the front-end electronics, the trigger and data acquisition, and the offline reconstruction are linked together. In designing any piece of this pipeline it is necessary to understand the bigger picture of the data flow, data rates and volume, and the input rate, output rate, and latencies for each part of the pipeline. All of this needs to be developed with a clear understanding of the requirements imposed by the physics goals of the experiment: the signal efficiencies, background rates, and the amount of recorded information that needs to be propagated through the pipeline to select and analyse the events of interest. The technology needed to meet the demanding high-data-volume needs of the next round of B physics experiments appears to be available, now or within a couple of years. This seems to be the case for both fixed-target and collider B physics experiments. Although there are many differences between the various data pipelines that are being proposed, there are also striking similarities. All experiments have a multi-level trigger scheme (most have levels 1, 2, and 3) where the final level consists of a computing farm that can run offline-type code and reduce the data volume by a factor of a few. Finally, the ability to reconstruct large data volumes offline in a reasonably short time, and to make large data volumes available to many physicists for analysis, imposes severe constraints on the foreseen data pipelines, and a significant uncertainty in evaluating the various approaches proposed.

  14. Avoiding Local Optima with Interactive Evolutionary Robotics

    Science.gov (United States)

    2012-07-09

    …the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will… The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the

  15. A computationally efficient moment-preserving Monte Carlo electron transport method with implementation in Geant4

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, D.A., E-mail: ddixon@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, MS P365, Los Alamos, NM 87545 (United States); Prinja, A.K., E-mail: prinja@unm.edu [Department of Nuclear Engineering, MSC01 1120, 1 University of New Mexico, Albuquerque, NM 87131-0001 (United States); Franke, B.C., E-mail: bcfrank@sandia.gov [Sandia National Laboratories, Albuquerque, NM 87123 (United States)

    2015-09-15

    This paper presents the theoretical development and numerical demonstration of a moment-preserving Monte Carlo electron transport method. Foremost, a full implementation of the moment-preserving (MP) method within the Geant4 particle simulation toolkit is demonstrated. Beyond implementation details, it is shown that the MP method is a viable alternative to the condensed history (CH) method for inclusion in current and future generation transport codes through demonstration of the key features of the method including: systematically controllable accuracy, computational efficiency, mathematical robustness, and versatility. A wide variety of results common to electron transport are presented illustrating the key features of the MP method. In particular, it is possible to achieve accuracy that is statistically indistinguishable from analog Monte Carlo, while remaining up to three orders of magnitude more efficient than analog Monte Carlo simulations. Finally, it is shown that the MP method can be generalized to any applicable analog scattering DCS model by extending previous work on the MP method beyond analytical DCSs to the partial-wave (PW) elastic tabulated DCS data.

  16. Electromagnetic computer simulations of collective ion acceleration by a relativistic electron beam

    International Nuclear Information System (INIS)

    Galvez, M.; Gisler, G.R.

    1988-01-01

    A 2.5-dimensional electromagnetic particle-in-cell computer code is used to study collective ion acceleration when a relativistic electron beam is injected into a drift tube partially filled with cold neutral plasma. The simulations of this system reveal that the ions are subject to electrostatic acceleration by an electrostatic potential that forms behind the head of the beam. This electrostatic potential develops soon after the beam is injected into the drift tube, drifts with the beam, and eventually settles at a fixed position. At later times, this electrostatic potential becomes a virtual cathode. When the permanent position of the electrostatic potential is at the edge of the plasma or further up, ions are accelerated forward and a unidirectional ion flow is obtained; otherwise a bidirectional ion flow occurs. The ions that achieve higher energy are those which drift with the negative potential. When the plasma density is varied, the simulations show that optimum acceleration occurs when the density ratio between the beam (n_b) and the plasma (n_0) is unity. Simulations were carried out by changing the ion mass. The results of these simulations corroborate the hypothesis that the ion acceleration mechanism is purely electrostatic, so that the ion acceleration depends inversely on the charged-particle mass. The simulations also show that the maximum ion energy increases logarithmically with the electron beam energy and proportionally with the beam current.

  17. Meeting the security requirements of electronic medical records in the ERA of high-speed computing.

    Science.gov (United States)

    Alanazi, H O; Zaidan, A A; Zaidan, B B; Kiah, M L Mat; Al-Bakri, S H

    2015-01-01

    This study has two objectives. First, it aims to develop a system with a highly secured approach to transmitting electronic medical records (EMRs); second, it aims to identify entities that transmit private patient information without permission. The NTRU and Advanced Encryption Standard (AES) cryptosystems are secure encryption methods. AES is a tested technology that has already been utilized in several systems to secure sensitive data. The United States government has been using AES since June 2003 to protect sensitive and essential information. Meanwhile, NTRU protects sensitive data against attacks that use quantum computers, which can break the RSA cryptosystem and elliptic curve cryptography algorithms. A hybrid of AES and NTRU is developed in this work to improve EMR security. The proposed hybrid cryptography technique is implemented to secure the data transmission process of EMRs. The proposed security solution can provide protection for over 40 years and is resistant to quantum computers. Moreover, the technique provides the necessary evidence required by law to identify disclosure or misuse of patient records. The proposed solution can effectively secure EMR transmission and protect patient rights. It also identifies the source responsible for disclosing confidential patient records. The proposed hybrid technique for securing data managed by institutional websites must be improved in the future.
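    The AES half of such an AES+NTRU hybrid can be sketched with the Python cryptography package; NTRU has no standard-library implementation, so the key-wrapping step is only indicated in comments, and the EMR payload shown is a placeholder.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES half of an AES+NTRU hybrid (sketch): the EMR payload is encrypted with
# AES-GCM; in the paper's scheme the AES session key would then be wrapped
# with NTRU, a step omitted here for lack of a standard implementation.
key = AESGCM.generate_key(bit_length=256)   # session key (to be NTRU-wrapped)
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce; never reuse per key

emr = b'{"patient_id": "P-001", "diagnosis": "..."}'   # placeholder record
ct = aesgcm.encrypt(nonce, emr, b"emr-transfer-v1")    # ciphertext + auth tag
assert aesgcm.decrypt(nonce, ct, b"emr-transfer-v1") == emr
```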

  18. The Bravyi-Kitaev transformation for quantum computation of electronic structure

    Science.gov (United States)

    Seeley, Jacob T.; Richard, Martin J.; Love, Peter J.

    2012-12-01

    Quantum simulation is an important application of future quantum computers with applications in quantum chemistry, condensed matter, and beyond. Quantum simulation of fermionic systems presents a specific challenge. The Jordan-Wigner transformation allows for representation of a fermionic operator by O(n) qubit operations. Here, we develop an alternative method of simulating fermions with qubits, first proposed by Bravyi and Kitaev [Ann. Phys. 298, 210 (2002), 10.1006/aphy.2002.6254; e-print arXiv:quant-ph/0003137v2], that reduces the simulation cost to O(log n) qubit operations for one fermionic operation. We apply this new Bravyi-Kitaev transformation to the task of simulating quantum chemical Hamiltonians, and give a detailed example for the simplest possible case of molecular hydrogen in a minimal basis. We show that the quantum circuit for simulating a single Trotter time step of the Bravyi-Kitaev derived Hamiltonian for H2 requires fewer gate applications than the equivalent circuit derived from the Jordan-Wigner transformation. Since the scaling of the Bravyi-Kitaev method is asymptotically better than the Jordan-Wigner method, this result for molecular hydrogen in a minimal basis demonstrates the superior efficiency of the Bravyi-Kitaev method for all quantum computations of electronic structure.
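    Both mappings are available in OpenFermion, which makes the scaling claim easy to inspect; the sketch below (our illustration, not the paper's code) compares the maximum Pauli-string weight produced by the two transforms for a single hopping term.

```python
from openfermion import FermionOperator, jordan_wigner, bravyi_kitaev

# Compare the qubit-operator weight of the two fermion-to-qubit mappings for
# a hopping term a†_7 a_0 + h.c. on 8 modes.
hop = FermionOperator("7^ 0") + FermionOperator("0^ 7")

jw = jordan_wigner(hop)               # Pauli strings of length O(n)
bk = bravyi_kitaev(hop, n_qubits=8)   # Pauli strings of length O(log n)

longest = lambda op: max(len(term) for term in op.terms)
print("Jordan-Wigner max weight :", longest(jw))
print("Bravyi-Kitaev max weight :", longest(bk))
```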

  19. Modeling of temperature profiles in an environmental transmission electron microscope using computational fluid dynamics

    International Nuclear Information System (INIS)

    Mølgaard Mortensen, Peter; Willum Hansen, Thomas; Birkedal Wagner, Jakob; Degn Jensen, Anker

    2015-01-01

    The temperature and velocity field, pressure distribution, and the temperature variation across the sample region inside an environmental transmission electron microscope (ETEM) have been modeled by means of computational fluid dynamics (CFD). Heating the sample area by a furnace type TEM holder gives rise to temperature gradients over the sample area. Three major mechanisms have been identified with respect to heat transfer in the sample area: radiation from the grid, conduction in the grid, and conduction in the gas. A parameter sensitivity analysis showed that the sample temperature was affected by the conductivity of the gas, the emissivity of the sample grid, and the conductivity of the grid. Ideally the grid should be polished and made from a material with good conductivity, e.g. copper. With hydrogen gas, which has the highest conductivity of the gases studied, the temperature difference over the TEM grid is less than 5 °C, at what must be considered typical conditions, and it is concluded that the conditions on the sample grid in the ETEM can be considered as isothermal during general use. - Highlights: • Computational fluid dynamics used for mapping flow and temperature in ETEM setup. • Temperature gradient across TEM grid in furnace based heating holder very small in ETEM. • Conduction from TEM grid and gas in addition to radiation from TEM grid most important. • Pressure drop in ETEM limited to the pressure limiting apertures

  20. Electron beam diagnostic system using computed tomography and an annular sensor

    Science.gov (United States)

    Elmer, John W.; Teruya, Alan T.

    2014-07-29

    A system for analyzing an electron beam including a circular electron beam diagnostic sensor adapted to receive the electron beam, the circular electron beam diagnostic sensor having a central axis; an annular sensor structure operatively connected to the circular electron beam diagnostic sensor, wherein the sensor structure receives the electron beam; a system for sweeping the electron beam radially outward from the central axis of the circular electron beam diagnostic sensor to the annular sensor structure wherein the electron beam is intercepted by the annular sensor structure; and a device for measuring the electron beam that is intercepted by the annular sensor structure.

  1. Computed tomography as a source of electron density information for radiation treatment planning

    International Nuclear Information System (INIS)

    Skrzynski, Witold; Slusarczyk-Kacprzyk, Wioletta; Bulski, Wojciech; Zielinska-Dabrowska, Sylwia; Wachowicz, Marta; Kukolowicz, Pawel F.

    2010-01-01

    Purpose: to evaluate the performance of computed tomography (CT) systems of various designs as a source of electron density (ρ_el) data for treatment planning of radiation therapy. Material and methods: dependence of CT numbers on relative electron density of tissue-equivalent materials (HU–ρ_el relationship) was measured for several general-purpose CT systems (single-slice, multislice, wide-bore multislice), for radiotherapy simulators with a single-slice CT and kV CBCT (cone-beam CT) options, as well as for linear accelerators with kV and MV CBCT systems. Electron density phantoms of four sizes were used. Measurement data were compared with the standard HU–ρ_el relationships predefined in two commercial treatment-planning systems (TPS). Results: the HU–ρ_el relationships obtained with all of the general-purpose CT scanners operating at voltages close to 120 kV were very similar to each other and close to those predefined in TPS. Some dependency of HU values on tube voltage was observed for bone-equivalent materials. For a given tube voltage, differences in results obtained for different phantoms were larger than those obtained for different CT scanners. For radiotherapy simulators and for kV CBCT systems, the information on ρ_el was much less precise because of poor uniformity of images. For MV CBCT, the results were significantly different than for kV systems due to the differing energy spectrum of the beam. Conclusion: the HU–ρ_el relationships predefined in TPS can be used for general-purpose CT systems operating at voltages close to 120 kV. For nontypical imaging systems (e.g., CBCT), the relationship can be significantly different and, therefore, it should always be measured and carefully analyzed before using CT data for treatment planning. (orig.)
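    The HU–ρ_el relationship predefined in a TPS is typically a piecewise-linear lookup; a minimal sketch follows, with hypothetical calibration points that in practice would be measured with an electron-density phantom for each scanner and tube voltage.

```python
import numpy as np

# Piecewise-linear HU -> relative electron density lookup of the kind a TPS
# predefines. The calibration points below are hypothetical placeholders.
hu_points  = np.array([-1000.0, -200.0, 0.0, 200.0, 1000.0, 3000.0])
rho_points = np.array([  0.001,   0.80, 1.0,  1.10,   1.55,   2.50])

def hu_to_rho(hu):
    """Interpolate relative electron density from a CT number (HU)."""
    return np.interp(hu, hu_points, rho_points)

print(hu_to_rho(50.0))     # soft-tissue-like voxel
print(hu_to_rho(800.0))    # bone-like voxel
```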

  2. The accuracy of molecular bond lengths computed by multireference electronic structure methods

    International Nuclear Information System (INIS)

    Shepard, Ron; Kedziora, Gary S.; Lischka, Hans; Shavitt, Isaiah; Mueller, Thomas; Szalay, Peter G.; Kallay, Mihaly; Seth, Michael

    2008-01-01

    We compare experimental R_e values with computed R_e values for 20 molecules using three multireference electronic structure methods, MCSCF, MR-SDCI, and MR-AQCC. Three correlation-consistent orbital basis sets are used, along with complete basis set extrapolations, for all of the molecules. These data complement those computed previously with single-reference methods. Several trends are observed. The SCF R_e values tend to be shorter than the experimental values, and the MCSCF values tend to be longer than the experimental values. We attribute these trends to the ionic contamination of the SCF wave function and to the corresponding systematic distortion of the potential energy curve. For the individual bonds, the MR-SDCI R_e values tend to be shorter than the MR-AQCC values, which in turn tend to be shorter than the MCSCF values. Compared to the previous single-reference results, the MCSCF values are roughly comparable to the MP4 and CCSD methods, which is more accurate than might be expected, given that these MCSCF wave functions include no extra-valence electron correlation effects. This suggests that static valence correlation effects, such as near-degeneracies and the ability to dissociate correctly to neutral fragments, play an important role in determining the shape of the potential energy surface, even near equilibrium structures. The MR-SDCI and MR-AQCC methods predict R_e values with an accuracy comparable to, or better than, the best single-reference methods (MP4, CCSD, and CCSD(T)), despite the fact that triple and higher excitations into the extra-valence orbital space are included in the single-reference methods but are absent in the multireference wave functions. The computed R_e values using the multireference methods tend to be smooth and monotonic with basis set improvement. The molecular structures are optimized using analytic energy gradients, and the timings for these calculations show the practical advantage of using variational wave functions.

  3. The accuracy of molecular bond lengths computed by multireference electronic structure methods

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, Ron [Chemical Sciences and Engineering Division, Argonne National Laboratory, Argonne, IL 60439 (United States)], E-mail: shepard@tcg.anl.gov; Kedziora, Gary S. [High Performance Technologies Inc., 2435 5th Street, WPAFB, OH 45433 (United States); Lischka, Hans [Institute for Theoretical Chemistry, University of Vienna, Waehringerstrasse 17, A-1090 Vienna (Austria); Shavitt, Isaiah [Department of Chemistry, University of Illinois, 600 S. Mathews Avenue, Urbana, IL 61801 (United States); Mueller, Thomas [Juelich Supercomputer Centre, Research Centre Juelich, D-52425 Juelich (Germany); Szalay, Peter G. [Laboratory for Theoretical Chemistry, Institute of Chemistry, Eoetvoes Lorand University, P.O. Box 32, H-1518 Budapest (Hungary); Kallay, Mihaly [Department of Physical Chemistry and Materials Science, Budapest University of Technology and Economics, P.O. Box 91, H-1521 Budapest (Hungary); Seth, Michael [Department of Chemistry, University of Calgary, 2500 University Drive, N.W., Calgary, Alberta, T2N 1N4 (Canada)

    2008-06-16

    We compare experimental R_e values with computed R_e values for 20 molecules using three multireference electronic structure methods, MCSCF, MR-SDCI, and MR-AQCC. Three correlation-consistent orbital basis sets are used, along with complete basis set extrapolations, for all of the molecules. These data complement those computed previously with single-reference methods. Several trends are observed. The SCF R_e values tend to be shorter than the experimental values, and the MCSCF values tend to be longer than the experimental values. We attribute these trends to the ionic contamination of the SCF wave function and to the corresponding systematic distortion of the potential energy curve. For the individual bonds, the MR-SDCI R_e values tend to be shorter than the MR-AQCC values, which in turn tend to be shorter than the MCSCF values. Compared to the previous single-reference results, the MCSCF values are roughly comparable to the MP4 and CCSD methods, which is more accurate than might be expected, given that these MCSCF wave functions include no extra-valence electron correlation effects. This suggests that static valence correlation effects, such as near-degeneracies and the ability to dissociate correctly to neutral fragments, play an important role in determining the shape of the potential energy surface, even near equilibrium structures. The MR-SDCI and MR-AQCC methods predict R_e values with an accuracy comparable to, or better than, the best single-reference methods (MP4, CCSD, and CCSD(T)), despite the fact that triple and higher excitations into the extra-valence orbital space are included in the single-reference methods but are absent in the multireference wave functions. The computed R_e values using the multireference methods tend to be smooth and monotonic with basis set improvement. The molecular structures are optimized using analytic energy gradients, and the timings for these calculations show the practical advantage of using variational wave functions.

  4. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records.

    Science.gov (United States)

    Duz, Marco; Marshall, John F; Parkin, Tim

    2017-06-29

    The use of electronic medical records (EMRs) offers opportunities for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously classified by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded using R v3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free text.
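
    The three-dictionary selection described above is, at its core, set algebra over matched records. A minimal sketch under assumed toy data (the terms, records, and substring-matching rule are hypothetical illustrations; the study used SS/WS and R rather than this code):

    ```python
    # Minimal sketch of the inclusion/exclusion/reinclusion logic described
    # above; records and dictionary terms are hypothetical examples.
    records = {
        1: "horse presented with colic signs",
        2: "colic ruled out after examination",
        3: "recheck of previous colic episode, now resolved but painful",
    }

    inclusion = {"colic"}
    exclusion = {"ruled out", "resolved"}
    reinclusion = {"painful"}

    def matches(text, terms):
        """Crude substring matching standing in for the software's rules."""
        return any(term in text for term in terms)

    included = {i for i, t in records.items() if matches(t, inclusion)}
    excluded = {i for i in included if matches(records[i], exclusion)}
    reincluded = {i for i in excluded if matches(records[i], reinclusion)}

    cases = (included - excluded) | reincluded
    print(sorted(cases))  # -> [1, 3]
    ```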

  5. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest in these very promising new systems in the fields of materials science, biomedical research, and energy sustainability. Ionization energy (IE) is one of the most important parameters for approaching the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in the gas phase, the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns was assembled for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the ΔSCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) ΔSCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient way for a good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a
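
    The deviation statistics described above (average and root mean square over a molecule set) amount to the following arithmetic; the ionization-energy values here are invented placeholders, not the paper's data:

    ```python
    import math

    # Hypothetical experimental vs computed ionization energies (eV) for a
    # few molecules; the statistics mirror the mean/RMS analysis described.
    ie_exp = [8.02, 8.65, 9.10, 7.85]
    ie_calc = [7.95, 8.70, 9.25, 7.80]

    dev = [c - e for c, e in zip(ie_calc, ie_exp)]
    mean_dev = sum(dev) / len(dev)
    rms_dev = math.sqrt(sum(d * d for d in dev) / len(dev))
    print(f"mean deviation = {mean_dev:+.3f} eV, RMS = {rms_dev:.3f} eV")
    ```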

  6. Remembering the evolutionary Freud.

    Science.gov (United States)

    Young, Allan

    2006-03-01

    Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded as a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.

  7. Electronic Structure of the Perylene / Zinc Oxide Interface: A Computational Study of Photoinduced Electron Transfer and Impact of Surface Defects

    KAUST Repository

    Li, Jingrui

    2015-07-29

    The electronic properties of dye-sensitized semiconductor surfaces consisting of perylene chromophores chemisorbed on zinc oxide via different spacer-anchor groups have been studied at the density-functional-theory level. The energy distributions of the donor states and the rates of photoinduced electron transfer from dye to surface are predicted. We evaluate in particular the impact of saturated versus unsaturated aliphatic spacer groups inserted between the perylene chromophore and the semiconductor, as well as the influence of surface defects on the electron-injection rates.

  8. Electronic Structure of the Perylene / Zinc Oxide Interface: A Computational Study of Photoinduced Electron Transfer and Impact of Surface Defects

    KAUST Repository

    Li, Jingrui; Li, Hong; Winget, Paul; Bredas, Jean-Luc

    2015-01-01

    The electronic properties of dye-sensitized semiconductor surfaces consisting of perylene chromophores chemisorbed on zinc oxide via different spacer-anchor groups have been studied at the density-functional-theory level. The energy distributions of the donor states and the rates of photoinduced electron transfer from dye to surface are predicted. We evaluate in particular the impact of saturated versus unsaturated aliphatic spacer groups inserted between the perylene chromophore and the semiconductor, as well as the influence of surface defects on the electron-injection rates.

  9. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...

  10. Phylogenetic inference with weighted codon evolutionary distances.

    Science.gov (United States)

    Criscuolo, Alexis; Michel, Christian J

    2009-04-01

    We develop a new approach to estimate a matrix of pairwise evolutionary distances from a codon-based alignment, based on a codon evolutionary model. The method first computes a standard distance matrix for each of the three codon positions. These three distance matrices are then weighted according to an estimate of the global evolutionary rate of each codon position and averaged into a unique distance matrix. Using a large set of both real and simulated codon-based alignments of nucleotide sequences, we show that this approach leads to distance matrices that have significantly better treelikeness than those obtained from standard nucleotide evolutionary distances. We also propose an alternative weighting to eliminate the part of the noise often associated with some codon positions, particularly the third position, which is known to have a fast evolutionary rate. Simulation results show that fast distance-based tree reconstruction algorithms applied to distance matrices based on this codon position weighting can lead to phylogenetic trees that are at least as accurate as, if not more accurate than, those inferred by maximum likelihood. Finally, a well-known multigene dataset composed of eight yeast species and 106 codon-based alignments is reanalyzed and shows that our codon evolutionary distances allow the construction of a phylogenetic tree similar to those obtained by non-distance-based methods (e.g., maximum parsimony and maximum likelihood) and significantly improved over standard nucleotide evolutionary distance estimates.
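
    A minimal sketch of the weighting step described above, assuming three per-position distance matrices are already available (random placeholders below) and weights reflecting each position's estimated global rate, e.g. down-weighting the fast third codon position:

    ```python
    import numpy as np

    # Combine per-codon-position distance matrices with rate-based weights.
    # Matrices and weights here are hypothetical stand-ins.
    rng = np.random.default_rng(0)

    def random_dist(n):
        """Random symmetric matrix with zero diagonal, as a toy distance matrix."""
        a = rng.random((n, n))
        a = (a + a.T) / 2
        np.fill_diagonal(a, 0.0)
        return a

    d1, d2, d3 = (random_dist(4) for _ in range(3))
    w = np.array([0.45, 0.45, 0.10])   # e.g. down-weight the fast 3rd position
    w = w / w.sum()

    d_combined = w[0] * d1 + w[1] * d2 + w[2] * d3
    print(d_combined)
    ```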

  11. Electronic conductivity of solid and liquid (Mg, Fe)O computed from first principles

    Science.gov (United States)

    Holmström, E.; Stixrude, L.; Scipioni, R.; Foster, A. S.

    2018-05-01

    Ferropericlase (Mg, Fe)O is an abundant mineral of Earth's lower mantle, and the liquid phase of the material was an important component of the early magma ocean. Using quantum-mechanical, finite-temperature density-functional theory calculations, we compute the electronic component of the electrical and thermal conductivity of (Mg0.75, Fe0.25)O crystal and liquid over a wide range of planetary conditions: 0-200 GPa, 2000-4000 K for the crystal, and 0-300 GPa, 4000-10,000 K for the liquid. We find that the crystal and liquid are semi-metallic over the entire range studied: the crystal has an electrical conductivity exceeding 10³ S/m, whereas that of the liquid exceeds 10⁴ S/m. Our results on the crystal are in reasonable agreement with experimental measurements of the electrical conductivity of ferropericlase once we account for the dependence of conductivity on iron content. We find that a harzburgite-dominated mantle with ferropericlase in combination with Al-free bridgmanite agrees well with electromagnetic sounding observations, while a pyrolitic mantle with a ferric-iron-rich bridgmanite composition yields a lower mantle that is too conductive. The electronic component of thermal conductivity of ferropericlase with X_Fe = 0.19 is negligible and cannot account for the high conductance that has been proposed to explain anomalies in Earth's nutation. The electrical conductivity of liquid ferropericlase exceeds that of liquid silica by more than an order of magnitude at conditions of a putative basal magma ocean, thus strengthening arguments that the basal magma ocean could have produced an ancient dynamo.

  12. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest groups consist of 19 commercial U.S. nuclear utilities and 11 of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process as well as a summary of the requirements.

  13. Diagnostic value of electron-beam computed tomography (EBT). I. cardiac applications

    International Nuclear Information System (INIS)

    Enzweiler, C.N.H.; Lembcke, A.; Rogalla, P.; Taupitz, M.; Wiese, T.H.; Hamm, B.; Becker, C.R.; Reiser, M.F.; Felix, R.; Knollmann, F.D.; Georgi, M.; Weisser, G.; Lehmann, K.J.

    2004-01-01

    Electron beam tomography (EBT) directly competes with other non-invasive imaging modalities, such as multislice computed tomography, magnetic resonance imaging, and echocardiography, in the diagnostic assessment of cardiac diseases. EBT is the gold standard for the detection and quantification of coronary calcium as a preclinical sign of coronary artery disease (CAD). Its standardized examination protocols and the broad experience with this method favor EBT. First results with multislice CT indicate that this new technology may be equivalent to EBT for coronary calcium studies. The principal value of CT-based coronary calcium measurements continues to be an issue of controversy among radiologists and cardiologists due to the lack of prospective randomized trials. Coronary angiography with EBT is characterized by a high negative predictive value and, in addition, may be indicated in some patients with manifest CAD. It remains to be shown whether coronary angiography with multislice CT is reliable and accurate enough to be introduced into the routine work-up, to replace some of the many strictly diagnostic coronary catheterizations in Germany and elsewhere. Assessment of coronary stent patency with EBT is associated with several problems and in our opinion cannot be advocated as a routine procedure. EBT may be recommended for the evaluation of coronary bypasses to look for bypass occlusions and significant stenoses, which, however, can be equally well achieved with multislice CT. Quantification of myocardial perfusion with EBT could not replace MRI or other modalities in this field. EBT has proven to be accurate, reliable and in some instances equivalent to MRI, which is the gold standard for the quantitative and qualitative evaluation of cardiac function. Some disadvantages, not least the limited distribution of electron beam scanners, favor MRI for functional assessment of the heart. (orig.)

  14. A Computationally-Efficient, Multi-Mechanism Based Framework for the Comprehensive Modeling of the Evolutionary Behavior of Shape Memory Alloys

    Science.gov (United States)

    Saleeb, Atef F.; Vaidyanathan, Raj

    2016-01-01

    The report summarizes the accomplishments made during the 4-year duration of the project. Here, the major emphasis is placed on the different tasks performed by the two research teams during this period; i.e., the modeling activities by the University of Akron (UA) team and the experimental and neutron diffraction studies conducted by the University of Central Florida (UCF) team. Further technical details are given in the upcoming sections by UA and UCF for each of the milestones/years (together with the corresponding figures and captions). The project mainly involved the development, validation, and application of a general theoretical model that is capable of capturing the nonlinear hysteretic responses of shape memory alloys, including pseudoelasticity, shape memory effect, rate-dependency, multi-axiality, and asymmetry in tension versus compression response. Among the targeted goals for the SMA model was its ability to account for the evolutionary character of the response (including transient and long-term behavior under sustained cycles) for both conventional and high temperature (HT) SMAs, as well as being able to simulate some of the devices which exploit these unique material systems. This required extensive (uniaxial and multi-axial) experiments needed to guide us in calibrating and characterizing the model. Moreover, since the model is formulated on the theoretical notion of internal state variables (ISVs), neutron diffraction experiments were needed to establish the linkage between the micromechanical changes and these ISVs. In addition, the design of the model should allow easy implementation in large-scale finite element applications to study the behavior of devices making use of these SMA materials under different loading controls. A summary of the activities and progress/achievements made during this period is given below in detail for the University of Akron (Section 2.0) and the University of Central Florida (Section 3.0).

  15. Attractive evolutionary equilibria

    OpenAIRE

    Roorda, Berend; Joosten, Reinoud

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary dynamics. For instance, each attractive evolutionarily stable strategy is an attractive evolutionarily stable equilibrium for certain barycentric ray-projection dynamics, and vice versa.

  16. Evolutionary Robotics: What, Why, and Where to

    Directory of Open Access Journals (Sweden)

    Stephane Doncieux

    2015-03-01

    Evolutionary robotics applies the selection, variation, and heredity principles of natural evolution to the design of robots with embodied intelligence. It can be considered as a subfield of robotics that aims to create more robust and adaptive robots. A pivotal feature of the evolutionary approach is that it considers the whole robot at once, and enables the exploitation of robot features in a holistic manner. Evolutionary robotics can also be seen as an innovative approach to the study of evolution based on a new kind of experimentalism. The use of robots as a substrate can help address questions that are difficult, if not impossible, to investigate through computer simulations or biological studies. In this paper we consider the main achievements of evolutionary robotics, focusing particularly on its contributions to both engineering and biology. We briefly elaborate on methodological issues, review some of the most interesting findings, and discuss important open issues and promising avenues for future work.

  17. Mean-Potential Law in Evolutionary Games

    Science.gov (United States)

    Nałęcz-Jawecki, Paweł; Miękisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for the computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
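
    For context, fixation probabilities in a birth-death chain with two absorbing states have a standard closed form built from the ratios of backward to forward transition rates; the sketch below uses that classical formula with a constant-fitness Moran-type ratio, not the Letter's potential-function construction:

    ```python
    # Fixation probability of a single mutant in a birth-death chain with
    # absorbing states 0 and N, via the classical ratio formula. The
    # constant-fitness ratio below is an illustrative assumption.
    def fixation_probability(n, gamma):
        """gamma(j) = T_minus(j) / T_plus(j) for interior states j = 1..n-1."""
        total, prod = 1.0, 1.0
        for k in range(1, n):
            prod *= gamma(k)
            total += prod
        return 1.0 / total

    r = 1.1  # relative fitness of the mutant (assumed)
    # Moran-process limit: gamma = 1/r, giving (1 - 1/r) / (1 - r**-N).
    print(fixation_probability(100, lambda j: 1.0 / r))
    ```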

  18. Feasibility of replacing patient specific cutouts with a computer-controlled electron multileaf collimator

    International Nuclear Information System (INIS)

    Eldib, Ahmed; Jin Lihui; Li Jinsheng; Ma, C-M Charlie

    2013-01-01

    A motorized electron multileaf collimator (eMLC) was developed as an add-on device to the Varian linac for delivery of advanced electron beam therapy. It has previously been shown that electron beams collimated by an eMLC have very similar penumbra to those collimated by applicators and cutouts. Thus, manufacturing patient specific cutouts would no longer be necessary, resulting in the reduction of time taken in the cutout fabrication process. Moreover, cutout construction involves handling of toxic materials and exposure to toxic fumes that are usually generated during the process, while the eMLC will be a pollution-free device. However, undulation of the isodose lines is expected due to the finite size of the eMLC. Hence, the provided planned target volume (PTV) shape will not exactly follow the beam's-eye-view of the PTV, but instead will make a stepped approximation to the PTV shape. This may be a problem when the field edge is close to a critical structure. Therefore, in this study the capability of the eMLC to achieve the same clinical outcome as an applicator/cutout combination was investigated based on real patient computed tomographies (CTs). An in-house Monte Carlo based treatment planning system was used for dose calculation using ten patient CTs. For each patient, two plans were generated; one with electron beams collimated using the applicator/cutout combination; and the other plan with beams collimated by the eMLC. Treatment plan quality was compared for each patient based on dose distribution and dose–volume histogram. In order to determine the optimal position of the leaves, the impact of the different leaf positioning strategies was investigated. All plans with both eMLC and cutouts were generated such that 100% of the target volume receives at least 90% of the prescribed dose. Then the percentage difference in dose between both delivery techniques was calculated for all the cases. The difference in the dose received by 10% of the volume of the

  19. Feasibility of replacing patient specific cutouts with a computer-controlled electron multileaf collimator

    Science.gov (United States)

    Eldib, Ahmed; Jin, Lihui; Li, Jinsheng; Ma, C.-M. Charlie

    2013-08-01

    A motorized electron multileaf collimator (eMLC) was developed as an add-on device to the Varian linac for delivery of advanced electron beam therapy. It has previously been shown that electron beams collimated by an eMLC have very similar penumbra to those collimated by applicators and cutouts. Thus, manufacturing patient specific cutouts would no longer be necessary, resulting in the reduction of time taken in the cutout fabrication process. Moreover, cutout construction involves handling of toxic materials and exposure to toxic fumes that are usually generated during the process, while the eMLC will be a pollution-free device. However, undulation of the isodose lines is expected due to the finite size of the eMLC. Hence, the provided planned target volume (PTV) shape will not exactly follow the beam's-eye-view of the PTV, but instead will make a stepped approximation to the PTV shape. This may be a problem when the field edge is close to a critical structure. Therefore, in this study the capability of the eMLC to achieve the same clinical outcome as an applicator/cutout combination was investigated based on real patient computed tomographies (CTs). An in-house Monte Carlo based treatment planning system was used for dose calculation using ten patient CTs. For each patient, two plans were generated; one with electron beams collimated using the applicator/cutout combination; and the other plan with beams collimated by the eMLC. Treatment plan quality was compared for each patient based on dose distribution and dose-volume histogram. In order to determine the optimal position of the leaves, the impact of the different leaf positioning strategies was investigated. All plans with both eMLC and cutouts were generated such that 100% of the target volume receives at least 90% of the prescribed dose. Then the percentage difference in dose between both delivery techniques was calculated for all the cases. The difference in the dose received by 10% of the volume of the

  20. Stretchable, Twisted Conductive Microtubules for Wearable Computing, Robotics, Electronics, and Healthcare

    OpenAIRE

    Thanh Nho Do; Yon Visell

    2017-01-01

    Stretchable and flexible multifunctional electronic components, including sensors and actuators, have received increasing attention in robotics, electronics, wearable, and healthcare applications. Despite advances, it has remained challenging to design analogs of many electronic components to be highly stretchable, to be efficient to fabricate, and to provide control over electronic performance. Here, we describe highly elastic sensors and interconnects formed from thin, twisted conductive mi...

  1. Electronic computer prediction of properties of binary refractory transition metal compounds on the base of their simplificated electronic structure

    International Nuclear Information System (INIS)

    Kutolin, S.A.; Kotyukov, V.I.

    1979-01-01

    An attempt is made to obtain, using the method of regression analysis, calculation equations for the macroscopic physico-chemical properties of transition metal refractory compounds (density, melting temperature, Debye characteristic temperature, microhardness, standard formation enthalpy, thermo-emf). Apart from the compound composition, the arguments of the regression equations are the electron-band distributions of the d-transition metals, derived from the electron energy distribution in the simplified band structure of the transition metals and approximated by Chebyshev polynomials, together with the position of the Fermi energy on the map of electron-band energies as a function of quasi-momentum for the first, second and third Brillouin zones. The maximum relative error of the obtained regressions, compared with literature data, is 15-20 rel.%.
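
    The regression idea sketched above (fit macroscopic properties to electronic and compositional descriptors, then report the relative error) can be illustrated with an ordinary least-squares toy model; the descriptors and coefficients below are random placeholders, not the paper's parameterization:

    ```python
    import numpy as np

    # Toy ordinary-least-squares regression of a macroscopic property on
    # electronic/compositional descriptors. Data are random placeholders.
    rng = np.random.default_rng(2)
    X = rng.random((30, 4))                     # 4 descriptors for 30 compounds
    y = 5.0 + X @ np.array([3.0, -1.0, 0.5, 2.0]) + 0.1 * rng.standard_normal(30)

    A = np.column_stack([np.ones(len(y)), X])   # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    rel_err = np.abs(A @ coef - y) / np.abs(y)  # relative error of the fit
    print(f"max relative error: {rel_err.max():.1%}")
    ```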

  2. Polymorphic Evolutionary Games.

    Science.gov (United States)

    Fishman, Michael A

    2016-06-07

    In this paper, I present an analytical framework for polymorphic evolutionary games suitable for explicitly modeling evolutionary processes in diploid populations with sexual reproduction. The principal aspect of the proposed approach is adding diploid genetics cum sexual recombination to a traditional evolutionary game, and switching from phenotypes to haplotypes as the new game's pure strategies. Here, the relevant pure strategy's payoffs are derived by summing the payoffs of all the phenotypes capable of producing gametes containing that particular haplotype, weighted by the pertinent probabilities. The resulting game is structurally identical to the familiar evolutionary games with non-linear pure strategy payoffs (Hofbauer and Sigmund, 1998. Cambridge University Press), and can be analyzed in terms of an established analytical framework for such games. These results can be translated into the terms of genotypic, and whence phenotypic, evolutionary stability pertinent to the original game. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Fabrication of a novel silicon single electron transistor for Si:P quantum computer devices

    International Nuclear Information System (INIS)

    Angus, S.J.; Smith, C.E.A.; Gauja, E.; Dzurak, A.S.; Clark, R.G.; Snider, G.L.

    2004-01-01

    Quantum computation relies on the successful measurement of quantum states. Single electron transistors (SETs) are known to be able to perform fast and sensitive charge measurements of solid state qubits. However, due to their sensitivity, SETs are also very susceptible to random charge fluctuations in a solid-state materials environment. In previous dc transport measurements, silicon-based SETs have demonstrated greater charge stability than Al/Al₂O₃ SETs. We have designed and fabricated a novel silicon SET architecture for a comparison of the noise characteristics of silicon- and aluminium-based devices. The silicon SET described here is designed for controllable and reproducible low temperature operation. It is fabricated using a novel dual gate structure on a silicon-on-insulator substrate. A silicon quantum wire is formed in a 100 nm thick high-resistivity superficial silicon layer using reactive ion etching. Carriers are induced in the silicon wire by a back gate in the silicon substrate. The tunnel barriers are created electrostatically, using lithographically defined metallic electrodes (∼40 nm width). These tunnel barriers surround the surface of the quantum wire, thus producing excellent electrostatic confinement. This architecture provides independent control of tunnel barrier height and island occupancy, thus promising better control of Coulomb blockade oscillations than in previously investigated silicon SETs. The use of a near-intrinsic silicon substrate offers compatibility with Si:P qubits in the longer term.
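
    For orientation, the relevant energy scale for Coulomb blockade is the island charging energy E_C = e²/2C, which must be large compared with k_B·T for controlled operation; the capacitance below is an assumed illustrative figure, not a measured parameter of this device:

    ```python
    # Back-of-envelope check for Coulomb blockade: the island charging
    # energy E_C = e^2 / (2C) must greatly exceed k_B * T. The capacitance
    # is an assumed illustrative value, not a device measurement.
    E_CHARGE = 1.602176634e-19   # elementary charge, C
    K_B = 1.380649e-23           # Boltzmann constant, J/K

    def charging_energy(c_island):
        return E_CHARGE ** 2 / (2.0 * c_island)

    c = 1e-18  # 1 aF island capacitance (assumed)
    e_c = charging_energy(c)
    print(f"E_C = {e_c / E_CHARGE * 1e3:.1f} meV, "
          f"E_C/k_B = {e_c / K_B:.0f} K")
    ```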

  4. Electron-beam computed tomography findings of left atrial appendage in patients with cardiogenic cerebral embolism

    Energy Technology Data Exchange (ETDEWEB)

    Okamoto, Makiko; Takahashi, Satoshi; Yonezawa, Hisashi [Iwate Medical Univ., Morioka (Japan). School of Medicine

    2002-04-01

    We studied electron-beam computed tomography (EBCT) findings in the left atrial appendage of 72 patients with cerebral embolism [27 in the acute phase (<48 hours) and 45 in the chronic phase], 9 cases with nonvalvular atrial fibrillation (NVAF) but without stroke, and 13 controls. EBCT was performed in the early (during injection of contrast medium), late-1 (5 min after injection), and late-2 (10 min after injection) phases. In the acute phase patients, 41% showed a filling defect (FD) in the early phase alone (FDE), 15% showed FD until late phase-1 (FDL-1), and 15% showed FD until late phase-2 (FDL-2). The chronic phase patients showed FDE in 33% of cases, FDL-1 in 8% and FDL-2 in 11%. Only FDE was observed in 44% of NVAF cases without stroke. No FDs were observed in controls. Flow velocity in the appendage measured by transesophageal echocardiography was 23±10 cm/sec in 21 FDE cases, 14±3 cm/sec in 3 FDL-1 cases, and 29±23 cm/sec in 4 FDL-2 cases, significantly lower in comparison with 58±25 cm/sec in the 23 cases with no FD. FDL-1 and -2 suggested severe stasis or the presence of thrombus in the appendage, which indicated a high risk of embolism. (author)

  5. Noninvasive detection of coronary artery bypass graft patency by intravenous electron beam computed tomographic angiography.

    Science.gov (United States)

    Yamakami, Shoji; Toyama, Junji; Okamoto, Mitsuhiro; Matsushita, Toyoaki; Murakami, Yoshimasa; Ogata, Masaki; Ito, Shigenori; Fukutomi, Tatsuya; Okayama, Naotsuka; Itoh, Makoto

    2003-11-01

    This study evaluates the usefulness of intravenous electron beam computed tomographic angiography (EBA) for the detection of coronary artery bypass graft patency in 43 patients (33 men and 10 women; mean age, 65 years) who had undergone coronary artery bypass graft surgery. EBA was performed a few days before selective bypass graft angiography (SGA). Forty axial cross-sections of angiographic images of the heart were acquired consecutively using an electrocardiographic trigger signal at 40% of the RR interval, which corresponds to the end-systolic phase. EBA data were reconstructed as a three-dimensional shaded surface display of the heart and bypass grafts. Detectability of the patency of bypass grafts was evaluated, taking selective angiographic images of the bypass grafts as the gold standard. One hundred and nine grafts (96%) out of 114 grafts were subjected to evaluation: 37 grafts were left internal mammary artery grafts (LIMA), 7 were right internal mammary artery grafts (RIMA), 6 were gastroepiploic artery grafts (GEA), 7 were free gastroepiploic artery grafts with venous drainage (free-GEA), 7 were radial artery grafts (RAG), and 45 were saphenous vein grafts (SVG). The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of EBA were 98%, 100%, 100%, 91%, and 98%, respectively. EBA sampled at the end-systolic period was determined to be useful for the detection of coronary artery bypass graft patency and occlusion.
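
    The reported performance figures follow from a standard 2×2 comparison against the gold standard; the sketch below shows the arithmetic with counts chosen only to land near the reported values (they are illustrative, not the study's tabulated data):

    ```python
    # Diagnostic-performance arithmetic from a 2x2 confusion matrix.
    # Counts are illustrative assumptions, not the study's data.
    tp, fp, fn, tn = 88, 0, 2, 19   # true/false positives/negatives

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    print(f"sens {sensitivity:.0%}, spec {specificity:.0%}, "
          f"PPV {ppv:.0%}, NPV {npv:.0%}, acc {accuracy:.0%}")
    ```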

  6. CT angiography using electron-beam computed tomography (EBCT). A phantom study

    International Nuclear Information System (INIS)

    Uchino, Akira; Kato, Akira; Kudo, Sho

    1997-01-01

    The purpose of this study was to evaluate the accuracy of CT angiography in small vessels using electron-beam computed tomography (EBCT). Vessel phantoms with inner diameters of 8 mm, 6 mm, and 4 mm were prepared with segments of 75%, 50%, and 25% stenosis in each vessel. The vessels were filled with contrast medium (Iopamidol 300 at 1/24 dilution, approximately 380 HU). The EBCT apparatus used was an Imatron C-150. The step volume scan mode was used with slice thicknesses of 1.5 mm and 3.0 mm, scan time of 0.3 sec, and 210 mm field of view. Images with a slice thickness of 1.5 mm were definitely better than those with a slice thickness of 3.0 mm. The quality of maximum intensity projection (MIP) images was quite similar to that of three-dimensional (3D) images. Using the 8 mm vessel phantom, all stenotic segments were accurately visualized on CT angiography. The 50% stenotic segments were accurately estimated in all vessels. However, the 75% stenotic segments were slightly overestimated in smaller vessels, and the 25% stenotic segments were slightly underestimated in smaller vessels. We consider CT angiography using EBCT to be a useful, less invasive diagnostic modality for stenoocclusive lesions. (author)

  7. Clinical application of electron beam computed tomography in diagnosis of truncus arteriosus

    International Nuclear Information System (INIS)

    Zhang Gejun; Dai Ruping; Cao Cheng; Qi Xiaoou; Bai Hua; Ma Zhanhong; Chen Yao; Mu Feng; Ren Li

    2005-01-01

    Objective: To evaluate the value of electron beam computed tomography (EBCT) in the diagnosis of truncus arteriosus (TA). Methods: Ten cases of TA, with ages ranging from 2 months to 24 years, were studied. All cases were examined and diagnosed with an Imatron C-150 scanner using contrast media. The results of EBCT were analyzed and compared with the results of echocardiography (in 10 cases), cardiovascular angiography (in 3 cases) and surgical findings (in 1 case). Results: EBCT yielded qualitative diagnosis and classification in all 10 cases. Echocardiography revealed qualitative diagnosis in 9 cases; however, its classification agreed with EBCT in only 5 cases. More concomitant abnormalities of TA were found with EBCT than with echocardiography. Cardiovascular angiography was performed in 3 cases, yielding inaccurate classification in 2 cases. One case of TA was operated on based solely on the results of echocardiography, EBCT and catheterization. Conclusion: As a noninvasive method, EBCT can yield qualitative diagnosis of TA as well as classification. The results of EBCT examination combined with echocardiography and catheterization can guide the operations. (authors)

  8. Electron-beam computed tomography findings of left atrial appendage in patients with cardiogenic cerebral embolism

    International Nuclear Information System (INIS)

    Okamoto, Makiko; Takahashi, Satoshi; Yonezawa, Hisashi

    2002-01-01

    We studied electron-beam computed tomography (EBCT) findings in the left atrial appendage of 72 patients with cerebral embolism [27 in the acute phase (<48 hours) and 45 in the chronic phase], 9 cases with nonvalvular atrial fibrillation (NVAF) but without stroke, and 13 controls. EBCT was performed in the early (during injection of contrast medium), late-1 (5 min after injection), and late-2 (10 min after injection) phases. In the acute phase patients, 41% showed a filling defect (FD) in the early phase alone (FDE), 15% showed FD until late phase-1 (FDL-1), and 15% showed FD until late phase-2 (FDL-2). The chronic phase patients showed FDE in 33% of cases, FDL-1 in 8% and FDL-2 in 11%. Only FDE was observed in 44% of NVAF cases without stroke. No FDs were observed in controls. Flow velocity in the appendage measured by transesophageal echocardiography was 23±10 cm/sec in 21 FDE cases, 14±3 cm/sec in 3 FDL-1 cases, and 29±23 cm/sec in 4 FDL-2 cases, significantly lower in comparison with 58±25 cm/sec in the 23 cases with no FD. FDL-1 and -2 suggested severe stasis or the presence of thrombus in the appendage, which indicated a high risk of embolism. (author)

  9. Examination of Scanning Electron Microscope and Computed Tomography Images of PICA

    Science.gov (United States)

    Lawson, John W.; Stackpoole, Margaret M.; Shklover, Valery

    2010-01-01

    Micrographs of PICA (Phenolic Impregnated Carbon Ablator) taken using a Scanning Electron Microscope (SEM) and 3D images taken with a Computed Tomography (CT) system are examined. PICA is a carbon fiber based composite (Fiberform) with a phenolic polymer matrix. The micrographs are taken at different surface depths and at different magnifications in a sample after arc jet testing and show different levels of oxidative removal of the charred matrix (Figs. 1 through 13). CT scans, courtesy of Xradia, Inc. of Concord, CA, were captured for samples of virgin PICA, charred PICA and raw Fiberform (Fig. 14). We use these images to calculate the thermal conductivity (TC) of these materials using correlation function (CF) methods. CF methods give a mathematical description of how one material is embedded in another and are thus ideally suited for modeling composites like PICA. We will evaluate how the TC of the materials changes as a function of surface depth. This work is in collaboration with ETH-Zurich, which has expertise in high temperature materials and TC modeling (including CF methods).

  10. Nanoscale RRAM-based synaptic electronics: toward a neuromorphic computing device.

    Science.gov (United States)

    Park, Sangsu; Noh, Jinwoo; Choo, Myung-Lae; Sheri, Ahmad Muqeem; Chang, Man; Kim, Young-Bae; Kim, Chang Jung; Jeon, Moongu; Lee, Byung-Geun; Lee, Byoung Hun; Hwang, Hyunsang

    2013-09-27

    Efforts to develop scalable learning algorithms for implementation of networks of spiking neurons in silicon have been hindered by the considerable footprints of learning circuits, which grow as the number of synapses increases. Recent developments in nanotechnologies provide an extremely compact device with low-power consumption. In particular, nanoscale resistive switching devices (resistive random-access memory (RRAM)) are regarded as a promising solution for implementation of biological synapses due to their nanoscale dimensions, capacity to store multiple bits and the low energy required to operate distinct states. In this paper, we report the fabrication, modeling and implementation of nanoscale RRAM with multi-level storage capability for an electronic synapse device. In addition, we first experimentally demonstrate the learning capabilities and predictable performance by a neuromorphic circuit composed of a nanoscale 1 kbit RRAM cross-point array of synapses and complementary metal-oxide-semiconductor neuron circuits. These developments open up possibilities for the development of ubiquitous ultra-dense, ultra-low-power cognitive computers.

  11. Nanoscale RRAM-based synaptic electronics: toward a neuromorphic computing device

    International Nuclear Information System (INIS)

    Park, Sangsu; Noh, Jinwoo; Choo, Myung-lae; Sheri, Ahmad Muqeem; Jeon, Moongu; Lee, Byung-Geun; Lee, Byoung Hun; Chang, Man; Kim, Young-Bae; Kim, Chang Jung; Hwang, Hyunsang

    2013-01-01

    Efforts to develop scalable learning algorithms for implementation of networks of spiking neurons in silicon have been hindered by the considerable footprints of learning circuits, which grow as the number of synapses increases. Recent developments in nanotechnologies provide an extremely compact device with low-power consumption. In particular, nanoscale resistive switching devices (resistive random-access memory (RRAM)) are regarded as a promising solution for implementation of biological synapses due to their nanoscale dimensions, capacity to store multiple bits and the low energy required to operate distinct states. In this paper, we report the fabrication, modeling and implementation of nanoscale RRAM with multi-level storage capability for an electronic synapse device. In addition, we first experimentally demonstrate the learning capabilities and predictable performance by a neuromorphic circuit composed of a nanoscale 1 kbit RRAM cross-point array of synapses and complementary metal–oxide–semiconductor neuron circuits. These developments open up possibilities for the development of ubiquitous ultra-dense, ultra-low-power cognitive computers. (paper)

  12. External cervical resorption: an analysis using cone beam and microfocus computed tomography and scanning electron microscopy.

    Science.gov (United States)

    Gunst, V; Mavridou, A; Huybrechts, B; Van Gorp, G; Bergmans, L; Lambrechts, P

    2013-09-01

    To provide a three-dimensional representation of external cervical resorption (ECR) with microscopy, stereo microscopy, cone beam computed tomography (CT), microfocus CT and scanning electron microscopy (SEM). External cervical resorption is an aggressive form of root resorption, leading to a loss of dental hard tissues. It is due to clastic action, activated by damage to the covering cementum and probably stimulated by infection. Clinically, it is a challenging situation, as it is characterized by late symptomatology; this is due to the pericanalar protection provided by a resorption-resistant sheet composed of pre-dentine and surrounding dentine. The clastic activity is often associated with an attempt at repair, seen in the formation of osteoid tissue. Cone beam CT is extremely useful in the diagnosis and treatment planning of ECR. SEM analyses provide a better insight into the activity of osteoclasts. The root canal is surrounded by a layer of dentine that is resistant to resorption. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  13. Comprehensive evaluation of anomalous pulmonary venous connection by electron beam computed tomography as compared with ultrasound

    International Nuclear Information System (INIS)

    Zhang Shaoxiong; Dai Ruping; Bai Hua; He Sha; Jing Baolian

    1999-01-01

    Objective: To investigate the clinical value of electron beam computed tomography (EBCT) in the diagnosis of anomalous pulmonary venous connection. Methods: A retrospective analysis of 14 cases with anomalous pulmonary venous connection was performed using EBCT volume scans. The slice thickness and scan time were 3 mm and 100 ms, respectively. Non-ionic contrast medium was applied. Three-dimensional reconstruction of the EBCT images was carried out in all cases. Meanwhile, ultrasound echocardiography was performed on all patients. Conventional cardiovascular angiography was performed on 8 patients, and 2 cases underwent surgery. Results: Ten patients with total anomalous pulmonary venous connection, including 6 cases of supra-cardiac type and 4 cases of cardiac type, were proved by EBCT examination. Among them, 3 cases of abnormal pulmonary venous drainage were not revealed by conventional cardiovascular angiography. Among the four patients with partial anomalous pulmonary venous connection, including cardiac type in 2 cases and supra-cardiac and infra-cardiac types in 1 case each, only one was demonstrated by echocardiography. Conclusion: EBCT has significant value in the diagnosis of anomalous pulmonary venous connection, which may not be detectable with echocardiography or even cardiovascular angiography.

  14. A computer control system for the PNC high power cw electron linac. Concept and hardware

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, T.; Hirano, K.; Takei, Hayanori; Nomura, Masahiro; Tani, S. [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kato, Y.; Ishikawa, Y.

    1998-06-01

    Design and construction of a high power cw (Continuous Wave) electron linac for studying the feasibility of nuclear waste transmutation was started in 1989 at PNC. The PNC accelerator (10 MeV, 20 mA average current, 4 ms pulse width, 50 Hz repetition) is a dedicated machine for developing the high-current acceleration technology required in the future. The computer control system is responsible for accelerator control and for supporting the experiments toward high power operation. The features of the system are simultaneous measurement of the accelerator status and a modular software and hardware design that is easily modified or expanded. A high speed network (SCRAMNet, ~15 MB/s), Ethernet, and front-end processors (Digital Signal Processors) were employed for high speed data taking and control. The system was designed around standard modules and a software-implemented man-machine interface. Owing to the graphical user interface and object-oriented programming, the software development environment allows effortless programming and maintenance. (author)

  15. A Computer-Based, Interactive Videodisc Job Aid and Expert System for Electron Beam Lithography Integration and Diagnostic Procedures.

    Science.gov (United States)

    Stevenson, Kimberly

    This master's thesis describes the development of an expert system and an interactive videodisc computer-based instructional job aid used for assisting in the integration of electron beam lithography devices. As with all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…

  16. Formulation of a Mesoscopic Electron Beam Splitter with Application in Semiconductor Based Quantum Computing

    OpenAIRE

    Shanker, A.; Bhowmik, D.; Bhattacharya, T. K.

    2010-01-01

    We aim to analytically arrive at a beam splitter formulation for electron waves. The electron beam splitter is an essential component of quantum logical devices. To arrive at the beam splitter structure, the electrons are treated as waves, i.e. we assume the transport to be ballistic. Ballistic electrons are electrons that travel over such short distances that their phase coherence is maintained. For mesoscopic devices with size smaller than the mean free path, the phase relaxation length and...

  17. An effective chaos-geometric computational approach to analysis and prediction of evolutionary dynamics of the environmental systems: Atmospheric pollution dynamics

    Science.gov (United States)

    Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Bunyakova, Yu Ya; Florko, T. A.; Agayar, E. V.; Solyanikova, E. P.

    2017-10-01

    The present paper concerns the results of a computational study of the dynamics of atmospheric pollutant (dioxides of nitrogen, sulphur, etc.) concentrations in the atmosphere of an industrial city (Odessa), using methods of dynamical systems and chaos theory. Chaotic behaviour in the nitrogen dioxide and sulphurous anhydride concentration time series at several sites of the Odessa city is numerically investigated. As usual, to reconstruct the corresponding attractor, the time delay and embedding dimension are needed. The former is determined by the methods of the autocorrelation function and average mutual information, and the latter is calculated by means of a correlation dimension method and the algorithm of false nearest neighbours. Further, the spectrum of Lyapunov exponents, the Kaplan-Yorke dimension and the Kolmogorov entropy are computed. The existence of low-dimensional chaos in the time series of the atmospheric pollutant concentrations has been found.
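
    A minimal sketch of the first reconstruction step described above: estimating the embedding delay from the autocorrelation function of a (here synthetic) concentration series, and then building the delay-embedded attractor. The 1/e criterion and all parameter values are assumptions for illustration:

    ```python
    import numpy as np

    # Synthetic stand-in for a pollutant-concentration time series.
    rng = np.random.default_rng(1)
    x = np.sin(0.1 * np.arange(2000)) + 0.3 * rng.standard_normal(2000)

    def autocorr(series, lag):
        """Normalized autocorrelation at a given positive lag."""
        s = series - series.mean()
        return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

    # Pick the delay as the first lag where autocorrelation drops below 1/e.
    delay = next(tau for tau in range(1, 200) if autocorr(x, tau) < 1 / np.e)
    print("embedding delay:", delay)

    # Delay embedding: rows are (x_t, x_{t+tau}, ..., x_{t+(m-1)tau}).
    m = 4
    emb = np.column_stack(
        [x[i * delay : len(x) - (m - 1 - i) * delay] for i in range(m)]
    )
    print(emb.shape)
    ```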

  18. Computational Study on Atomic Structures, Electronic Properties, and Chemical Reactions at Surfaces and Interfaces and in Biomaterials

    Science.gov (United States)

    Takano, Yu; Kobayashi, Nobuhiko; Morikawa, Yoshitada

    2018-06-01

    Through computer simulations using atomistic models, it is becoming possible to calculate the atomic structures of localized defects or dopants in semiconductors, chemically active sites in heterogeneous catalysts, nanoscale structures, and active sites in biological systems precisely. Furthermore, it is also possible to clarify physical and chemical properties possessed by these nanoscale structures such as electronic states, electronic and atomic transport properties, optical properties, and chemical reactivity. It is sometimes quite difficult to clarify these nanoscale structure-function relations experimentally and, therefore, accurate computational studies are indispensable in materials science. In this paper, we review recent studies on the relation between local structures and functions for inorganic, organic, and biological systems by using atomistic computer simulations.

  19. Transabdominal ultrasonography, computed tomography and electronic portal imaging for 3-dimensional conformal radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Jereczek-Fossa, B.A.; Orecchia, R.; Cattani, F.; Garibaldi, C.; Cambria, R.; Valenti, M.; Ciocca, M.; Zerini, D.; Boboc, G.I.; Vavassori, A.; Ivaldi, G.B.; Kowalczyk, A.; Matei, D.V.; Cobelli, O. de

    2007-01-01

    Purpose: To evaluate the feasibility and accuracy of daily B-mode acquisition and targeting ultrasound-based prostate localization (BAT™) and to compare it with computed tomography (CT) and electronic portal imaging (EPI) in 3-dimensional conformal radiotherapy (3-D CRT) for prostate cancer. Patients and Methods: Ten patients were treated with 3-D CRT (72 Gy/30 fractions, 2.4 Gy/fraction, equivalent to 80 Gy/40 fractions for an α/β ratio of 1.5 Gy) and daily BAT-based prostate localization. For the first 5 fractions, CT and EPI were also performed in order to compare organ motion and set-up error, respectively. Results: 287 BAT, 50 CT and 46 EPI alignments were performed. The average BAT-determined misalignments in the latero-lateral, antero-posterior and cranio-caudal directions were -0.9 mm ± 3.3 mm, 1.0 mm ± 4.0 mm and -0.9 mm ± 3.8 mm, respectively. The differences between BAT- and CT-determined organ motion in the latero-lateral, antero-posterior and cranio-caudal directions were 2.7 mm ± 1.9 mm, 3.9 ± 2.8 mm and 3.4 ± 3.0 mm, respectively. A weak correlation was found between BAT- and CT-determined misalignments in the antero-posterior direction, while no correlation was observed in the latero-lateral and cranio-caudal directions. The correlation was more significant when only data from good image-quality patients were analyzed (8 patients). Conclusion: BAT ensures that the relative positions of the target are the same during treatment as in the treatment plan; however, the reliability of the alignment is patient-dependent. The average BAT-determined misalignments were small, confirming the prevalence of random errors in 3-D CRT. Further study is warranted in order to establish the clinical value of BAT. (orig.)

  20. Electronic nature of zwitterionic alkali metal methanides, silanides and germanides - a combined experimental and computational approach.

    Science.gov (United States)

    Li, H; Aquino, A J A; Cordes, D B; Hase, W L; Krempner, C

    2017-02-01

    Zwitterionic group 14 complexes of the alkali metals of formula [C(SiMe2OCH2CH2OMe)3M] (M-1), [Si(SiMe2OCH2CH2OMe)3M] (M-2), and [Ge(SiMe2OCH2CH2OMe)3M] (M-3), where M = Li, Na or K, have been prepared and structurally characterized, and their electronic nature was investigated by computational methods. Zwitterions M-2 and M-3 were synthesized via reactions of [Si(SiMe2OCH2CH2OMe)4] (2) and [Ge(SiMe2OCH2CH2OMe)4] (3) with MOBut (M = Li, Na or K), respectively, in almost quantitative yields, while M-1 were prepared by deprotonation of [HC(SiMe2OCH2CH2OMe)3] (1) with LiBut, NaCH2Ph and KCH2Ph, respectively. X-ray crystallographic studies and gas-phase DFT calculations, including calculations of the NPA charges, confirm the zwitterionic nature of these compounds, with the alkali metal cations being rigidly locked and charge-separated from the anion by the internal OCH2CH2OMe donor groups. Natural bond orbital (NBO) analysis and second-order perturbation theory analysis of the NBOs reveal significant hyperconjugative interactions in M-1-M-3, primarily between the lone pair and the antibonding Si-O orbitals, the extent of which decreases in the order M-1 > M-2 > M-3. The experimental basicities and the calculated gas-phase basicities of M-1-M-3 reveal the zwitterionic alkali metal methanides M-1 to be significantly stronger bases than the analogous silanides M-2 and germanides M-3.

  1. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
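
    For readers unfamiliar with the multislice algorithm the package implements, a single-configuration sketch of its core loop (transmit through a phase grating, then Fresnel-propagate to the next slice) is given below. The grid size, wavelength, pixel size, and vacuum slices are placeholders, and nothing here reflects STEMsalabim's actual interfaces.

        import numpy as np

        def fresnel_propagator(n, px, wavelength, dz):
            """Reciprocal-space Fresnel propagator for one slice of thickness dz (units: A)."""
            k = np.fft.fftfreq(n, d=px)                  # spatial frequencies (1/A)
            kx, ky = np.meshgrid(k, k, indexing="ij")
            return np.exp(-1j * np.pi * wavelength * dz * (kx**2 + ky**2))

        def multislice(probe, phase_slices, px, wavelength, dz):
            """Alternate transmission through phase gratings and Fresnel propagation."""
            psi = probe.astype(complex)
            prop = fresnel_propagator(probe.shape[0], px, wavelength, dz)
            for sigma_v in phase_slices:                 # sigma_v = interaction const * projected potential
                psi = np.fft.ifft2(np.fft.fft2(psi * np.exp(1j * sigma_v)) * prop)
            return psi

        n = 256
        probe = np.ones((n, n)) / n                      # plane-wave stand-in for a STEM probe
        slices = [np.zeros((n, n)) for _ in range(10)]   # vacuum slices as placeholders
        exit_wave = multislice(probe, slices, px=0.1, wavelength=0.0197, dz=2.0)
        cbed = np.abs(np.fft.fft2(exit_wave))**2         # diffraction-plane intensity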

  2. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  3. Use of electronic computers for processing of spectrometric data in instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various ways of connecting the equipment to the computer are presented. The possibility of using a mini-computer both as an analyser and for the preliminary processing of gamma spectra is considered

  4. Comparison of Property-Oriented Basis Sets for the Computation of Electronic and Nuclear Relaxation Hyperpolarizabilities.

    Science.gov (United States)

    Zaleśny, Robert; Baranowska-Łączkowska, Angelika; Medveď, Miroslav; Luis, Josep M

    2015-09-08

    In the present work, we perform an assessment of several property-oriented atomic basis sets in computing (hyper)polarizabilities with a focus on the vibrational contributions. Our analysis encompasses the Pol and LPol-ds basis sets of Sadlej and co-workers, the def2-SVPD and def2-TZVPD basis sets of Rappoport and Furche, and the ORP basis set of Baranowska-Łączkowska and Łączkowski. Additionally, we use the d-aug-cc-pVQZ and aug-cc-pVTZ basis sets of Dunning and co-workers to determine the reference estimates of the investigated electric properties for small- and medium-sized molecules, respectively. We combine these basis sets with ab initio post-Hartree-Fock quantum-chemistry approaches (including the coupled cluster method) to calculate electronic and nuclear relaxation (hyper)polarizabilities of carbon dioxide, formaldehyde, cis-diazene, and a medium-sized Schiff base. The primary finding of our study is that, among all studied property-oriented basis sets, only the def2-TZVPD and ORP basis sets yield nuclear relaxation (hyper)polarizabilities of small molecules with average absolute errors less than 5.5%. A similar accuracy for the nuclear relaxation (hyper)polarizabilities of the studied systems can also be reached using the aug-cc-pVDZ basis set (5.3%), although for more accurate calculations of vibrational contributions, i.e., average absolute errors less than 1%, the aug-cc-pVTZ basis set is recommended. It was also demonstrated that anharmonic contributions to first and second hyperpolarizabilities of a medium-sized Schiff base are particularly difficult to accurately predict at the correlated level using property-oriented basis sets. For instance, the value of the nuclear relaxation first hyperpolarizability computed at the MP2/def2-TZVPD level of theory is roughly 3 times larger than that determined using the aug-cc-pVTZ basis set. We link the failure of the def2-TZVPD basis set with the difficulties in predicting the first-order field

  5. Molluscan Evolutionary Genomics

    Energy Technology Data Exchange (ETDEWEB)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  6. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Apr 2, 2016 ... Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. ... discovered and optimised using a large-scale parallel computational device and relevant soft- ...

  7. Origins of evolutionary transitions

    Indian Academy of Sciences (India)

    2014-03-15

    Mar 15, 2014 ... ... of events: 'Entities that were capable of independent replication ... There have been many major evolutionary events that this definition of .... selection at level x to exclusive selection at x – will probably require a multiplicity ...

  8. Evolutionary relationships among Astroviridae

    NARCIS (Netherlands)

    Lukashov, Vladimir V.; Goudsmit, Jaap

    2002-01-01

    To study the evolutionary relationships among astroviruses, all available sequences for members of the family Astroviridae were collected. Phylogenetic analysis distinguished two deep-rooted groups: one comprising mammalian astroviruses, with ovine astrovirus being an outlier, and the other

  9. A model and computer code for the Monte Carlo simulation of relativistic electron and positron penetration through matter

    International Nuclear Information System (INIS)

    Ismail, M.; Liljequist, D.

    1986-10-01

    In the present model, the treatment of elastic scattering is based on the similarity of multiple scattering processes with equal transport mean free path Λtr. Elastic scattering events are separated by an artificially enlarged mean free path. In such events, scattering is optionally performed either by means of a single, energy-dependent scattering angle, or by means of a scattering angle distribution of the same form as the screened Rutherford cross section, but with an artificial screening factor. The physically correct Λtr value is obtained by appropriate choice of the scattering angle or the screening factor, respectively. We find good agreement with experimental transmission and with energy loss distributions. The Rutherford-like model gives good agreement with experimental angular distributions even for the penetration of very thin layers. The treatment of electron energy loss is based on the partial CSDA method: energy losses W > WMINSE are treated as discrete electron-electron or positron-electron scattering events; similarly, bremsstrahlung photon energies W > WMINR are treated as discrete events. The sensitivity of the model to the parameters WMINSE and WMINR is studied. WMINR can, in practice, be made negligibly small, and WMINSE can, without excessive computer time, be made small enough to give results in good agreement with experiment and with computations based on the Landau theory of straggling. Using this model, we study some of the characteristic features of relativistic electron transmission: energy loss distributions, straggling, angular distributions and trajectories. (authors)
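
    The screened-Rutherford option described above admits a compact sampling routine. The sketch below draws scattering angles by inverse-CDF sampling from p(mu) proportional to (1 + 2*eta - mu)^-2 and checks the analytic mean against the sample; the screening factor value is an illustrative assumption, and in the model it would be tuned so that Λtr matches its physical value.

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_mu(eta, size):
            """Inverse-CDF sample of mu = cos(theta) from p(mu) ~ (1 + 2*eta - mu)**-2."""
            u = rng.random(size)
            return 1.0 - 2.0 * eta * (1.0 - u) / (u + eta)

        def mean_one_minus_mu(eta):
            """Analytic <1 - mu>, which fixes the ratio lambda / lambda_tr."""
            return 2.0 * eta * ((1.0 + eta) * np.log(1.0 + 1.0 / eta) - 1.0)

        eta = 0.05                                       # illustrative screening factor
        mu = sample_mu(eta, 100_000)
        print(mu.mean(), 1.0 - mean_one_minus_mu(eta))   # sampled vs analytic <mu>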

  10. Evolutionary Multiplayer Games

    OpenAIRE

    Gokhale, Chaitanya S.; Traulsen, Arne

    2014-01-01

    Evolutionary game theory has become one of the most diverse and far reaching theories in biology. Applications of this theory range from cell dynamics to social evolution. However, many applications make it clear that inherent non-linearities of natural systems need to be taken into account. One way of introducing such non-linearities into evolutionary games is by the inclusion of multiple players. An example is social dilemmas, where group benefits could, e.g., increase less than linearly wi...

  11. x-y-recording in transmission electron microscopy. A versatile and inexpensive interface to personal computers with application to stereology.

    Science.gov (United States)

    Rickmann, M; Siklós, L; Joó, F; Wolff, J R

    1990-09-01

    An interface for IBM XT/AT-compatible computers is described which has been designed to read the actual specimen stage position of electron microscopes. The complete system consists of (i) optical incremental encoders attached to the x- and y-stage drivers of the microscope, (ii) two keypads for operator input, (iii) an interface card fitted to the bus of the personal computer, (iv) a standard configuration IBM XT (or compatible) personal computer optionally equipped with (v) an HP Graphic Language controllable colour plotter. The small size of the encoders and their connection to the stage drivers by simple ribbed belts allow easy adaptation of the system to most electron microscopes. Operation of the interface card itself is supported by any high-level language available for personal computers. Owing to the modular concept of these languages, the system can be customized to various applications, and no computer expertise is needed for actual operation. The present configuration offers an inexpensive attachment which covers a wide range of applications, from a simple notebook to high-resolution (200-nm) mapping of tissue. Since section coordinates can be processed in real time, stereological estimates can be derived directly "on the microscope". This is exemplified by an application in which particle numbers were determined by the disector method.

  12. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications

    Science.gov (United States)

    Zhao, Yu; Liu, Yide; Lai, Ivan K. W.; Zhang, Hongfeng; Zhang, Yi

    2016-01-01

    As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human–computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users’ compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user’s compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships. PMID:26999155

  13. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications

    Directory of Open Access Journals (Sweden)

    Yu Zhao

    2016-03-01

    Full Text Available As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human–computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users’ compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user’s compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.

  14. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications.

    Science.gov (United States)

    Zhao, Yu; Liu, Yide; Lai, Ivan K W; Zhang, Hongfeng; Zhang, Yi

    2016-03-18

    As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users' compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user's compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.
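
    The mediation claim in these reports can be illustrated with a toy regression analysis. The sketch below generates synthetic data for the three constructs (compound attitude, engagement, eWOM) with the survey's sample size and runs simple Baron-Kenny-style OLS steps; the data, effect sizes, and estimation approach are assumptions for illustration, not the study's structural model.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 313                                    # sample size matching the survey
        attitude = rng.normal(size=n)
        engagement = 0.6 * attitude + rng.normal(scale=0.8, size=n)
        ewom = 0.5 * engagement + 0.1 * attitude + rng.normal(scale=0.8, size=n)

        def ols(y, *cols):
            """Least-squares coefficients [intercept, b1, b2, ...]."""
            X = np.column_stack([np.ones(len(y)), *cols])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta

        c_total = ols(ewom, attitude)[1]                     # total effect X -> Y
        a = ols(engagement, attitude)[1]                     # path X -> M
        b, c_direct = ols(ewom, engagement, attitude)[1:3]   # paths M -> Y and X -> Y
        print(f"total={c_total:.2f}  indirect={a * b:.2f}  direct={c_direct:.2f}")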

  15. Computational algorithms for analysis of data from thin-film thermoresistors on a radio-electronic printed circuit board

    International Nuclear Information System (INIS)

    Korneeva, Anna; Shaydurov, Vladimir

    2016-01-01

    In the paper, data analysis is considered for thin-film thermoresistors coated onto a radio-electronic printed circuit board to determine possible zones of its overheating. The mathematical model consists of an underdetermined system of linear algebraic equations with an infinite set of solutions. To compute a more realistic solution, two additional conditions are used: the smoothness of the solution and the positivity of the temperature increase during overheating. Computational experiments demonstrate that an overheating zone is located exactly, with tolerable accuracy in its temperature.
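
    One plausible realization of the two additional conditions is sketched below: the underdetermined system A x = b is augmented with a scaled second-difference (smoothness) penalty and solved under a nonnegativity constraint with SciPy's NNLS. The matrix sizes, data, and regularization weight are placeholders, not the paper's formulation.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(2)
        m, n_cells, lam = 12, 40, 0.5      # 12 sensor equations, 40 surface cells
        A = rng.random((m, n_cells))       # stand-in sensitivity matrix
        b = rng.random(m)                  # stand-in thermoresistor readings

        # Second-difference operator penalizing rough temperature profiles.
        D = np.zeros((n_cells - 2, n_cells))
        for i in range(n_cells - 2):
            D[i, i : i + 3] = [1.0, -2.0, 1.0]

        A_aug = np.vstack([A, lam * D])
        b_aug = np.concatenate([b, np.zeros(n_cells - 2)])
        x, resid = nnls(A_aug, b_aug)      # x >= 0: nonnegative temperature increase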

  16. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it became possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter marked by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures and try to highlight the role and the emergence of network structure in such systems.

  17. Computational Biophysical, Biochemical, and Evolutionary Signature of Human R-Spondin Family Proteins, the Member of Canonical Wnt/β-Catenin Signaling Pathway

    Directory of Open Access Journals (Sweden)

    Ashish Ranjan Sharma

    2014-01-01

    Full Text Available In human, the Wnt/β-catenin signaling pathway plays a significant role in cell growth, cell development, and disease pathogenesis. Four human R-spondins (Rspos) are known to activate the canonical Wnt/β-catenin signaling pathway. Presently, Rspos serve as therapeutic targets for several human diseases. Henceforth, a basic understanding of the molecular properties of Rspos is essential. We approached this issue by interpreting the biochemical and biophysical properties, along with the molecular evolution, of Rspos through computational algorithmic methods. Our analysis shows that the signal peptide length is roughly similar across the Rspo family, along with a similar amino acid (aa) distribution pattern. In Rspo3, four N-glycosylation sites were noted. All members are hydrophilic in nature and showed approximately similar GRAVY values. Conversely, Rspo3 contains the most positively charged residues while Rspo4 includes the fewest. Four highly aligned blocks were recorded through Gblocks. Phylogenetic analysis shows Rspo4 is rooted with Rspo2, and similarly Rspo3 and Rspo1 have a common point of origin. Through a phylogenomics study, we developed a phylogenetic tree of sixty proteins (n=60) with the ortholog and paralog seed sequences. A protein-protein network was also illustrated. The results demonstrated in our study may help future researchers to unfold significant physiological and therapeutic properties of Rspos in various disease models.

  18. Facilitating the design and operation of computer-controlled radiochemistry synthesizers with an "Electronic Toolbox"

    International Nuclear Information System (INIS)

    Feliu, A.L.

    1991-01-01

    Positron emission tomography (PET) is a non-invasive diagnostic imaging technique requiring rapid and reliable radiopharmaceutical production. Automated systems offer a host of potential advantages over manually or remotely operated apparatus, including reduced personnel requirements, lower radiation exposure to personnel, reliable yields, and reproducible product purity. However, the burden of routine radiopharmaceutical production most often remains a labor-intensive responsibility of highly trained radiochemists. In order to ease the transition between manual, remote-controlled, and computer-controlled radiochemical synthesis, an electronic toolbox with a graphical user interface was developed as a generic process control system compatible with a variety of common radiochemical operations. This work specifically aims to make automated techniques more accessible by emphasizing the similarities between manual and automated chemistry and by minimizing the computer programming effort required. This paper discusses the structural elements of the electronic toolbox approach to radiochemistry process control, and its ramifications for the designers and end-users of automated synthesizers

  19. Computational details of the Monte Carlo simulation of proton and electron tracks

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1983-01-01

    The code PROTON simulates the elastic and nonelastic interactions of protons and electrons in water vapor. In this paper, the treatment of elastic angular scattering of electrons as utilized in PROTON is described and compared with alternate formalisms. The sensitivity of the calculation to different treatments of this process is examined in terms of proximity functions of energy deposition. 5 figures

  20. Stretchable, Twisted Conductive Microtubules for Wearable Computing, Robotics, Electronics, and Healthcare.

    Science.gov (United States)

    Do, Thanh Nho; Visell, Yon

    2017-05-11

    Stretchable and flexible multifunctional electronic components, including sensors and actuators, have received increasing attention in robotics, electronics, wearable, and healthcare applications. Despite advances, it has remained challenging to design analogs of many electronic components to be highly stretchable, to be efficient to fabricate, and to provide control over electronic performance. Here, we describe highly elastic sensors and interconnects formed from thin, twisted conductive microtubules. These devices consist of twisted assemblies of thin, highly stretchable (>400%) elastomer tubules filled with liquid conductor (eutectic gallium indium, EGaIn), and fabricated using a simple roller coating process. As we demonstrate, these devices can operate as multimodal sensors for strain, rotation, contact force, or contact location. We also show that, through twisting, it is possible to control their mechanical performance and electronic sensitivity. In extensive experiments, we have evaluated the capabilities of these devices, and have prototyped an array of applications in several domains of stretchable and wearable electronics. These devices provide a novel, low cost solution for high performance stretchable electronics with broad applications in industry, healthcare, and consumer electronics, to emerging product categories of high potential economic and societal significance.

  1. Computer simulations of upper-hybrid and electron cyclotron resonance heating

    International Nuclear Information System (INIS)

    Lin, A.T.; Lin, C.C.

    1983-01-01

    A 2½-dimensional relativistic electromagnetic particle code is used to investigate the dynamic behavior of electron heating around the electron cyclotron and upper-hybrid layers when an extraordinary wave is obliquely launched from the high-field side into a magnetized plasma. With a large angle of incidence most of the radiation wave energy converts into electrostatic electron Bernstein waves at the upper-hybrid layer. These mode-converted waves propagate back to the cyclotron layer and deposit their energy in the electrons through resonant interactions dominated first by the Doppler broadening and later by the relativistic mass correction. The line shape for both mechanisms has been observed in the simulations. At a later stage, the relativistic resonance effects shift the peak of the temperature profile to the high-field side. The heating ultimately causes the extraordinary wave to be substantially absorbed by the high-energy electrons. The steep temperature gradient created by the electron cyclotron heating eventually reflects a substantial part of the incident wave energy. The diamagnetic effects due to the gradient of the mode-converted Bernstein wave pressure enhance the spreading of the electron heating from the original electron cyclotron layer
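
    Particle simulations of this kind rest on a relativistic particle pusher. As a point of reference, here is a minimal Boris-scheme step for an electron in prescribed E and B fields; the field values and time step are placeholders, and no wave fields or mode-conversion physics are included.

        import numpy as np

        C = 299_792_458.0                  # speed of light (m/s)
        QM = -1.75882001076e11             # electron charge-to-mass ratio (C/kg)

        def boris_step(x, u, E, B, dt):
            """One relativistic Boris step; u = gamma * v is the reduced momentum."""
            h = 0.5 * QM * dt
            u_minus = u + h * E                              # first half electric kick
            gamma = np.sqrt(1.0 + np.dot(u_minus, u_minus) / C**2)
            t = h * B / gamma                                # magnetic rotation vector
            s = 2.0 * t / (1.0 + np.dot(t, t))
            u_plus = u_minus + np.cross(u_minus + np.cross(u_minus, t), s)
            u_new = u_plus + h * E                           # second half electric kick
            gamma = np.sqrt(1.0 + np.dot(u_new, u_new) / C**2)
            return x + dt * u_new / gamma, u_new

        x, u = np.zeros(3), np.array([1.0e7, 0.0, 0.0])      # mildly relativistic electron
        for _ in range(1000):                                # gyration in a uniform field
            x, u = boris_step(x, u, np.zeros(3), np.array([0.0, 0.0, 1.0]), dt=1.0e-12)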

  2. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist acts prevention measures and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability and efficiency related issues such as reference database search speed, errors while recognizing biometric information or automating biometric feature extraction. Current scientific investigations show that the application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done in the sphere of evolutionary algorithm application for biometric system parameter improvement.
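
    As a concrete example of the kind of application surveyed, the sketch below uses a tiny genetic algorithm to select a biometric feature subset. The stand-in data, fitness function, and GA parameters are illustrative assumptions rather than any method from the reviewed literature.

        import numpy as np

        rng = np.random.default_rng(3)
        n_feat, pop_size, gens = 32, 20, 50
        X = rng.normal(size=(200, n_feat))              # stand-in biometric features
        y = (X[:, :4].sum(axis=1) > 0).astype(float)    # labels driven by 4 features

        def fitness(mask):
            """Reward label correlation of the kept features, penalize subset size."""
            if mask.sum() == 0:
                return -1.0
            corr = np.corrcoef(X[:, mask.astype(bool)].mean(axis=1), y)[0, 1]
            return abs(corr) - 0.01 * mask.sum()

        pop = rng.integers(0, 2, size=(pop_size, n_feat))
        for _ in range(gens):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)][pop_size // 2 :]      # keep the better half
            cuts = rng.integers(1, n_feat, size=pop_size // 2)
            kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]])
                             for i, c in enumerate(cuts)])          # one-point crossover
            kids ^= (rng.random(kids.shape) < 0.02).astype(kids.dtype)  # bit-flip mutation
            pop = np.vstack([parents, kids])
        best = pop[np.argmax([fitness(ind) for ind in pop])]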

  3. Langley's CSI evolutionary model: Phase O

    Science.gov (United States)

    Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) are documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real-time computers, as well as finite element and state space models of major components.

  4. Genomes, Phylogeny, and Evolutionary Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Monica

    2005-03-25

    With the completion of the human genome and the growing number of diverse genomes being sequenced, a new age of evolutionary research is currently taking shape. The myriad of technological breakthroughs in biology that are leading to the unification of broad scientific fields such as molecular biology, biochemistry, physics, mathematics and computer science are now known as systems biology. Here I present an overview, with an emphasis on eukaryotes, of how the postgenomics era is adopting comparative approaches that go beyond comparisons among model organisms to shape the nascent field of evolutionary systems biology.

  5. A Multistep Maturity Model for the Implementation of Electronic and Computable Diagnostic Clinical Prediction Rules (eCPRs).

    Science.gov (United States)

    Corrigan, Derek; McDonnell, Ronan; Zarabzadeh, Atieh; Fahey, Tom

    2015-01-01

    The use of Clinical Prediction Rules (CPRs) has been advocated as one way of implementing actionable evidence-based rules in clinical practice. The current highly manual nature of deriving CPRs makes them difficult to use and maintain. Addressing the known limitations of CPRs requires implementing more flexible and dynamic models of CPR development. We describe the application of Information and Communication Technology (ICT) to provide a platform for the derivation and dissemination of CPRs derived through analysis and continual learning from electronic patient data. We propose a multistep maturity model for constructing electronic and computable CPRs (eCPRs). The model has six levels, from the lowest level of CPR maturity (literature-based CPRs) to a fully electronic and computable service-oriented model of CPRs that are sensitive to specific demographic patient populations. We describe examples of implementations of the core model components, focusing on CPR representation, interoperability, electronic dissemination, CPR learning, and user interface requirements. The traditional focus on derivation and narrow validation of CPRs has severely limited their wider acceptance. The evolution and maturity model described here outlines a progression toward eCPRs consistent with the vision of a learning health system (LHS): using central repositories of CPR knowledge, accessible open standards, and generalizable models to avoid repetition of previous work. This is useful for developing more ambitious strategies to address the limitations of the traditional CPR development life cycle. The model described here is a starting point for promoting discussion about what a more dynamic CPR development process should look like.

  6. Electron Fermi acceleration in collapsing magnetic traps: Computational and analytical models

    International Nuclear Information System (INIS)

    Gisler, G.; Lemons, D.

    1990-01-01

    The authors consider the heating and acceleration of electrons trapped on magnetic field lines between approaching magnetic mirrors. Such a collapsing magnetic trap and consequent electron energization can occur whenever a curved (or straight) flux tube drifts into a relatively straight (or curved) perpendicular shock. The relativistic, three-dimensional, collisionless test particle simulations show that an initial thermal electron distribution is bulk heated while a few individual electrons are accelerated to many times their original energy before they escape the trap. Upstream field-aligned beams and downstream pancake distributions perpendicular to the field are predicted. In the appropriate limit the simulation results agree well with a nonrelativistic analytic model of the distribution of escaping electrons which is based on the first adiabatic invariant and energy conservation between collisions with the mirrors. Space science and astrophysical applications are discussed

  7. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been...
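
    Since the thesis builds on NSGA-II, a compact sketch of that algorithm's central ingredient, fast non-dominated sorting of a population by its objective vectors (minimization convention), is given below; it follows the published algorithm and is not code from the thesis.

        import numpy as np

        def non_dominated_sort(F):
            """F: (n, m) objective matrix, minimization. Returns a list of fronts."""
            n = len(F)
            dominates = [[] for _ in range(n)]     # indices that individual i dominates
            counts = np.zeros(n, dtype=int)        # how many individuals dominate i
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                        dominates[i].append(j)
                    elif np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                        counts[i] += 1
            fronts, current = [], [i for i in range(n) if counts[i] == 0]
            while current:
                fronts.append(current)
                nxt = []
                for i in current:
                    for j in dominates[i]:
                        counts[j] -= 1
                        if counts[j] == 0:
                            nxt.append(j)
                current = nxt
            return fronts

        F = np.random.default_rng(4).random((8, 2))     # random 2-objective population
        print(non_dominated_sort(F))                    # front 0 is the Pareto set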

  8. Ab initio computation of electron affinities of substituted benzalacetophenones (chalcones): a new approach to substituent effects in organic electrochemistry

    International Nuclear Information System (INIS)

    Hicks, Latorya D.; Fry, Albert J.; Kurzweil, Vanessa C.

    2004-01-01

    The electron affinities (EAs) of a training set of 29 monosubstituted benzalacetophenones (chalcones) were computed at the ab initio density functional B3LYP/6-31G* level of theory. The EAs and experimental reduction potentials of the training set are highly linearly correlated (correlation coefficient of 0.969 and standard deviation of 10.8 mV). An additional 72 di-, tri-, and tetrasubstituted chalcones were then synthesized. Their reduction potentials were predicted from computed EAs using the linear correlation derived from the training set. Agreement between the experimental and computed reduction potentials is remarkably good, with a standard deviation of less than 22 mV for this very large set of substances whose potentials extend over a range of almost 700 mV
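
    The training-set correlation step amounts to a one-variable linear fit. The sketch below fits reduction potential against computed EA and predicts the potential of a new compound; the numerical values are synthetic placeholders, not the paper's data.

        import numpy as np

        # Synthetic stand-ins for computed EAs (eV) and measured potentials (V).
        ea = np.array([1.10, 1.18, 1.25, 1.31, 1.40])
        e_red = np.array([-1.52, -1.44, -1.37, -1.31, -1.22])

        slope, intercept = np.polyfit(ea, e_red, 1)      # linear calibration
        pred = slope * ea + intercept
        r = np.corrcoef(e_red, pred)[0, 1]               # correlation coefficient
        sd_mv = 1000.0 * np.std(e_red - pred)            # residual SD in mV
        new_ea = 1.35                                    # hypothetical new chalcone
        print(f"r={r:.3f}  sd={sd_mv:.1f} mV  E_pred={slope * new_ea + intercept:.3f} V")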

  9. Proteomics in evolutionary ecology.

    Science.gov (United States)

    Baer, B; Millar, A H

    2016-03-01

    Evolutionary ecologists are traditionally gene-focused, as genes propagate phenotypic traits across generations and mutations and recombination in the DNA generate genetic diversity required for evolutionary processes. As a consequence, the inheritance of changed DNA provides a molecular explanation for the functional changes associated with natural selection. A direct focus on proteins on the other hand, the actual molecular agents responsible for the expression of a phenotypic trait, receives far less interest from ecologists and evolutionary biologists. This is partially due to the central dogma of molecular biology that appears to define proteins as the 'dead-end of molecular information flow' as well as technical limitations in identifying and studying proteins and their diversity in the field and in many of the more exotic genera often favored in ecological studies. Here we provide an overview of a newly forming field of research that we refer to as 'Evolutionary Proteomics'. We point out that the origins of cellular function are related to the properties of polypeptide and RNA and their interactions with the environment, rather than DNA descent, and that the critical role of horizontal gene transfer in evolution is more about coopting new proteins to impact cellular processes than it is about modifying gene function. Furthermore, post-transcriptional and post-translational processes generate a remarkable diversity of mature proteins from a single gene, and the properties of these mature proteins can also influence inheritance through genetic and perhaps epigenetic mechanisms. The influence of post-transcriptional diversification on evolutionary processes could provide a novel mechanistic underpinning for elements of rapid, directed evolutionary changes and adaptations as observed for a variety of evolutionary processes. Modern state-of the art technologies based on mass spectrometry are now available to identify and quantify peptides, proteins, protein

  10. EMRlog method for computer security for electronic medical records with logic and data mining.

    Science.gov (United States)

    Martínez Monterrubio, Sergio Mauricio; Frausto Solis, Juan; Monroy Borja, Raúl

    2015-01-01

    The proper functioning of a hospital computer system is an arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of the entire or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed for achieving a safer computer system.

  11. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    Science.gov (United States)

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  12. EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining

    Directory of Open Access Journals (Sweden)

    Sergio Mauricio Martínez Monterrubio

    2015-01-01

    Full Text Available The proper functioning of a hospital computer system is an arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of the entire or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed for achieving a safer computer system.

  13. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  14. Applying Evolutionary Anthropology

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. PMID:25684561

  15. Experimental evaluation of quantum computing elements (qubits) made of electrons trapped over a liquid helium film

    International Nuclear Information System (INIS)

    Rousseau, E.

    2006-12-01

    An electron on helium presents a quantized energy spectrum. The interaction with the environment is considered sufficiently weak to allow the realization of a quantum bit (qubit) using the first two energy levels. The first stage in the realization of this qubit was to trap and control a single electron. This is carried out thanks to a set of micro-fabricated electrodes defining a potential well in which the electron is trapped. With such a sample we are able to trap and detect a variable number of electrons, between one and around twenty. This allowed us to study the static behaviour of a small number of electrons in a trap. They are expected to crystallize and form structures called Wigner molecules. Such molecules have not yet been observed with electrons above helium. Our results bring circumstantial evidence for Wigner crystallization. We then sought to characterize the qubit more precisely: to carry out a projective reading (depending on the state of the qubit) and a measurement of the relaxation time. The results were obtained by exciting the electron with an incoherent electric field; a clean measurement of the relaxation time would require a coherent electric field. The conclusion thus cannot be final, but it would seem that the relaxation time is shorter than calculated theoretically. That is perhaps due to a measurement of the relaxation between the oscillating states in the trap and not between the states of the qubit. (author)

  16. IMACS '91: Proceedings of the IMACS World Congress on Computation and Applied Mathematics (13th) Held in Dublin, Ireland on July 22-26, 1991. Volume 2. Computational Fluid Dynamics and Wave Propagation, Parallel Computing, Concurrent and Supercomputing, Computational Physics/Computational Chemistry and Evolutionary Systems

    Science.gov (United States)

    1991-01-01

    [Abstract text garbled in source. Recoverable fragments: a remark on allowing a small network to grow during early training; an apparent citation of Press, Flannery, Teukolsky and Vetterling; and a contribution from Tecnologia Fotónica, ETSI Telecomunicación, Ciudad Universitaria, 28040 Madrid, Spain, on the modelling of ferroelectric liquid crystals and their optical properties.]

  17. Archaeogenetics in evolutionary medicine.

    Science.gov (United States)

    Bouwman, Abigail; Rühli, Frank

    2016-09-01

    Archaeogenetics is the study of ancient DNA (aDNA), that is, DNA more than 70 years old. It is an important part of the wider study of many different areas of our past, including animal, plant and pathogen evolution and domestication events. Here, we address specifically the impact of research in archaeogenetics on the broader field of evolutionary medicine. Studies on ancient hominid genomes help to understand even modern health patterns. Human genetic microevolution, e.g. related to the ability of post-weaning milk consumption, and specifically genetic adaptation in disease susceptibility, e.g. towards malaria and other infectious diseases, are of the utmost importance in the contribution of archaeogenetics to the evolutionary understanding of human health and disease. With the increase in both the understanding of modern medical genetics and the ability to deep sequence ancient genetic information, the field of archaeogenetic evolutionary medicine is blossoming.

  18. Paper versus computer: Feasibility of an electronic medical record in general pediatrics

    NARCIS (Netherlands)

    J. Roukema (Jolt); R.K. Los (Renske); S.E. Bleeker (Sacha); A.M. van Ginneken (Astrid); J. van der Lei (Johan); H.A. Moll (Henriëtte)

    2006-01-01

    textabstractBACKGROUND. Implementation of electronic medical record systems promises significant advances in patient care, because such systems enhance readability, availability, and data quality. Structured data entry (SDE) applications can prompt for completeness, provide greater accuracy and

  19. Electron transport parameters in CO2: scanning drift tube measurements and kinetic computations

    OpenAIRE

    Vass, M.; Korolov, I.; Loffhagen, D.; Pinhao, N.; Donko, Z.

    2016-01-01

    This work presents transport coefficients of electrons (bulk drift velocity, longitudinal diffusion coefficient, and effective ionization frequency) in CO2 measured under time-of-flight conditions over a wide range of the reduced electric field, 15Td

  20. Efficient Computation of Coherent Synchrotron Radiation Taking into Account 6D Phase Space Distribution of Emitting Electrons

    International Nuclear Information System (INIS)

    Chubar, O.; Couprie, M.-E.

    2007-01-01

    A CPU-efficient method for calculating the frequency-domain electric field of Coherent Synchrotron Radiation (CSR), taking into account the 6D phase space distribution of electrons in a bunch, is proposed. As an application example, calculation results are presented for the CSR emitted by an electron bunch with small longitudinal and large transverse sizes. Such a situation can be realized in storage rings or ERLs by transverse deflection of the electron bunches in special crab-type RF cavities, i.e. using the technique proposed for the generation of femtosecond X-ray pulses (A. Zholents et al., 1999). The computation, performed for the parameters of the SOLEIL storage ring, shows that if the transverse size of the electron bunch is larger than the diffraction limit for single-electron SR at a given wavelength, the angular distribution of the CSR at this wavelength is affected and the coherent flux is reduced. Nevertheless, for transverse bunch dimensions up to several millimeters and a longitudinal bunch size smaller than a hundred micrometers, the resulting CSR flux in the far infrared spectral range is still many orders of magnitude higher than the flux of incoherent SR, and can therefore be considered for practical use.

  1. Literary and Electronic Hypertext: Borges, Criticism, Literary Research, and the Computer.

    Science.gov (United States)

    Davison, Ned J.

    1991-01-01

    Examines what "hypertext" means to literary criticism on the one hand (i.e., intertextuality) and computing on the other, to determine how the two concepts may serve each other in a mutually productive way. (GLR)

  2. A Novel Method for the Discrimination of Semen Arecae and Its Processed Products by Using Computer Vision, Electronic Nose, and Electronic Tongue

    Directory of Open Access Journals (Sweden)

    Min Xu

    2015-01-01

    Full Text Available Areca nut, commonly known locally as Semen Arecae (SA) in China, has been used as an important Chinese herbal medicine for thousands of years. The raw SA (RAW) is commonly processed by stir-baking to yellow (SBY), stir-baking to dark brown (SBD), and stir-baking to carbon dark (SBC) for different clinical uses. In our present investigation, intelligent sensory technologies consisting of computer vision (CV), electronic nose (E-nose), and electronic tongue (E-tongue) were employed in order to develop a novel and accurate method for discrimination of SA and its processed products. Firstly, the color parameters and electronic sensory responses of E-nose and E-tongue of the samples were determined, respectively. Then, indicative components including 5-hydroxymethyl furfural (5-HMF) and arecoline (ARE) were determined by HPLC. Finally, principal component analysis (PCA) and discriminant factor analysis (DFA) were performed. The results demonstrated that these three instruments can effectively discriminate SA and its processed products. 5-HMF and ARE can reflect the stir-baking degree of SA. Interestingly, the two components showed close correlations to the color parameters and sensory responses of E-nose and E-tongue. In conclusion, this novel method based on CV, E-nose, and E-tongue can be successfully used to discriminate SA and its processed products.
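
    The PCA step in such electronic-sensor discrimination work reduces the fused sensor matrix to a few components before classification. Below is a minimal SVD-based PCA sketch on a stand-in data matrix; the sample and feature counts are assumptions, not the study's dimensions.

        import numpy as np

        rng = np.random.default_rng(5)
        X = rng.normal(size=(40, 25))            # 40 samples x 25 fused sensor features

        Xc = X - X.mean(axis=0)                  # mean-center each feature
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = Xc @ Vt[:2].T                   # sample projections onto PC1 and PC2
        explained = (S**2 / (S**2).sum())[:2]    # variance ratio captured by PC1, PC2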

  3. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    Science.gov (United States)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review on electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of pulse current occurring in electrochemical machining of aviation materials have been studied. Based on integrating the experimental results and comprehensive electrochemical machining process data modeling, a subsystem for computer-aided design of electrochemical machining for gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. Computer control of the high-voltage power supply for the DIII-D electron cyclotron heating system

    International Nuclear Information System (INIS)

    Clow, D.D.; Kellman, D.H.

    1992-01-01

    This paper reports on the DIII-D Electron Cyclotron Heating (ECH) high voltage power supply, which is controlled by a computer. Operational control is input via keyboard and mouse, and the computer/power supply interface is accomplished with a Computer Assisted Monitoring and Control (CAMAC) system. User-friendly tools allow the design and layout of simulated control panels on the computer screen. Panel controls and indicators can be changed, added or deleted, and simple editing of user-specific processes can quickly modify control and fault logic. Databases can be defined, and control panel functions are easily referred to various data channels. User-specific processes are written and linked using Fortran, to manage control and data acquisition through CAMAC. The resulting control system has significant advantages over the hardware it emulates: changes in logic, layout, and function are quickly and easily incorporated; data storage, retrieval, and processing are flexible and simply accomplished; physical components subject to wear and degradation are minimized. In addition, the system can be expanded to multiplex control of several power supplies, each with its own database, through a single computer console

  6. Computer control of the high-voltage power supply for the DIII-D Electron Cyclotron Heating system

    International Nuclear Information System (INIS)

    Clow, D.D.; Kellman, D.H.

    1991-10-01

    The D3-D Electron Cyclotron Heating (ECH) high voltage power supply is controlled by a computer. Operational control is input via keyboard and mouse, and the computer/power supply interface is accomplished with a Computer Assisted Monitoring and Control (CAMAC) system. User-friendly tools allow the design and layout of simulated control panels on the computer screen. Panel controls and indicators can be changed, added or deleted, and simple editing of user-specific processes can quickly modify control and fault logic. Databases can be defined, and control panel functions are easily referred to various data channels. User-specific processes are written and linked using Fortran, to manage control and data acquisition through CAMAC. The resulting control system has significant advantages over the hardware it emulates: changes in logic, layout, and function are quickly and easily incorporated; data storage, retrieval, and processing are flexible and simply accomplished; physical components subject to wear and degradation are minimized. In addition, the system can be expanded to multiplex control of several power supplies, each with its own database, through a single computer and console. 5 refs., 4 figs., 1 tab

  7. Computational micromechanics analysis of electron hopping and interfacial damage induced piezoresistive response in carbon nanotube-polymer nanocomposites

    International Nuclear Information System (INIS)

    Chaurasia, A K; Seidel, G D; Ren, X

    2014-01-01

    Carbon nanotube (CNT)-polymer nanocomposites have been observed to exhibit an effective macroscale piezoresistive response, i.e., a change in macroscale resistivity when subjected to applied deformation. The macroscale piezoresistive response of CNT-polymer nanocomposites leads to deformation/strain sensing capabilities. It is believed that the nanoscale phenomenon of electron hopping is the major driving force behind the observed macroscale piezoresistivity of such nanocomposites. Additionally, CNT-polymer nanocomposites provide damage sensing capabilities because of local changes in electron hopping pathways at the nanoscale caused by the initiation/evolution of damage. The primary focus of the current work is to explore the effect of interfacial separation and damage at the nanoscale CNT-polymer interface on the effective macroscale piezoresistive response. Interfacial separation and damage are allowed to evolve at the CNT-polymer interface through coupled electromechanical cohesive zones, within a finite element based computational micromechanics framework, resulting in electron hopping based current density across the separated CNT-polymer interface. The macroscale effective material properties and gauge factors are evaluated using micromechanics techniques based on electrostatic energy equivalence. The impact of the electron hopping mechanism, nanoscale interface separation and damage evolution on the effective nanocomposite electrostatic and piezoresistive response is studied in comparison with the perfectly bonded interface. The effective electrostatic/piezoresistive response for the perfectly bonded interface is obtained based on a computational micromechanics model developed in the authors’ earlier work. It is observed that the macroscale effective gauge factors are highly sensitive to strain induced formation/disruption of electron hopping pathways, interface separation and the initiation/evolution of interfacial damage. (paper)
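
    For reference, the gauge factor reported in such piezoresistivity studies is the standard strain-sensitivity measure: the relative change in resistance per unit applied strain. In the notation below (ours, not the paper's), R_0 is the unstrained effective resistance and epsilon the applied strain:

      \mathrm{GF} \;=\; \frac{\Delta R / R_{0}}{\varepsilon},
      \qquad \Delta R = R(\varepsilon) - R_{0}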

  8. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
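
    The multitasking idea can be made concrete with a small sketch (an illustrative rendering of the multifactorial scheme, not the authors' algorithm): a single population evolves in a unified search space, each individual carries a "skill factor" naming the task on which it is evaluated, and crossover across tasks, allowed with some random-mating probability, is the channel for the automated transfer the abstract describes. All task definitions and parameters below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      # Two toy minimisation tasks sharing a unified search space [0, 1]^D.
      D = 10
      def task_a(x): return np.sum((x - 0.3) ** 2)     # optimum at x = 0.3
      def task_b(x): return np.sum((x - 0.7) ** 2)     # optimum at x = 0.7
      tasks = [task_a, task_b]

      POP, STEPS, RMP = 40, 2000, 0.3    # population, steps, random-mating prob.
      pop = rng.random((POP, D))
      skill = rng.integers(0, len(tasks), POP)    # skill factor: assigned task
      fit = np.array([tasks[s](x) for x, s in zip(pop, skill)])

      for _ in range(STEPS):
          i, j = rng.integers(0, POP, 2)
          if skill[i] == skill[j] or rng.random() < RMP:
              # Assortative mating: crossover within a task, or across tasks
              # with probability RMP -- the channel for inter-task transfer.
              alpha = rng.random(D)
              child = alpha * pop[i] + (1 - alpha) * pop[j]
              s = skill[i] if rng.random() < 0.5 else skill[j]
          else:
              child = np.clip(pop[i] + 0.05 * rng.standard_normal(D), 0, 1)
              s = skill[i]
          f = tasks[s](child)            # evaluate on the inherited task only
          worst = int(np.argmax(fit))
          if f < fit[worst]:             # steady-state replacement of the worst
              pop[worst], skill[worst], fit[worst] = child, s, f

      for s in range(len(tasks)):
          group = fit[skill == s]
          if group.size:
              print(f"task {s}: best objective {group.min():.3g}")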

  9. Ancient Biomolecules and Evolutionary Inference.

    Science.gov (United States)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske

    2018-04-25

    Over the last decade, studies of ancient biomolecules-particularly ancient DNA, proteins, and lipids-have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field.

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  13. EVOLUTIONARY FOUNDATIONS FOR MOLECULAR MEDICINE

    Science.gov (United States)

    Nesse, Randolph M.; Ganten, Detlev; Gregory, T. Ryan; Omenn, Gilbert S.

    2015-01-01

    Evolution has long provided a foundation for population genetics, but many major advances in evolutionary biology from the 20th century are only now being applied in molecular medicine. They include the distinction between proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are further transforming evolutionary biology and creating yet more opportunities for progress at the interface of evolution with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and others to speed the development of evolutionary molecular medicine. PMID:22544168

  14. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    International Nuclear Information System (INIS)

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    An introduction to the current paradigm shift towards concurrency in software. Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined

  15. Electronics

    Science.gov (United States)

    2001-01-01

    International Acer Incorporated, Hsin Chu, Taiwan; Aerospace Industrial Development Corporation, Taichung, Taiwan; American Institute of Taiwan, Taipei, Taiwan... Singapore and Malaysia... The largest market for semiconductor products is the high technology consumer electronics industry that consumes up... Singapore, and Malaysia. A new semiconductor facility costs around $3 billion to build and takes about two years to become operational

  16. Evolutionary trends in Heteroptera

    NARCIS (Netherlands)

    Cobben, R.H.

    1968-01-01

    1. This work, the first volume of a series dealing with evolutionary trends in Heteroptera, is concerned with the egg system of about 400 species. The data are presented systematically in chapters 1 and 2 with a critical review of the literature after each family.

    2. Chapter 3 evaluates facts

  17. Evolutionary mysteries in meiosis

    NARCIS (Netherlands)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E.; Wijnker, Erik; Haag, Christoph R.

    2016-01-01

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these

  18. Evolutionary perspectives on ageing.

    Science.gov (United States)

    Reichard, Martin

    2017-10-01

    From an evolutionary perspective, ageing is a decrease in fitness with chronological age - expressed by an increase in mortality risk and/or decline in reproductive success and mediated by deterioration of functional performance. While this makes ageing intuitively paradoxical - detrimental to individual fitness - evolutionary theory offers answers as to why ageing has evolved. In this review, I first briefly examine the classic evolutionary theories of ageing and their empirical tests, and highlight recent findings that have advanced our understanding of the evolution of ageing (condition-dependent survival, positive pleiotropy). I then provide an overview of recent theoretical extensions and modifications that accommodate those new discoveries. I discuss the role of indeterminate (asymptotic) growth for lifetime increases in fecundity and ageing trajectories. I outline alternative views that challenge a universal existence of senescence - namely the lack of a germ-soma distinction and the ability of tissue replacement and retrogression to younger developmental stages in modular organisms. I argue that rejuvenation at the organismal level is plausible, but includes a return to a simple developmental stage. This may exempt a particular genotype from somatic defects but, correspondingly, removes any information acquired during development. A resolution of the question of whether a rejuvenated individual is the same entity is central to the recognition of whether current evolutionary theories of ageing, with their extensions and modifications, can explain the patterns of ageing across the Tree of Life.

  19. Editorial overview: Evolutionary psychology

    NARCIS (Netherlands)

    Gangestad, S.W.; Tybur, J.M.

    2016-01-01

    Functional approaches in psychology - which ask what behavior is good for - are almost as old as scientific psychology itself. Yet sophisticated, generative functional theories were not possible until developments in evolutionary biology in the mid-20th century. Arising in the last three decades,

  20. Biochemistry and evolutionary biology

    Indian Academy of Sciences (India)

    Biochemical information has been crucial for the development of evolutionary biology. On the one hand, the sequence information now appearing is producing a huge increase in the amount of data available for phylogenetic analysis; on the other hand, and perhaps more fundamentally, it allows understanding of the ...

  1. Evolutionary Biology Today

    Indian Academy of Sciences (India)

    Hindi and English. Part 1. Resonance, Vol. 7 ... they use. Of course, many evolutionary biologists do work with fossils or DNA, or both, but there are also large numbers of ... The first major division that I like to make is between studies focussed ...

  2. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  3. Complex systems, evolutionary planning?

    NARCIS (Netherlands)

    Bertolini, L.; de Roo, G.; Silva, E.A.

    2010-01-01

    Coping with uncertainty is a defining challenge for spatial planners. Accordingly, most spatial planning theories and methods are aimed at reducing uncertainty. However, the question is what should be done when this seems impossible? This chapter proposes an evolutionary interpretation of spatial

  4. Molluscan Evolutionary Development

    DEFF Research Database (Denmark)

    Wanninger, Andreas Wilhelm Georg; Koop, Damien; Moshel-Lynch, Sharon

    2008-01-01

    Brought together by Winston F. Ponder and David R. Lindberg, thirty-six experts on the evolution of the Mollusca provide an up-to-date review of its evolutionary history. The Mollusca are the second largest animal phylum and boast a fossil record of over 540 million years. They exhibit remarkable...

  5. Effects of surface functionalization on the electronic and structural properties of carbon nanotubes: A computational approach

    Science.gov (United States)

    Ribeiro, M. S.; Pascoini, A. L.; Knupp, W. G.; Camps, I.

    2017-12-01

    Carbon nanotubes (CNTs) have important electronic, mechanical and optical properties. These features may differ between a pristine nanotube and one whose surface has been functionalized. These changes can be exploited in areas of research and application, such as the construction of nanodevices that act as sensors and filters. Following this idea, in the current work we present the results of a systematic study of CNTs surface-functionalized with hydroxyl and carboxyl groups. Using entropy as the selection criterion, we filtered a library of 10k stochastically generated complexes for each functional concentration (5, 10, 15, 20 and 25%). The structurally related parameters (root-mean-square deviation, entropy, and volume/area) have a monotonic relationship with functionalization concentration. In contrast, the electronic parameters (frontier molecular orbital energies, electronic gap, molecular hardness, and electrophilicity index) present an oscillatory behavior. For a set of concentrations, the nanotubes present spin-polarized properties that can be used in spintronics.

  6. PENELOPE, and algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F.; Fernandez-Varea, J.M.; Baro, J.; Sempau, J.

    1996-10-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from about 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.

  7. PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F; Fernandez-Varea, J M; Baro, J; Sempau, J

    1996-07-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. (Author) 108 refs.

  8. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks; Diseno de una herramienta de computo para la espectrometria y dosimetria de neutrones por medio de redes neuronales evolutivas

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Martinez B, M. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, Jose Gutierrez Abascal No. 2, E-28006 Madrid (Spain)], e-mail: morvymmyahoo@com.mx

    2009-10-15

    Neutron dosimetry is one of the most complicated tasks in radiation protection, because it is a complex technique that depends strongly on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometer, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction that requires an expert user and a reconstruction code such as BUNKIE, SAND, etc. These codes are based on an iterative reconstruction algorithm whose greatest inconvenience is that they must be supplied with an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have noted the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence have been reported, such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in nuclear science is not free of problems, and it has been suggested that more research be conducted to resolve these disadvantages. Because they are emerging technologies, there are no tools for analyzing the results, so in this paper we first present the design of a computational tool that allows analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive, and easy-to-operate graphical user environment. The program executes the analysis in a few seconds, and it may store and/or print the obtained information for
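
    The record gives no code; as a generic illustration of the hybrid idea it mentions (training a neural network's weights with a genetic algorithm), here is a minimal, self-contained sketch. The toy regression target, network size, and GA parameters are all assumptions for demonstration, not the tool described in the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy regression target standing in for a spectrum-unfolding mapping.
      X = rng.random((64, 4))                    # e.g. normalized detector counts
      y = np.sin(X.sum(axis=1, keepdims=True))   # quantity to be predicted

      N_IN, N_HID, N_OUT = 4, 8, 1
      N_W = N_IN * N_HID + N_HID * N_OUT         # weights of a tiny 2-layer MLP

      def forward(w, X):
          W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
          W2 = w[N_IN * N_HID:].reshape(N_HID, N_OUT)
          return np.tanh(X @ W1) @ W2

      def fitness(w):
          return -np.mean((forward(w, X) - y) ** 2)   # negative MSE (maximize)

      POP, GENS, SIGMA = 50, 200, 0.1
      pop = rng.standard_normal((POP, N_W))
      for _ in range(GENS):
          fits = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(fits)[::-1][:POP // 2]]  # truncation selection
          # Offspring: uniform crossover of random parent pairs + Gaussian mutation.
          idx = rng.integers(0, len(parents), (POP - len(parents), 2))
          mask = rng.random((POP - len(parents), N_W)) < 0.5
          children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
          children += SIGMA * rng.standard_normal(children.shape)
          pop = np.vstack([parents, children])

      best = max(pop, key=fitness)
      print("final MSE:", -fitness(best))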

  9. Computing the Density Matrix in Electronic Structure Theory on Graphics Processing Units.

    Science.gov (United States)

    Cawkwell, M J; Sanville, E J; Mniszewski, S M; Niklasson, Anders M N

    2012-11-13

    The self-consistent solution of a Schrödinger-like equation for the density matrix is a critical and computationally demanding step in quantum-based models of interatomic bonding. This step was tackled historically via the diagonalization of the Hamiltonian. We have investigated the performance and accuracy of the second-order spectral projection (SP2) algorithm for the computation of the density matrix via a recursive expansion of the Fermi operator in a series of generalized matrix-matrix multiplications. We demonstrate that owing to its simplicity, the SP2 algorithm [Niklasson, A. M. N. Phys. Rev. B 2002, 66, 155115] is exceptionally well suited to implementation on graphics processing units (GPUs). The performance in double and single precision arithmetic of hybrid GPU/central processing unit (CPU) and full GPU implementations of the SP2 algorithm exceeds that of a CPU-only implementation of the SP2 algorithm and traditional matrix diagonalization when the dimensions of the matrices exceed about 2000 × 2000. Padding schemes for arrays allocated in the GPU memory that optimize the performance of the CUBLAS implementations of the level 3 BLAS DGEMM and SGEMM subroutines for generalized matrix-matrix multiplications are described in detail. The analysis of the relative performance of the hybrid CPU/GPU and full GPU implementations indicates that the transfer of arrays between the GPU and CPU constitutes only a small fraction of the total computation time. The errors measured in the self-consistent density matrices computed using the SP2 algorithm are generally smaller than those measured in matrices computed via diagonalization. Furthermore, the errors in the density matrices computed using the SP2 algorithm do not exhibit any dependence on system size, whereas the errors increase linearly with the number of orbitals when diagonalization is employed.
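
    The SP2 recursion itself is compact enough to sketch (after Niklasson, Phys. Rev. B 2002, 66, 155115): map the Hamiltonian's spectrum into [0, 1], then repeatedly replace X by X² or 2X − X², choosing at each step the branch that steers the trace toward the target electron count. The dense NumPy version below is only for illustration; the paper's point is that the repeated matrix products map naturally onto GPU DGEMM/SGEMM calls.

      import numpy as np

      def sp2_density_matrix(H, n_occ, tol=1e-9, max_iter=100):
          """Purify symmetric H into the zero-temperature density matrix P
          with trace(P) = n_occ (the number of occupied states)."""
          # Map the spectrum of H into [0, 1] via Gershgorin bounds, reversed
          # so that the lowest energies land near 1 (i.e. toward occupation).
          r = np.sum(np.abs(H), axis=1) - np.abs(np.diag(H))
          e_min = np.min(np.diag(H) - r)
          e_max = np.max(np.diag(H) + r)
          X = (e_max * np.eye(len(H)) - H) / (e_max - e_min)
          for _ in range(max_iter):
              X2 = X @ X                  # the one generalized matrix product
              if abs(np.trace(X - X2)) < tol:
                  break                   # idempotent => converged
              # Choose the branch that moves trace(X) toward n_occ.
              if abs(np.trace(X2) - n_occ) < abs(np.trace(2 * X - X2) - n_occ):
                  X = X2
              else:
                  X = 2 * X - X2
          return X

      # Tiny cross-check against direct diagonalization.
      rng = np.random.default_rng(0)
      A = rng.standard_normal((6, 6))
      H = (A + A.T) / 2
      P = sp2_density_matrix(H, n_occ=3)
      w, V = np.linalg.eigh(H)
      P_ref = V[:, :3] @ V[:, :3].T       # occupy the three lowest states
      print("max deviation from diagonalization:", np.abs(P - P_ref).max())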

  10. Data processing of X-ray fluorescence analysis using an electronic computer

    International Nuclear Information System (INIS)

    Yakubovich, A.L.; Przhiyalovskij, S.M.; Tsameryan, G.N.; Golubnichij, G.V.; Nikitin, S.A.

    1979-01-01

    Problems of data processing for multi-element (17-element) X-ray fluorescence analysis of tungsten and molybdenum ores are considered. The analysis was carried out using a silicon-lithium spectrometer with an energy resolution of about 300 eV and a 1024-channel analyzer. The characteristic radiation of the elements was excited with two 109Cd radioisotope sources with a total activity of 10 mCi. The measurement period was 400 s. The data obtained were processed with a computer using the "Proba-1" and "Proba-2" programs. Data processing algorithms and computer calculation results are presented

  11. Efficient method for computing the electronic transport properties of a multiterminal system

    Science.gov (United States)

    Lima, Leandro R. F.; Dusko, Amintor; Lewenkopf, Caio

    2018-04-01

    We present a multiprobe recursive Green's function method to compute the transport properties of mesoscopic systems using the Landauer-Büttiker approach. By introducing an adaptive partition scheme, we map the multiprobe problem into the standard two-probe recursive Green's function method. We apply the method to compute the longitudinal and Hall resistances of a disordered graphene sample, a system of current interest. We show that the performance and accuracy of our method compares very well with other state-of-the-art schemes.
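
    The Landauer-Büttiker ingredients can be illustrated on the simplest two-probe case; the sketch below computes the transmission T(E) = Tr[Γ_L G Γ_R G†] through a clean 1D tight-binding chain with semi-infinite 1D leads, for which the lead self-energy is analytic. It does not reproduce the paper's multiprobe, adaptively partitioned recursion, and all parameters are illustrative.

      import numpy as np

      t = 1.0      # lead/device hopping; the lead band is E in [-2t, 2t]
      N = 20       # device length in sites

      def lead_self_energy(E, t=1.0):
          """Retarded surface self-energy of a semi-infinite 1D chain
          (standard analytic result; on-site energies taken as zero)."""
          z = E + 1e-9j
          g = (z - np.sqrt(z**2 - 4 * t**2)) / (2 * t**2)
          if g.imag > 0:                       # enforce the retarded branch
              g = (z + np.sqrt(z**2 - 4 * t**2)) / (2 * t**2)
          return t**2 * g

      H = -t * (np.eye(N, k=1) + np.eye(N, k=-1))   # clean tight-binding chain

      def transmission(E):
          Sig_L = np.zeros((N, N), complex)
          Sig_R = np.zeros((N, N), complex)
          Sig_L[0, 0] = lead_self_energy(E, t)      # left lead touches site 0
          Sig_R[-1, -1] = lead_self_energy(E, t)    # right lead, last site
          G = np.linalg.inv((E + 1e-9j) * np.eye(N) - H - Sig_L - Sig_R)
          Gam_L = 1j * (Sig_L - Sig_L.conj().T)
          Gam_R = 1j * (Sig_R - Sig_R.conj().T)
          return np.trace(Gam_L @ G @ Gam_R @ G.conj().T).real

      for E in (-1.0, 0.0, 1.0):
          print(f"T({E:+.1f}) = {transmission(E):.4f}")   # ~1 inside the band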

  12. Analogue alternative the electronic analogue computer in Britain and the USA, 1930-1975

    CERN Document Server

    Small, James S

    2013-01-01

    We are in the midst of a digital revolution - until recently, the majority of appliances used in everyday life have been developed with analogue technology. Now, either at home or out and about, we are surrounded by digital technology such as digital 'film', audio systems, computers and telephones. From the late 1940s until the 1970s, analogue technology was a genuine alternative to digital, and the two competing technologies ran parallel with each other. During this period, a community of engineers, scientists, academics and businessmen continued to develop and promote the analogue computer.

  13. Context dependent DNA evolutionary models

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    This paper is about stochastic models for the evolution of DNA. For a set of aligned DNA sequences, connected in a phylogenetic tree, the models should be able to explain - in probabilistic terms - the differences seen in the sequences. From the estimates of the parameters in the model one can start to make biological interpretations and conclusions concerning the evolutionary forces at work. In parallel with the increase in computing power, models have become more complex. Starting with Markov processes on a space with 4 states, later extended to Markov processes with 64 states, we are today studying models on spaces with 4^n (or 64^n) states, with n well above one hundred, say. For such models it is no longer possible to calculate the transition probability analytically, and often Markov chain Monte Carlo is used in connection with likelihood analysis. This is also the approach taken
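
    For the simplest 4-state model the transition probabilities over a branch of length t come directly from the rate matrix Q through P(t) = exp(Qt); the sketch below does this for the Jukes-Cantor model, where an analytic cross-check exists. The context-dependent models in the paper resort to Markov chain Monte Carlo precisely because this direct computation becomes impossible on 4^n-state spaces.

      import numpy as np
      from scipy.linalg import expm

      # Jukes-Cantor rate matrix on {A, C, G, T}: off-diagonal rates mu/3,
      # diagonal -mu, so every row sums to zero.
      mu = 1.0
      Q = mu / 3.0 * (np.ones((4, 4)) - 4.0 * np.eye(4))

      t = 0.5                       # branch length (time)
      P = expm(Q * t)               # transition probability matrix P(t)
      print(P[0])                   # probabilities A -> A, C, G, T

      # Analytic check for this model: P_ii(t) = 1/4 + 3/4 * exp(-4*mu*t/3).
      print(0.25 + 0.75 * np.exp(-4.0 * mu * t / 3.0), P[0, 0])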

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. Nonlinear excitation of electron cyclotron waves by a monochromatic strong microwave: computer simulation analysis of the MINIX results

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, H.; Kimura, T.

    1986-01-01

    Triggered by the experimental results of the MINIX, a computer simulation study was initiated on the nonlinear excitation of electrostatic electron cyclotron waves by a monochromatic electromagnetic wave such as the transmitted microwave in the MINIX. The model used assumes that the excited waves, the exciting (pumping) electromagnetic wave, and the idler electromagnetic wave all propagate in the direction perpendicular to the external magnetic field. The simulation code used for this study was the one-and-two-half dimensional electromagnetic particle code named KEMPO. The simulation result shows that the high-power electromagnetic wave produces both a backscattered electromagnetic wave and electrostatic electron cyclotron waves as a result of a nonlinear parametric instability. Detailed nonlinear microphysics related to the wave excitation is discussed in terms of the nonlinear wave-wave couplings and the associated ponderomotive force produced by the high-power electromagnetic waves. 2 references, 4 figures.

  16. Nonlinear excitation of electron cyclotron waves by a monochromatic strong microwave: computer simulation analysis of the MINIX results

    International Nuclear Information System (INIS)

    Matsumoto, H.; Kimura, T.

    1986-01-01

    Triggered by the experimental results of the MINIX, a computer simulation study was initiated on the nonlinear excitation of electrostatic electron cyclotron waves by a monochromatic electromagnetic wave such as the transmitted microwave in the MINIX. The model used assumes that the excited waves, the exciting (pumping) electromagnetic wave, and the idler electromagnetic wave all propagate in the direction perpendicular to the external magnetic field. The simulation code used for this study was the one-and-two-half dimensional electromagnetic particle code named KEMPO. The simulation result shows that the high-power electromagnetic wave produces both a backscattered electromagnetic wave and electrostatic electron cyclotron waves as a result of a nonlinear parametric instability. Detailed nonlinear microphysics related to the wave excitation is discussed in terms of the nonlinear wave-wave couplings and the associated ponderomotive force produced by the high-power electromagnetic waves. 2 references, 4 figures

  17. New method of computing the contributions of graphs without lepton loops to the electron anomalous magnetic moment in QED

    Science.gov (United States)

    Volkov, Sergey

    2017-11-01

    This paper presents a new method of numerical computation of the mass-independent QED contributions to the electron anomalous magnetic moment which arise from Feynman graphs without closed electron loops. The method is based on a forestlike subtraction formula that removes all ultraviolet and infrared divergences in each Feynman graph before integration in Feynman-parametric space. The integration is performed by an importance sampling Monte-Carlo algorithm with the probability density function that is constructed for each Feynman graph individually. The method is fully automated at any order of the perturbation series. The results of applying the method to 2-loop, 3-loop, 4-loop Feynman graphs, and to some individual 5-loop graphs are presented, as well as the comparison of this method with other ones with respect to Monte Carlo convergence speed.
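
    The variance-reduction principle behind constructing a probability density per integrand can be shown on a one-dimensional toy problem (ours, not the paper's Feynman-parametric integrands): when the sampling density matches the integrand's singular behavior, the variance of the weighted estimator collapses.

      import numpy as np

      rng = np.random.default_rng(0)

      # Integrand with an integrable end-point singularity, a caricature of a
      # subtracted Feynman-parametric integrand: f(x) = 1/(2*sqrt(x)) on (0, 1].
      f = lambda x: 0.5 / np.sqrt(x)          # exact integral equals 1

      N = 100_000
      # Plain Monte Carlo with uniform samples (1 - U gives (0, 1], avoiding 0).
      x = 1.0 - rng.random(N)
      plain = f(x)                            # formally infinite variance

      # Importance sampling from p(x) = 1/(2*sqrt(x)) via the inverse CDF
      # x = u^2; the weights f(x)/p(x) are then constant, so variance vanishes.
      u = 1.0 - rng.random(N)
      xs = u ** 2
      weighted = f(xs) / (0.5 / np.sqrt(xs))

      for name, s in (("uniform", plain), ("importance", weighted)):
          print(f"{name:10s} {s.mean():.5f} +- {s.std(ddof=1) / np.sqrt(N):.5f}")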

  18. A far-infrared Michelson interferometer for tokamak electron density measurements using computer-generated reference fringes

    International Nuclear Information System (INIS)

    Krug, P.A.; Stimson, P.A.; Falconer, I.S.

    1986-01-01

    A simple far-infrared interferometer which uses the 394 μm laser line from optically-pumped formic acid vapour to measure tokamak electron density is described. This interferometer is unusual in requiring only one detector and a single probing beam since reference fringes during the plasma shot are obtained by computer interpolation between the fringes observed immediately before and after the shot. Electron density has been measured with a phase resolution corresponding to ±1/20 wavelength fringe shift, which is equivalent to a central density resolution of ±0.1 × 10^19 m^-3 for an assumed parabolic density distribution in a plasma of diameter 0.2 m, and with a time resolution of 0.2 ms. (author)

  19. Electronic transient processes and optical spectra in quantum dots for quantum computing

    Czech Academy of Sciences Publication Activity Database

    Král, Karel; Zdeněk, Petr; Khás, Zdeněk

    2004-01-01

    Roč. 3, č. 1 (2004), s. 17-25 ISSN 1536-125X R&D Projects: GA AV ČR IAA1010113 Institutional research plan: CEZ:AV0Z1010914 Keywords : depopulation * electronic relaxation * optical spectra * quantum dots * self-assembled quantum dots * upconversion Subject RIV: BE - Theoretical Physics Impact factor: 3.176, year: 2004

  20. 76 FR 22918 - In the Matter of Certain Handheld Electronic Computing Devices, Related Software, and Components...

    Science.gov (United States)

    2011-04-25

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-769] In the Matter of Certain Handheld Electronic.... International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that a complaint was filed with the U.S. International Trade Commission on March 21, 2011, under section 337 of the Tariff Act of 1930...

  1. Computer simulation of electron beams. II. Low-cost beam-current reconstruction

    International Nuclear Information System (INIS)

    de Wolf, D.A.

    1985-01-01

    Reconstruction of current density in electron beams is complicated by distortion of phase space which can require very fine discretization of the beam into trajectories. An efficient discretization of phase space is exploited, using conservation of charge and current in hypertriangle patches, to reconstruct the current density by fitting Gaussians through the distorted hypertriangles. Advantages and limitations are discussed

  2. Computationally efficient description of relativistic electron beam transport in dense plasma

    Science.gov (United States)

    Polomarov, Oleg; Sefkov, Adam; Kaganovich, Igor; Shvets, Gennady

    2006-10-01

    A reduced model of the Weibel instability and electron beam transport in dense plasma is developed. Beam electrons are modeled by macro-particles and the background plasma is represented by an electron fluid. Conservation of generalized vorticity and quasineutrality of the plasma-beam system are used to simplify the governing equations. Our approach is motivated by the conditions of the FI scenario, where the beam density is likely to be much smaller than the plasma density and the beam energy is likely to be very high. In this case the growth rate of the Weibel instability is small, making its modeling with conventional PIC codes exceedingly time-consuming. The present approach does not require resolving the plasma period, resolves only the plasma collisionless skin depth, and is suitable for modeling the long-time behavior of beam-plasma interaction. An efficient code based on this reduced description is developed and benchmarked against the LSP PIC code. The dynamics of low- and high-current electron beams in dense plasma is simulated. Special emphasis is placed on peculiarities of the nonlinear stages, such as filament formation and merger, saturation, and post-saturation field and energy oscillations. *Supported by DOE Fusion Science through grant DE-FG02-05ER54840.

  3. Intravenous coronary angiography by electron beam computed tomography : a clinical evaluation

    NARCIS (Netherlands)

    Rensing, B J; Bongaerts, A; van Geuns, R J; van Ooijen, P; Oudkerk, M; de Feyter, P J

    1998-01-01

    BACKGROUND: -Noninvasive detection of coronary stenoses with electron beam CT (EBCT) after intravenous injection of contrast medium has recently emerged. We sought to determine the diagnostic accuracy of EBCT angiography in the clinical setting using conventional coronary angiography as the "gold

  4. Computations on injection into organics - or how to let electrons shine

    NARCIS (Netherlands)

    Uijttewaal, M.A.

    2007-01-01

    This thesis studies various aspects of electron injection into organic light-emitting diodes (OLEDs) using density functional theory and the master equation approach (only the last chapter). The first part of the thesis studies the relation between the work function and the surface stability of a

  5. Reconstruction and identification of electrons in the Atlas experiment. Setup of a Tier 2 of the computing grid

    International Nuclear Information System (INIS)

    Derue, F.

    2008-03-01

    The origin of the mass of elementary particles is linked to the electroweak symmetry breaking mechanism. Its study will be one of the main efforts of the Atlas experiment at the Large Hadron Collider at CERN, starting in 2008. In most cases, studies will be limited by our knowledge of the detector performance, such as the precision of the energy reconstruction or the efficiency of particle identification. This manuscript presents work dedicated to the reconstruction of electrons in the Atlas experiment, with simulated data and data taken during the combined test beam of 2004. The analysis of Atlas data implies the use of a huge amount of computing and storage resources, which led to the development of a worldwide computing grid. (author)

  6. Computer-aided analysis of power-electronic systems simulation of a high-voltage power converter

    International Nuclear Information System (INIS)

    Bordry, F.; Isch, H.W.; Proudlock, P.

    1987-01-01

    In the study of semiconductor devices, simulation methods play an important role in both the design of systems and the analysis of their operation. The authors describe a new and efficient computer-aided package program for general power-electronic systems. The main difficulty when taking into account non-linear elements, such as semiconductors, lies in determining the existence and the relations of the elementary sequences defined by the conduction or non-conduction of these components. The method requires a priori knowledge neither of the state sequences of the semiconductors nor of the commutation instants, but only of the circuit structure, its parameters and the commands to the controlled switches. The simulation program computes automatically both transient and steady-state waveforms for any circuit configuration. The simulation of a high-voltage power converter is presented, for both its steady-state and transient overload conditions. This 100 kV power converter (4 MW) will feed two klystrons in parallel

  7. Computational Model of D-Region Ion Production Caused by Energetic Electron Precipitations Based on General Monte Carlo Transport Calculations

    Science.gov (United States)

    Kouznetsov, A.; Cully, C. M.

    2017-12-01

    During enhanced magnetic activity, large fluxes of energetic electrons ejected from the radiation belts are deposited in the upper polar atmosphere, where they play important roles in its physical and chemical processes, including the subionospheric propagation of VLF signals. Electron deposition can affect D-region ionization, which is estimated from ionization rates derived from energy deposition. We present a model of D-region ion production caused by an arbitrary (in energy and pitch angle) distribution of fast (10 keV - 1 MeV) electrons. The model relies on a set of pre-calculated results obtained using a general Monte Carlo approach with the latest version of the MCNP6 (Monte Carlo N-Particle) code for explicit electron tracking in magnetic fields. By expressing those results as ionization yield functions, the pre-calculated results are extended to cover arbitrary magnetic field inclinations and atmospheric density profiles, allowing ionization rate altitude profiles to be computed between 20 and 200 km at any geographic point of interest and date/time by adopting results from an external atmospheric density model (e.g. NRLMSISE-00). The pre-calculated MCNP6 results are stored in a CDF (Common Data Format) file, and an IDL routine library provides an end-user interface to the model.

  8. Using soft-X-ray energy spectrum to measure electronic temperature Te and primary research with computer data processing

    International Nuclear Information System (INIS)

    Wang Jingyao; Zhang Guangyang

    1993-01-01

    The authors report the application of the SCORPIO-2000 computer detection system on a nuclear fusion device to measure the energy spectrum of soft X-rays, from which the plasma electron temperature was calculated. The data in the 1-4 keV region of the soft X-ray spectrum were processed systematically. The program was written mostly in FORTRAN, with one subroutine, SUBSB, in assembly language. The program worked normally, with reliable operation and easy correction of the data. The result obtained from the calculation is as expected, and the diagram obtained matches the expected one.

  9. 2-D Low Energy Electron Beam Profile Measurement Based on Computer Tomography Algorithm with Multi-Wire Scanner

    CERN Document Server

    Yu, Nengjie; Li Qing Feng; Tang, Chuan-Xiang; Zheng, Shuxin

    2005-01-01

    A new method for low energy electron beam profile measurement is advanced, which yields a full 2-D beam profile distribution rather than the traditional one inferred from 1-D vertical and horizontal beam profiles. The method is based on the CT (Computer Tomography) algorithm. Multiple sets of 1-D beam profile projections are obtained by rotating the multi-wire scanner. A 2-D beam profile is then reconstructed from these projections with the CT algorithm. The principle of this method is presented. The simulation and the experiment results are compared and analyzed in detail.
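
    The reconstruction step can be prototyped with standard filtered back-projection; in the sketch below, scikit-image's radon/iradon routines stand in for the instrument's projection and reconstruction code, and the synthetic beam spot and angle set are assumptions.

      import numpy as np
      from skimage.transform import radon, iradon

      # Synthetic 2-D beam spot: a tilted Gaussian on a 128 x 128 grid.
      n = 128
      yy, xx = np.mgrid[:n, :n] - n / 2
      spot = np.exp(-((xx + 0.5 * yy) ** 2 / 300.0 + yy ** 2 / 120.0))

      # 1-D projections at each rotation angle of the multi-wire scanner.
      angles = np.linspace(0.0, 180.0, 36, endpoint=False)    # degrees
      sinogram = radon(spot, theta=angles)                    # column per angle

      # Filtered back-projection recovers the 2-D profile from the projections.
      recon = iradon(sinogram, theta=angles)

      err = np.abs(recon - spot).max() / spot.max()
      print(f"max relative reconstruction error: {err:.3f}")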

  10. Encoding neural and synaptic functionalities in electron spin: A pathway to efficient neuromorphic computing

    Science.gov (United States)

    Sengupta, Abhronil; Roy, Kaushik

    2017-12-01

    Present-day computers expend orders of magnitude more computational resources than humans do to perform various cognitive and perception related tasks that humans routinely perform every day. This has recently resulted in a seismic shift in the field of computation, where research efforts are being directed to develop a neurocomputer that attempts to mimic the human brain with nanoelectronic components and thereby harness its efficiency in recognition problems. Bridging the gap between neuroscience and nanoelectronics, this paper attempts to provide a review of the recent developments in the field of spintronic device based neuromorphic computing. A description of various spin-transfer torque mechanisms that can potentially be utilized for realizing device structures mimicking neural and synaptic functionalities is provided. A cross-layer perspective extending from the device to the circuit and system level is presented to envision the design of an All-Spin neuromorphic processor enabled with on-chip learning functionalities. A device-circuit-algorithm co-simulation framework calibrated to experimental results suggests that such All-Spin neuromorphic systems can potentially achieve almost two orders of magnitude energy improvement in comparison to state-of-the-art CMOS implementations.

  11. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    NARCIS (Netherlands)

    Baart, T.A.; Eendebak, P.T.; Reichl, C.; Wegscheider, W.; Vandersypen, L.M.K.

    2016-01-01

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the

  12. Electronic cleansing for computed tomography (CT) colonography using a scale-invariant three-material model

    NARCIS (Netherlands)

    Serlie, Iwo W. O.; Vos, Frans M.; Truyen, Roel; Post, Frits H.; Stoker, Jaap; van Vliet, Lucas J.

    2010-01-01

    A well-known reading pitfall in computed tomography (CT) colonography is posed by artifacts at T-junctions, i.e., locations where air-fluid levels interface with the colon wall. This paper presents a scale-invariant method to determine material fractions in voxels near such T-junctions. The proposed

  13. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.
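
    The kick-and-propagate strategy behind real-time spectra can be shown on a model two-level system: apply a weak delta-function perturbation, propagate the density matrix under the field-free Hamiltonian, and Fourier-transform the induced dipole. The sketch yields an absorption-type spectrum (ECD would additionally require the magnetic-dipole response), and all parameters are illustrative.

      import numpy as np
      from scipy.linalg import expm

      # Model two-level "molecule": excitation energy omega, transition dipole d.
      omega, d = 1.0, 0.5
      H0 = np.diag([0.0, omega])
      mu = np.array([[0.0, d], [d, 0.0]])

      # Ground state, excited by a weak delta-function "kick" at t = 0
      # (the standard trick that excites all transitions at once).
      kappa = 1e-3
      rho = np.diag([1.0, 0.0]).astype(complex)
      K = expm(1j * kappa * mu)
      rho = K @ rho @ K.conj().T

      # Field-free real-time propagation, recording the induced dipole.
      dt, nsteps = 0.1, 4096
      U = expm(-1j * H0 * dt)
      dipole = np.empty(nsteps)
      for k in range(nsteps):
          dipole[k] = np.trace(rho @ mu).real
          rho = U @ rho @ U.conj().T

      # Spectrum: damped Fourier transform of the dipole signal.
      tgrid = dt * np.arange(nsteps)
      signal = (dipole - dipole[0]) * np.exp(-tgrid / 100.0)
      freqs = 2 * np.pi * np.fft.rfftfreq(nsteps, dt)
      spec = np.abs(np.fft.rfft(signal))
      print("peak at omega =", freqs[np.argmax(spec)])   # ~1.0, the gap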

  14. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven to make it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353
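
    As a small example of the evolutionary-distance computations such packages automate, the sketch below applies the standard Jukes-Cantor correction d = -(3/4) ln(1 - 4p/3) to the observed proportion p of differing sites between two aligned sequences (a textbook formula, not MEGA5 code; the sequences are made up).

      import numpy as np

      def jukes_cantor_distance(seq1, seq2):
          """Estimated substitutions per site: d = -3/4 * ln(1 - 4p/3),
          where p is the observed proportion of differing aligned sites."""
          assert len(seq1) == len(seq2)
          p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
          if p >= 0.75:
              raise ValueError("saturated: distance undefined under Jukes-Cantor")
          return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

      s1 = "ACGTACGTACGTACGTACGT"
      s2 = "ACGTACGAACGTACTTACGT"   # 2 mismatches out of 20 -> p = 0.1
      print(f"p = 0.10 -> JC distance {jukes_cantor_distance(s1, s2):.4f}")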

  15. Variations of dose distribution in high energy electron beams as a function of geometrical parameters of irradiation. Application to computer calculation

    International Nuclear Information System (INIS)

    Villeret, O.

    1985-04-01

    An algorithm is developed for computer treatment planning in electron therapy. The method uses experimental absorbed dose distribution data in the irradiated medium for electron beams in the 8-20 MeV range delivered by the Sagittaire linear accelerator (study of central-axis depth dose and beam profiles) under various geometrical conditions. Experimental verification of the computer program showed agreement within 2% between dose measurement and computer calculation.

  16. Evaluation of computational models and cross sections used by MCNP6 for simulation of electron backscattering

    Energy Technology Data Exchange (ETDEWEB)

    Poškus, Andrius, E-mail: andrius.poskus@ff.vu.lt

    2016-02-01

    This work evaluates the accuracy of the single-event (SE) and condensed-history (CH) models of electron transport in Monte Carlo simulations of electron backscattering from thick layers of Be, C, Al, Cu, Ag, Au and U at incident electron energies from 200 eV to 15 MeV. The CH method is used in simulations performed with MCNP6.1, and the SE method is used in simulations performed with an open-source single-event code MCNelectron written by the author of this paper. Both MCNP6.1 and MCNelectron use mainly ENDF/B-VI.8 library data, but MCNelectron allows replacing cross sections of certain types of interactions by alternative datasets from other sources. The SE method is evaluated both using only ENDF/B-VI.8 cross sections (the “SE-ENDF/B method”, which is equivalent to using MCNP6.1 in SE mode) and with an alternative set of elastic scattering cross sections obtained from relativistic (Dirac) partial-wave (DPW) calculations (the “SE-DPW method”). It is shown that at energies from 200 eV to 300 keV the estimates of the backscattering coefficients obtained using the SE-DPW method are typically within 10% of the experimental data, which is approximately the same accuracy as is achieved using MCNP6.1 in CH mode. At energies below 1 keV and above 300 keV, the SE-DPW method is much more accurate than the SE-ENDF/B method owing to the lack of angular distribution data in the ENDF/B library in those energy ranges. At energies from 500 keV to 15 MeV, the CH approximation is roughly twice as accurate as the SE-DPW method, with average relative errors of 7% and 14%, respectively. The energy probability density functions (PDFs) of backscattered electrons for Al and Cu, calculated using the SE method with DPW cross sections for 20 keV incident electrons, have an average absolute error as low as 4% of the average PDF. This error is approximately half that of the corresponding PDF calculated using the CH approximation. It is concluded

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. A computational study on the electronic and nonlinear optical properties of graphyne subunit

    Energy Technology Data Exchange (ETDEWEB)

    Bahat, Mehmet, E-mail: bahat@gazi.edu.tr; Güney, Merve Nurhan, E-mail: merveng87@gmail.com; Özbay, Akif, E-mail: aozbay@gazi.edu.tr [Department of Physics, Gazi University, Ankara, 06500 (Turkey)

    2016-03-25

    After the discovery of graphene, it has been considered a basic material for future nanoelectronic devices. Graphyne is a two-dimensional carbon allotrope, like graphene, whose electronic properties are expected to be potentially superior to those of graphene. The compound C24H12 (tribenzocyclyne; TBC) is a substructure of graphyne. The electronic and nonlinear optical properties of C24H12 and some of its fluoro derivatives were calculated. The calculated properties are the electric dipole moment, the highest occupied molecular orbital (HOMO) and lowest unoccupied molecular orbital (LUMO) energies, the polarizability, and the first hyperpolarizability. All calculations were performed at the B3LYP/6-31+G(d,p) level.

  20. Software abstractions and computational issues in parallel structure adaptive mesh methods for electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Weare, J.; Ong, E.; Baden, S.

    1997-05-01

    We have applied structured adaptive mesh refinement techniques to the solution of the LDA equations for electronic structure calculations. Local spatial refinement concentrates memory resources and numerical effort where it is most needed, near the atomic centers and in regions of rapidly varying charge density. The structured grid representation enables us to employ efficient iterative solver techniques such as conjugate gradient with FAC multigrid preconditioning. We have parallelized our solver using an object-oriented adaptive mesh refinement framework.

  1. Evolutionary design assistants for architecture

    Directory of Open Access Journals (Sweden)

    N. Onur Sönmez

    2015-04-01

    Full Text Available In its parallel pursuit of increased competitiveness for design offices and more pleasurable and easier workflows for designers, artificial design intelligence is a technical, intellectual, and political challenge. While human-machine cooperation has become commonplace through Computer Aided Design (CAD) tools, improved collaboration and better support appear possible only through an endeavor into a kind of artificial design intelligence that is more sensitive to the human perception of affairs. Considered as part of the broader Computational Design studies, the research program of this quest can be called Artificial / Autonomous / Automated Design (AD). The currently available level of Artificial Intelligence (AI) for design is limited, and a viable aim for current AD would be to develop design assistants that are capable of producing drafts for various design tasks. Thus, the overall aim of this thesis is the development of approaches, techniques, and tools towards artificial design assistants that offer a capability for generating drafts for sub-tasks within design processes. The main technology explored for this aim is Evolutionary Computation (EC), and the target design domain is architecture. The two connected research questions of the study concern, first, the investigation of the ways to develop an architectural design assistant, and secondly, the utilization of EC for the development of such assistants. While developing approaches, techniques, and computational tools for such an assistant, the study also carries out a broad theoretical investigation into the main problems, challenges, and requirements towards such assistants on a rather overall level. Therefore, the research is shaped as a parallel investigation of three main threads interwoven along several levels, moving from a more general level to specific applications. The three research threads comprise, first, theoretical discussions and speculations with regard to both

  2. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing general constrained optimization problems using evolutionary algorithms. Broadly, the topics covered include constraint handling for single and multi-objective optimization; penalty-function-based methodology; multi-objective-based methodology; new constraint handling mechanisms; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mixed variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environments. There is also a separate chapter on hybrid optimization, which is gaining much popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novices, and experts alike. The book will also be useful...
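
    One of the recurring methodologies listed above, penalty-function constraint handling, fits in a few lines. The sketch below pairs a static quadratic penalty with a simple (mu + lambda) evolution strategy; the test problem and penalty weight are illustrative choices, not taken from the book.

        # Penalty-function constraint handling: minimise f(x) subject to
        # g(x) <= 0 by penalising constraint violation, then evolve.
        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):                       # objective: sphere function
            return float(np.sum(x**2))

        def g(x):                       # constraint g(x) <= 0, here x0 + x1 >= 1
            return 1.0 - x[0] - x[1]

        def penalised(x, w=1e3):        # static quadratic penalty for violations
            return f(x) + w * max(0.0, g(x)) ** 2

        mu, lam, sigma = 10, 40, 0.3    # parents, offspring, mutation step
        pop = rng.normal(size=(mu, 2))
        for _ in range(200):
            parents = pop[rng.integers(mu, size=lam)]
            children = parents + sigma * rng.normal(size=(lam, 2))
            everyone = np.vstack([pop, children])        # (mu + lambda) selection
            ranks = np.argsort([penalised(x) for x in everyone])
            pop = everyone[ranks[:mu]]

        best = pop[0]
        print("best x:", best, " f:", f(best), " violation:", max(0.0, g(best)))
        # expected optimum near x = (0.5, 0.5) with f = 0.5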

  3. Computer experiments on the imaging of the (111) split crowdion interstitial in tungsten by transmission electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Krakow, W [Xerox Corp., Rochester, N.Y. (USA)

    1978-06-01

    Computer-simulated dark-field electron micrographs at atomic resolution have been generated by calculating the diffuse elastic scattering distribution of short-range-order objects, the important point being that images are formed from regions of reciprocal space that do not contain Bragg reflections of the perfect crystal. Interpretation of these images in terms of atom positions and atom correlations becomes straightforward, and it is a simple matter to distinguish between real structural information and image artifacts produced by the phase contrast mechanism in the electron optical imaging process. In this paper, images were calculated under a variety of microscope conditions for a (111) split crowdion interstitial in tungsten, including up to 182 atoms of the surrounding strain field. The effects of specimen orientation, microscope objective lens defocus and the contribution of atoms lying in different shells around the defect have been considered. To aid in image interpretation, accompanying diffraction patterns have been computed for different specimen orientations, which show either the perfect-crystal Bragg diffraction pattern or the diffuse scattering distribution produced by the crowdion defect.

  4. Evolutionary games on graphs

    Science.gov (United States)

    Szabó, György; Fáth, Gábor

    2007-07-01

    Game theory is one of the key paradigms behind many scientific disciplines from biology to behavioral sciences to economics. In its evolutionary form and especially when the interacting agents are linked in a specific social network the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first four sections introduce the necessary background in classical and evolutionary game theory from the basic definitions to the most important results. The fifth section surveys the topological complications implied by non-mean-field-type social network structures in general. The next three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long term behavioral patterns emerging in evolutionary games.
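
    A toy version of one of the prominent model classes the review discusses, the Prisoner's Dilemma on a square lattice with a deterministic imitate-the-best update, is sketched below; the payoff values (weak dilemma: R=1, T=b, S=P=0), lattice size and update rule are illustrative choices rather than any specific model from the review.

        # Spatial Prisoner's Dilemma on an L x L torus: each site plays its
        # four neighbours, then copies the strategy of its best-scoring
        # neighbour (keeping its own strategy if it scored best itself).
        import numpy as np

        rng = np.random.default_rng(1)
        L, b, steps = 50, 1.6, 50
        shifts = ((1, 0), (-1, 0), (0, 1), (0, -1))
        s = (rng.random((L, L)) < 0.5).astype(int)   # 1 = cooperate, 0 = defect

        def payoffs(s):
            """Total payoff of each site against its four lattice neighbours."""
            p = np.zeros((L, L))
            for di, dj in shifts:
                n = np.roll(np.roll(s, di, 0), dj, 1)
                p += np.where(s == 1, 1.0 * n, b * n)  # C earns 1 vs C; D earns b vs C
            return p

        for _ in range(steps):
            p = payoffs(s)
            best_s, best_p = s.copy(), p.copy()
            for di, dj in shifts:
                ns = np.roll(np.roll(s, di, 0), dj, 1)
                pn = np.roll(np.roll(p, di, 0), dj, 1)
                better = pn > best_p
                best_s = np.where(better, ns, best_s)
                best_p = np.where(better, pn, best_p)
            s = best_s

        print("cooperator fraction after", steps, "steps:", s.mean())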

  5. Evolutionary mysteries in meiosis.

    Science.gov (United States)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E; Wijnker, Erik; Haag, Christoph R

    2016-10-19

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these often 'weird' features. We discuss the origin of meiosis (origin of ploidy reduction and recombination, two-step meiosis), its secondary modifications (in polyploids or asexuals, inverted meiosis), its importance in punctuating life cycles (meiotic arrests, epigenetic resetting, meiotic asymmetry, meiotic fairness) and features associated with recombination (disjunction constraints, heterochiasmy, crossover interference and hotspots). We present the various evolutionary scenarios and selective pressures that have been proposed to account for these features, and we highlight that their evolutionary significance often remains largely mysterious. Resolving these mysteries will likely provide decisive steps towards understanding why sex and recombination are found in the majority of eukaryotes. This article is part of the themed issue 'Weird sex: the underappreciated diversity of sexual reproduction'. © 2016 The Author(s).

  6. Asymmetric Evolutionary Games

    Science.gov (United States)

    McAvoy, Alex; Hauert, Christoph

    2015-01-01

    Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner’s Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games. PMID:26308326

  7. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  14. The role of electrostatics in TrxR electron transfer mechanism: A computational approach.

    Science.gov (United States)

    Teixeira, Vitor H; Capacho, Ana Sofia C; Machuqueiro, Miguel

    2016-12-01

    Thioredoxin reductase (TrxR) is an important enzyme in the control of the intracellular reduced redox environment. It transfers electrons from NADPH to several molecules, including its natural partner, thioredoxin. Although there is a generally accepted model describing how the electrons are transferred along TrxR, which involves a flexible arm working as a "shuttle," the molecular details of such a mechanism are not completely understood. In this work, we use molecular dynamics simulations with Poisson-Boltzmann/Monte Carlo pKa calculations to investigate the role of electrostatics in the electron transfer mechanism. We observed that the combination of redox/protonation states of the N-terminal (FAD and Cys59/64) and C-terminal (Cys497/Selenocysteine498) redox centers defines the preferred relative positions and allows the flexible arm to work as the desired "shuttle." Changing the redox/ionization states of those key players leads to electrostatic triggers pushing the arm into the pocket when oxidized, and pulling it out once it has been reduced. The calculated pKa values for Cys497 and Selenocysteine498 are 9.7 and 5.8, respectively, confirming that the selenocysteine is indeed deprotonated at physiological pH. This can be an important advantage in terms of reactivity (thiolate/selenolate are more nucleophilic than thiol/selenol) and ability to work as an electrostatic trigger (the "shuttle" mechanism), and may be the reason why TrxR uses selenium instead of sulfur. Proteins 2016; 84:1836-1843. © 2016 Wiley Periodicals, Inc.
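
    The pKa values reported in the record translate directly into protonation fractions at physiological pH via the Henderson-Hasselbalch relation; a quick back-of-the-envelope check (pH 7.4 assumed) confirms the record's conclusion about the selenocysteine:

        # Fraction of an acidic group that is deprotonated at a given pH
        def fraction_deprotonated(pka, ph=7.4):
            return 1.0 / (1.0 + 10.0 ** (pka - ph))

        print(f"Sec498 (pKa 5.8): {fraction_deprotonated(5.8):.1%} deprotonated at pH 7.4")
        print(f"Cys497 (pKa 9.7): {fraction_deprotonated(9.7):.2%} deprotonated at pH 7.4")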

  15. A computer code to calculate the fast induced signals by electron swarms in gases

    Energy Technology Data Exchange (ETDEWEB)

    Tobias, Carmen C.B. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Mangiarotti, Alessio [Universidade de Coimbra (Portugal). Dept. de Fisica. Lab. de Instrumentacao e Fisica Experimental de Particulas

    2010-07-01

    Full text: The study of electron transport parameters (i.e. drift velocity, diffusion coefficients and first Townsend coefficient) in gases is very important in several areas of applied nuclear science. For example, they are a relevant input to the design of particle detectors employing micro-structures (MSGC's, micromegas, GEM's) and RPC's (resistive plate chambers). Moreover, if the data are accurate and complete enough, they can be used to derive a set of electron impact cross-sections with their energy dependence, which are a key ingredient in micro-dosimetry calculations. Despite the fundamental need for such data and the long age of the field, the gases of possible interest are so many, and the effort of obtaining good quality data so time demanding, that an important contribution can still be made. As an example, the electron drift velocity at moderate field strengths (up to 50 Td) in pure isobutane (a tissue-equivalent gas) has been measured only recently by the IPEN-LIP collaboration using a dedicated setup. The transport parameters are derived from the recorded electric pulse induced by a swarm started with a pulsed laser shining on the cathode. To aid the data analysis, a special code has been developed to calculate the induced pulse by solving the electron continuity equation including growth, drift and diffusion. A realistic profile of the initial laser beam is taken into account, as well as the boundary conditions at the cathode and anode. The approach is either semi-analytic, based on the expression derived by P. H. Purdie and J. Fletcher, or fully numerical, using a finite difference scheme improved over the one introduced by J. de Urquijo et al. The agreement between the two will be demonstrated under typical conditions for the mentioned experimental setup. A brief discussion on the stability of the finite difference scheme will be given. The new finite difference scheme allows a detailed investigation of the importance of back diffusion to
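
    A stripped-down version of the fully numerical approach the record mentions, an explicit finite-difference integration of the 1-D electron continuity equation with drift, diffusion and Townsend growth, might look as follows. All parameter values are illustrative stand-ins, not the IPEN-LIP measurements, and the scheme is the plainest upwind/explicit one rather than the improved scheme the record describes.

        # dn/dt = -w dn/dx + D d2n/dx2 + alpha*w*n, with absorbing electrodes
        import numpy as np

        nx, d_gap = 400, 1e-2             # grid points, 1 cm gap [m]
        w, D, alpha = 1e5, 0.05, 200.0    # drift [m/s], diffusion [m^2/s], Townsend [1/m]
        dx = d_gap / nx
        dt = 0.2 * min(dx / w, dx**2 / (2 * D))   # respect advection/diffusion stability

        x = np.linspace(0.0, d_gap, nx)
        n = np.exp(-((x - 1e-3) / 2e-4) ** 2)     # initial swarm near the cathode

        for _ in range(2000):
            drift = -w * (n - np.roll(n, 1)) / dx                      # upwind (w > 0)
            diff = D * (np.roll(n, -1) - 2 * n + np.roll(n, 1)) / dx**2
            n = n + dt * (drift + diff + alpha * w * n)                # growth term
            n[0] = n[-1] = 0.0                                         # absorbing boundaries

        print("total electrons (arb. units):", n.sum() * dx)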

  16. Computational simulation of electron and ion beams interaction with solid high-molecular dielectrics and inorganic glasses

    International Nuclear Information System (INIS)

    Milyavskiy, V.V.

    1998-01-01

    Numerical investigation of the interaction of electron beams (with energies within the limits 100 keV--20 MeV) and ion beams (with energies over the range 1 keV--50 MeV) with solid high-molecular dielectrics and inorganic glasses is performed. Note that the problem of the interaction of electron beams with glass optical covers is especially interesting in connection with the problem of radiation protection of solar power elements on cosmic satellites and stations. For computational simulation of the above-mentioned processes a mathematical model was developed, describing the propagation of particle beams through the sample thickness, the accumulation and relaxation of volume charge and shock-wave processes, as well as the evolution of the electric field in the sample. The energy deposition of an electron beam in a target in the presence of a nonuniform electric field was calculated with the assistance of a semiempirical procedure previously proposed by the author of this work. Propagation of low-energy ions through the sample thickness was simulated using a Pearson IV distribution. The damage distribution, ionization distribution and range distribution were taken into account. Propagation of high-energy ions was calculated in the approximation of continuous deceleration. For the description of hydrodynamic processes, the system of equations of continuum mechanics in the elastic-plastic approximation and a wide-range equation of state were used

  17. Organic molecules deposited on graphene: A computational investigation of self-assembly and electronic structure

    International Nuclear Information System (INIS)

    Oliveira, I. S. S. de; Miwa, R. H.

    2015-01-01

    We use ab initio simulations to investigate the adsorption and the self-assembly processes of tetracyanoquinodimethane (TCNQ), tetrafluoro-tetracyanoquinodimethane (F4-TCNQ), and tetrasodium 1,3,6,8-pyrenetetrasulfonic acid (TPA) on the graphene surface. We find that there are no chemical bonds at the molecule–graphene interface, even in the presence of grain boundaries on the graphene surface. The molecules bond to graphene through van der Waals interactions. In addition to the molecule–graphene interaction, we performed a detailed study of the role played by the (lateral) molecule–molecule interaction in the formation of the experimentally verified self-assembled layers of TCNQ and TPA on graphene. Regarding the electronic properties, we calculate the electronic charge transfer from the graphene sheet to the TCNQ and F4-TCNQ molecules, leading to a p-doping of graphene. Meanwhile, such charge transfer is reduced by an order of magnitude for TPA molecules on graphene. In this case, a significant doping process is not expected upon the formation of a self-assembled layer of TPA molecules on the graphene sheet

  18. Electronic and magnetic properties of BNC nanoribbons: a detailed computational study

    International Nuclear Information System (INIS)

    Basheer, Ershaad Ahamed; Parida, Prakash; Pati, Swapan K

    2011-01-01

    Using density functional theory (DFT), we perform a systematic study of the electronic structure of zigzag edge BNC nanoribbons, which have an equal number of boron, carbon and nitrogen atoms. We study two nanoribbon structures. One of them is terminated by carbon and nitrogen atoms on opposite edges, whereas the other is terminated by carbon and boron atoms on opposite edges. We explore the effect of passivation of the edge atoms on the electronic and magnetic properties of the nanoribbons. We also evaluate the changes in these effects brought about by varying the width of the nanoribbons. Our results show that, for ribbons of small width, the ones with a boron edge show semiconducting behaviour regardless of the nature of edge passivation, whereas nitrogen-edged nanoribbons display a range of conduction properties including half-metallic, metallic and semiconducting properties depending on the nature of edge passivation. On the other hand, ribbons of larger width show metallic behaviour. We also study the effect of external electric fields on the band structure of both boron-edged and nitrogen-edged nanoribbons and the trends in these effects with varying width. We find that both boron- and nitrogen-edged nanoribbons retain their zero-field conduction properties even in the presence of an electric field directed from the boron/nitrogen edge to the carbon edge. Our transport study of hydrogen-passivated carbon- and nitrogen-edged zigzag BNC nanoribbons reveals strong spin-filter properties.

  19. Bravyi-Kitaev Superfast simulation of electronic structure on a quantum computer.

    Science.gov (United States)

    Setia, Kanav; Whitfield, James D

    2018-04-28

    Present quantum computers often work with distinguishable qubits as their computational units. In order to simulate indistinguishable fermionic particles, it is first required to map the fermionic state to the state of the qubits. The Bravyi-Kitaev Superfast (BKSF) algorithm can be used to accomplish this mapping. The BKSF mapping has connections to quantum error correction and opens the door to new ways of understanding fermionic simulation in a topological context. Here, we present the first detailed exposition of the BKSF algorithm for molecular simulation. We provide the BKSF-transformed qubit operators and report on our implementation of the BKSF fermion-to-qubit transform in OpenFermion. In this initial study of the hydrogen molecule we have compared the BKSF, Jordan-Wigner and Bravyi-Kitaev transforms under the Trotter approximation. The gate count to implement BKSF is lower than Jordan-Wigner but higher than Bravyi-Kitaev. We considered different orderings of the exponentiated terms and found lower Trotter errors than those previously reported for the Jordan-Wigner and Bravyi-Kitaev algorithms. These results open the door to the further study of the BKSF algorithm for quantum simulation.
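
    For orientation, the sketch below applies the two reference mappings the record compares against, using OpenFermion's documented top-level transforms. The BKSF transform the authors contributed lives in the same package but, to my understanding, acts on interaction operators rather than bare fermion operators, so its exact call is deliberately not reproduced here.

        # Map a simple fermionic hopping term to qubit (Pauli) operators
        from openfermion import FermionOperator, jordan_wigner, bravyi_kitaev

        # a0^ a2 + a2^ a0: hopping between fermionic modes 0 and 2
        hop = FermionOperator("0^ 2") + FermionOperator("2^ 0")

        print("Jordan-Wigner:\n", jordan_wigner(hop))
        print("Bravyi-Kitaev:\n", bravyi_kitaev(hop))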

  20. The Evolutionary Origins of Hierarchy.

    Science.gov (United States)

    Mengistu, Henok; Huizinga, Joost; Mouret, Jean-Baptiste; Clune, Jeff

    2016-06-01

    Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

  1. The Evolutionary Origins of Hierarchy

    Science.gov (United States)

    Huizinga, Joost; Clune, Jeff

    2016-01-01

    Hierarchical organization—the recursive composition of sub-modules—is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force–the cost of connections–promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics. PMID:27280881

  2. The Evolutionary Origins of Hierarchy.

    Directory of Open Access Journals (Sweden)

    Henok Mengistu

    2016-06-01

    Full Text Available Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

  3. Electronic stopping power calculation for water under the Lindhard formalism for application in proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, A. F., E-mail: afguerreror@uqvirtual.edu.co [Departamento de Física, Universidad Del Quindío Cra 15 # 12N Armenia, Quindío (Colombia); Mesa, J., E-mail: jmesa@ibb.unesp.br [Instituto de Biociências de Botucatu da UNESP Distrito de Rubião Jr. s/n°, 18618-000, Botucatu, SP (Brazil)

    2016-07-07

    Because of the behavior that charged particles exhibit when they interact with biological material, proton therapy is shaping the future of radiation therapy in cancer treatment. The planning of radiation therapy is made up of several stages. The first one is the diagnostic image, which gives an idea of the density, size and type of the tumor being treated; to understand this it is important to know how the particle beam interacts with the tissue. In this work, by using the Lindhard formalism and the Y. R. Waghmare model for the charge distribution of the proton, the electronic stopping power (SP) for a proton beam interacting with a liquid water target is calculated in the proton energy range 10{sup 1} eV - 10{sup 10} eV, taking into account all the charge states.
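
    The record's Lindhard-formalism calculation (a dielectric-function treatment) is involved; as a far simpler point of reference, the sketch below evaluates the relativistic Bethe formula for protons in water, without shell, Barkas or density corrections, so it is only meaningful well above roughly 1 MeV. Constants follow the usual PDG convention; the mean excitation energy for water is taken as 75 eV.

        # Approximate Bethe mass stopping power for protons in water
        import math

        K = 0.307075            # MeV mol^-1 cm^2 (PDG coefficient)
        ME_C2 = 0.510999        # electron rest energy [MeV]
        MP_C2 = 938.272         # proton rest energy [MeV]
        Z_A, I = 0.5551, 75e-6  # water: Z/A and mean excitation energy [MeV]

        def bethe_water(T):
            """Mass stopping power [MeV cm^2/g] for a proton of kinetic
            energy T [MeV]; valid only well above the Bragg-peak region."""
            gamma = 1.0 + T / MP_C2
            beta2 = 1.0 - 1.0 / gamma**2
            return K * Z_A / beta2 * (math.log(2 * ME_C2 * beta2 * gamma**2 / I) - beta2)

        for T in (1.0, 10.0, 100.0):
            print(f"T = {T:6.1f} MeV  ->  S/rho ~ {bethe_water(T):7.2f} MeV cm^2/g")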

  4. Vibrational and electronic spectra of 2-nitrobenzanthrone: An experimental and computational study

    Science.gov (United States)

    Onchoke, Kefa K.; Chaudhry, Saad N.; Ojeda, Jorge J.

    2016-01-01

    The environmental pollutant 2-nitrobenzanthrone (2-NBA) poses human health hazards and is formed by atmospheric reactions of NO{sub x} gases with atmospheric particulates. Though its mutagenic effects have been studied in biological systems, comprehensive spectroscopic experimental data for it are scarce. Thus, vibrational and optical spectroscopic analyses (UV-Vis and fluorescence) of 2-NBA were carried out using both experiment and density functional theory, employing the B3LYP method with the 6-311+G(d,p) basis set. The scaled theoretical vibrational frequencies show good agreement with experiment, to within 5 cm{sup -1}. On the basis of normal coordinate analysis, complete assignments of the experimental harmonic infrared and Raman bands are made. The influence of nitro-group substitution on the benzanthrone structure, the symmetric CH vibrations and the electronic spectra is noted. This study is useful for the development of spectroscopy-mutagenicity relationships in nitrated polycyclic aromatic hydrocarbons.

  5. Computation of electron-impact K-shell ionization cross sections of atoms

    International Nuclear Information System (INIS)

    Uddin, M.A.; Haque, A.K.F.; Billah, M. Masum; Basak, A.K.; Karim, K.R.; Saha, B.C.

    2005-01-01

    The total cross sections for electron-impact single K-shell ionization of atomic targets, spanning a wide range of atomic numbers from Z=6-50, are evaluated in the energy range up to about 10 MeV employing the recently proposed modified version of the improved binary-encounter dipole (RQIBED) model [Uddin et al., Phys. Rev. A 70, 032706 (2004)], which incorporates ionic and relativistic effects. The experimental cross sections for all targets are reproduced satisfactorily, even at relativistic energies, using fixed generic values of the two parameters in the RQIBED model. The relativistic effect is found to be significant in all targets except C, and is most pronounced in Ag and Sn

  6. Computational design of molecules for dye sensitized solar cells and nano electronics

    DEFF Research Database (Denmark)

    Ørnsø, Kristian Baruël

    sensitized solar cell (DSSC) in terms of a loss-less level alignment quality. This scoring only takes into account a simplified absorption spectrum of the dye in combination with the alignment between the molecular levels, the semiconductor conduction band edge and the redox mediator. To improve on this...... a molecular junction, is by controlling the junction geometry. This is achieved by designing a molecule with two sets of anchor groups, which bind to gold with significantly different strengths. Hence, it is proposed that the geometry can be controlled by chemical passivation of one type of anchor group....... Using a simple computational model, this experimental hypothesis is verified and the change in conductance upon changing the junction geometry is reproduced....

  7. Algorithm of calculation of multicomponent system eutectics using electronic digital computer

    International Nuclear Information System (INIS)

    Posypajko, V.I.; Stratilatov, B.V.; Pervikova, V.I.; Volkov, V.Ya.

    1975-01-01

    A computer algorithm is proposed for determining low-temperature equilibrium regions for existing phases. The algorithm has been used in calculating nonvariant parameters (melting temperatures of eutectics and the concentrations of their components) for a series of ternary systems, among which are K‖Cl, WO{sub 4}, SO{sub 4} (x{sub 1}=K{sub 2}WO{sub 4}; x{sub 2}=K{sub 2}SO{sub 4}); Ag, Cd, Pb‖Cl (x{sub 1}=CdCl{sub 2}, x{sub 2}=PbCl{sub 2}); and K‖F, Cl, I (x{sub 1}=KF, x{sub 2}=KI). The proposed method of calculating eutectics permits the planning of subsequent experiments for determining the parameters of the eutectics of multicomponent systems and the forecasting of chemical interactions in such systems. The algorithm can be used in calculating systems containing any number of components

  8. FREE SOFTWARE IN ELECTRONIC LEARNING FUTURE TEACHERS OF MATHEMATICS, PHYSICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vladyslav Ye. Velychko

    2016-05-01

    Full Text Available The use of free software is much more popular in the IT industry than in educational activities. The disadvantages of free software and the problems of its implementation in the educational process are limiting factors for its use in the education system; however, openness, accessibility and functionality are the main factors favouring the introduction of free software into the educational process. Moreover, free software suits future teachers of mathematics, physics and informatics particularly well because of the specificity of how it is created, and therefore there is a need for a systematic analysis of the possibilities of using open source software in e-learning for future teachers of mathematics, physics and computer science.

  9. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  12. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure, as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  13. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been quieter as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility in the use of resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   Tape utilisation was a focus for the operations teams, with frequent campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  16. Modelling of pathologies of the nervous system by the example of computational and electronic models of elementary nervous systems

    Energy Technology Data Exchange (ETDEWEB)

    Shumilov, V. N., E-mail: vnshumilov@rambler.ru; Syryamkin, V. I., E-mail: maximus70sir@gmail.com; Syryamkin, M. V., E-mail: maximus70sir@gmail.com [National Research Tomsk State University, 634050, Tomsk, Lenin Avenue, 36 (Russian Federation)

    2015-11-17

    The paper puts forward principles of action of devices operating similarly to the nervous system and the brain of biological systems. We propose an alternative method of studying diseases of the nervous system, which may significantly influence prevention, medical treatment, or at least retardation of the development of these diseases. This alternative is to use computational and electronic models of the nervous system. Within this approach, we represent the brain in the form of a huge electrical circuit composed of active units, namely, neuron-like units and connections between them. As a result, we created computational and electronic models of elementary nervous systems, which are based on the principles of functioning of biological nervous systems that we have put forward. Our models demonstrate reactions to external stimuli, and changes in those reactions, similar to the behavior of the simplest biological organisms. The models possess the ability of self-training and retraining in real time, without human intervention and without switching between operation and training modes. In our models, training and memorization take place constantly under the influence of stimuli on the organism. Training proceeds without any interruption or switching of operation modes. Training and the formation of new reflexes occur by means of the formation of new connections between excited neurons, between which the formation of connections is physically possible. Connections are formed without external influence; they are formed under the influence of local causes. A connection is formed between the output and input of two neurons when the difference between the output and input potentials of the excited neurons exceeds a value sufficient to form a new connection. On these grounds, we suggest that the proposed principles truly reflect mechanisms of functioning of biological nervous systems and the brain. In order to confirm the correspondence of the proposed principles to biological nature, we carry out experiments for the study of processes of
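
    The connection-formation rule stated in the record (a link forms from the output of one excited neuron to the input of another when their potential difference exceeds a threshold) can be rendered as a toy sketch; all potentials and the threshold below are invented for illustration only.

        # Toy rendering of the stated rule: both neurons excited, and
        # output-minus-input potential difference above a threshold.
        import itertools

        threshold = 0.5
        # (neuron id, excited?, output potential, input potential)
        neurons = [("n1", True, 1.2, 0.3), ("n2", True, 0.9, 0.4), ("n3", False, 1.5, 0.1)]

        connections = [
            (a, b)
            for (a, ea, out_a, _), (b, eb, _, in_b) in itertools.permutations(neurons, 2)
            if ea and eb and (out_a - in_b) > threshold
        ]
        print(connections)   # [('n1', 'n2'), ('n2', 'n1')]; n3 is not excited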

  17. Modelling of pathologies of the nervous system by the example of computational and electronic models of elementary nervous systems

    International Nuclear Information System (INIS)

    Shumilov, V. N.; Syryamkin, V. I.; Syryamkin, M. V.

    2015-01-01

    The paper puts forward principles of action of devices operating similarly to the nervous system and the brain of biological systems. We propose an alternative method of studying diseases of the nervous system, which may significantly influence prevention, medical treatment, or at least retardation of the development of these diseases. This alternative is to use computational and electronic models of the nervous system. Within this approach, we represent the brain in the form of a huge electrical circuit composed of active units, namely, neuron-like units and connections between them. As a result, we created computational and electronic models of elementary nervous systems, which are based on the principles of functioning of biological nervous systems that we have put forward. Our models demonstrate reactions to external stimuli, and changes in those reactions, similar to the behavior of the simplest biological organisms. The models possess the ability of self-training and retraining in real time, without human intervention and without switching between operation and training modes. In our models, training and memorization take place constantly under the influence of stimuli on the organism. Training proceeds without any interruption or switching of operation modes. Training and the formation of new reflexes occur by means of the formation of new connections between excited neurons, between which the formation of connections is physically possible. Connections are formed without external influence; they are formed under the influence of local causes. A connection is formed between the output and input of two neurons when the difference between the output and input potentials of the excited neurons exceeds a value sufficient to form a new connection. On these grounds, we suggest that the proposed principles truly reflect mechanisms of functioning of biological nervous systems and the brain. In order to confirm the correspondence of the proposed principles to biological nature, we carry out experiments for the study of processes of

  18. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
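
    The indel component of the PIP model is easy to simulate on a single branch, which also illustrates why the Poisson characterization is convenient: ancestral characters survive a branch of length t with probability exp(-mu t), and surviving insertions form a thinned Poisson process. The rates, branch length and ancestral sequence length below are illustrative choices.

        # Single-branch simulation of PIP-style insertions (rate lam) and
        # deletions (rate mu per character), ignoring substitutions.
        import numpy as np

        rng = np.random.default_rng(2)
        lam, mu, t, n_anc = 2.0, 0.5, 1.0, 10

        def simulate_branch():
            # each ancestral character survives the whole branch w.p. exp(-mu*t)
            survivors = rng.random(n_anc) < np.exp(-mu * t)
            # insertion times are uniform on [0, t] given their Poisson count
            times = rng.uniform(0.0, t, rng.poisson(lam * t))
            # an insertion at time s must survive the remaining t - s
            inserted = rng.random(times.size) < np.exp(-mu * (t - times))
            return survivors.sum() + inserted.sum()

        lengths = [simulate_branch() for _ in range(10000)]
        print("mean descendant length:", np.mean(lengths))
        print("theory:", n_anc * np.exp(-mu * t) + lam / mu * (1 - np.exp(-mu * t)))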

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  20. Studies in evolutionary agroecology

    DEFF Research Database (Denmark)

    Wille, Wibke

    of population performance will increase in frequency. Yield, one of the fundamental agronomic variables, is not an individual, but a population characteristic. A farmer wants a high yield per hectare; he is not interested in the performance of individual plants. When individual selection and population...... of Evolutionary Agroecology that the highest yielding individuals do not necessarily perform best as a population. The investment of resources into strategies and structures increasing individual competitive ability carries a cost. If a whole population consists of individuals investing resources to compete...

  1. Towards Adaptive Evolutionary Architecture

    DEFF Research Database (Denmark)

    Bak, Sebastian Holt; Rask, Nina; Risi, Sebastian

    2016-01-01

    This paper presents first results from an interdisciplinary project, in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how...... to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. Architecture-as-it-could-be is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within...

  2. Ab-initio Computation of the Electronic, transport, and Bulk Properties of Calcium Oxide.

    Science.gov (United States)

    Mbolle, Augustine; Banjara, Dipendra; Malozovsky, Yuriy; Franklin, Lashounda; Bagayoko, Diola

    We report results from ab-initio, self-consistent, local density approximation (LDA) calculations of the electronic and related properties of calcium oxide (CaO) in the rock-salt structure. We employed the Ceperley and Alder LDA potential and the linear combination of atomic orbitals (LCAO) formalism. Our calculations are non-relativistic. We implemented the LCAO formalism following the Bagayoko, Zhao, and Williams (BZW) method, as enhanced by Ekuma and Franklin (BZW-EF). The BZW-EF method involves a methodical search for the optimal basis set that yields the absolute minima of the occupied energies, as required by density functional theory (DFT). Our calculated indirect band gap of 6.91 eV, from Γ towards the L point, is in excellent agreement with the experimental value of 6.93-7.7 eV at room temperature (RT). We have also calculated the total (DOS) and partial (pDOS) densities of states as well as the bulk modulus. Our calculated bulk modulus is in excellent agreement with experiment. Work funded in part by the US Department of Energy (DOE), National Nuclear Security Administration (NNSA) (Award No. DE-NA0002630), the National Science Foundation (NSF) (Award No. 1503226), LaSPACE, and LONI-SUBR.

  3. Electron and ion transport equations in computational weakly-ionized plasmadynamics

    International Nuclear Information System (INIS)

    Parent, Bernard; Macheret, Sergey O.; Shneider, Mikhail N.

    2014-01-01

    A new set of ion and electron transport equations is proposed to simulate steady or unsteady quasi-neutral or non-neutral multicomponent weakly-ionized plasmas through the drift–diffusion approximation. The proposed set of equations is advantaged over the conventional one by being considerably less stiff in quasi-neutral regions because it can be integrated in conjunction with a potential equation based on Ohm's law rather than Gauss's law. The present approach is advantaged over previous attempts at recasting the system by being applicable to plasmas with several types of positive ions and negative ions and by not requiring changes to the boundary conditions. Several test cases of plasmas enclosed by dielectrics and of glow discharges between electrodes show that the proposed equations yield the same solution as the standard equations but require 10 to 100 times fewer iterations to reach convergence whenever a quasi-neutral region forms. Further, several grid convergence studies indicate that the present approach exhibits a higher resolution (and hence requires fewer nodes to reach a given level of accuracy) when ambipolar diffusion is present. Because the proposed equations are not intrinsically linked to specific discretization or integration schemes and exhibit substantial advantages with no apparent disadvantage, they are generally recommended as a substitute to the fluid models in which the electric field is obtained from Gauss's law as long as the plasma remains weakly-ionized and unmagnetized

  4. Web-based computational chemistry education with CHARMMing III: Reduction potentials of electron transfer proteins.

    Directory of Open Access Journals (Sweden)

    B Scott Perrin

    2014-07-01

    A module for fast determination of reduction potentials, E°, of redox-active proteins has been implemented in the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org). The free energy of reduction, which is proportional to E°, is composed of an intrinsic contribution due to the redox site and an environmental contribution due to the protein and solvent. Here, the intrinsic contribution is selected from a library of pre-calculated density functional theory values for each type of redox site and redox couple, while the environmental contribution is calculated from a crystal structure of the protein using Poisson-Boltzmann continuum electrostatics. An accompanying lesson demonstrates a calculation of E°. In this lesson, an ionizable residue in a [4Fe-4S]-protein that causes a pH-dependent E° is identified, and the E° of a mutant that would test the identification is predicted. This demonstration is valuable to both computational chemistry students and researchers interested in predicting sequence determinants of E° for mutagenesis.
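
    The bookkeeping described above reduces to a Nernst-type conversion once the two free-energy contributions are known. A minimal sketch with placeholder numbers and an assumed absolute SHE potential (literature values vary around 4.28-4.44 V); this is not code from CHARMMing:

      # E° from a reduction free energy split into intrinsic (DFT) and
      # environmental (Poisson-Boltzmann) parts. All values are placeholders.
      F = 96485.332          # Faraday constant, C/mol
      SHE_ABS = 4.28         # assumed absolute SHE potential, V

      def reduction_potential(dG_intrinsic, dG_environment, n=1):
          """E° in volts vs SHE, from free energies of reduction in J/mol."""
          dG = dG_intrinsic + dG_environment    # total reduction free energy
          return -dG / (n * F) - SHE_ABS        # shift to the SHE reference

      # e.g. intrinsic -400 kJ/mol plus environmental -30 kJ/mol, one electron:
      print(round(reduction_potential(-400e3, -30e3), 3))   # -> ~0.177 V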

  5. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    Science.gov (United States)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the probability density functions (PDFs) of RE. Despite the simplification involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time-consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, significantly reducing the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as the direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under the Contract Numbers ERKJ320 and ERAT377.
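
    As background, the Feynman-Kac connection used above can be stated compactly with a one-dimensional diffusion standing in for the bounce-averaged dynamics (a textbook form, not the paper's full model): the runaway probability u(t,x) = P(X reaches the runaway region by time T | X_t = x) solves the backward Kolmogorov equation

      \frac{\partial u}{\partial t} + b(x)\,\frac{\partial u}{\partial x}
      + \tfrac{1}{2}\,\sigma^{2}(x)\,\frac{\partial^{2} u}{\partial x^{2}} = 0,
      \qquad
      u(T,x) = \mathbf{1}_{\{x \in \Omega_{\mathrm{runaway}}\}},

    so u can be evaluated by sampling trajectories of dX_s = b(X_s) ds + \sigma(X_s) dW_s tied to the terminal condition, rather than by evolving the full distribution forward.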

  6. Analysis of electronic circuits using digital computers; L'analyse des circuits electroniques par les calculateurs numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Tapu, C [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-07-01

    Various programs have been proposed for studying electronic circuits with the help of computers. It is shown here how the ECAP program, developed by IBM, can be used to study the behaviour of an operational amplifier from different points of view: direct-current, alternating-current, and transient-state analysis; optimization of the open-loop gain; and reliability study. (author)

  7. Cardiovascular measurement and cardiac function analysis with electron beam computed tomography in healthy Chinese people (report of 50 cases)

    International Nuclear Information System (INIS)

    Lu Bin; Dai Ruping; Zhang Shaoxiong; Bai Hua; Jing Baolian; Cao Cheng; He Sha; Ren Li

    1998-01-01

    Purpose: To quantitatively measure cardiovascular diameters and functional parameters using electron beam computed tomography (EBCT). Methods: Fifty healthy Chinese subjects (27 men, 23 women; average age 47.7 years) underwent EBCT plain transverse and short-axis enhanced cine scans. The transverse scan was used to measure the diameters of the ascending aorta, descending aorta, pulmonary artery, and left atrium. The cine study was used to measure left ventricular myocardial thickness and to analyze global, sectional, and segmental function of the right and left ventricles. Results: The cardiovascular diameters and cardiac functional parameters were calculated. The diameters and most functional parameters (end-systolic volume, stroke volume, ejection fraction, cardiac output, cardiac index) of men were greater than those of women, although the differences were not statistically significant (P>0.05). However, the differences in end-diastolic volume (EDV) and myocardial mass (MyM) of both ventricles were significant (P<0.01). Conclusion: EBCT is a minimally invasive method for cardiovascular measurement and cardiac function evaluation.

  8. Resistance and relatedness on an evolutionary graph

    Science.gov (United States)

    Maciejewski, Wes

    2012-01-01

    When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph, with individuals as nodes and the possible interactions between them as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area, as highly technical mathematics is often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more to the study of population genetics, however, than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
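
    The resistance side of the correspondence is cheap to compute: with unit resistors on the edges, the effective resistance follows from the pseudoinverse of the graph Laplacian. This is a standard identity, not code from the paper:

      # Effective resistance between all node pairs of a graph, treating each
      # edge as a unit resistor: R[i,j] = Lp[i,i] + Lp[j,j] - 2*Lp[i,j].
      import numpy as np

      def effective_resistance(adjacency):
          A = np.asarray(adjacency, dtype=float)
          L = np.diag(A.sum(axis=1)) - A        # graph Laplacian
          Lp = np.linalg.pinv(L)                # Moore-Penrose pseudoinverse
          d = np.diag(Lp)
          return d[:, None] + d[None, :] - 2 * Lp

      # Example: a 4-cycle; opposite corners see two 2-ohm paths in parallel.
      C4 = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
      print(effective_resistance(C4)[0, 2])     # -> 1.0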

  9. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Overarching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful in advancing recent recommendations made by the Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results Fourteen core principles led at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts covered in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  10. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is focused more on bulk metals, while the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system, illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, of the 244 kg of HDDs treated, 212 kg, comprising mainly aluminum and steel, can finally be recovered through the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be preferable to shredding; however, it remains a technological and logistic challenge for the existing system.

  11. Hiding Electronic Patient Record (EPR) in medical images: A high capacity and computationally efficient technique for e-healthcare applications.

    Science.gov (United States)

    Loan, Nazir A; Parah, Shabir A; Sheikh, Javaid A; Akhoon, Jahangir A; Bhat, Ghulam M

    2017-09-01

    A high-capacity and semi-reversible data hiding scheme based on the Pixel Repetition Method (PRM) and hybrid edge detection for scalable medical images is proposed in this paper. PRM is used to scale up the small-sized image (seed image), and hybrid edge detection ensures that no important edge information is missed. The scaled-up version of the seed image is divided into 2×2 non-overlapping blocks. In each block there is one seed pixel whose status decides the number of bits to be embedded in the remaining three pixels of that block. The Electronic Patient Record (EPR) data are embedded by using least significant bit substitution and intermediate significant bit substitution (ISBS). RC4 encryption is used to add an additional security layer for the embedded EPR data. The proposed scheme has been tested on various medical and general images and compared with some state-of-the-art techniques in the field. The experimental results reveal that the proposed scheme, besides being semi-reversible and computationally efficient, is capable of handling a high payload and as such can be used effectively for electronic healthcare applications.
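
    The substitution step at the heart of such schemes is simple to illustrate. The sketch below shows plain least-significant-bit embedding only; the paper's PRM scaling, hybrid edge detection, ISBS, and RC4 layers are omitted:

      # Hide a bit sequence in the least significant bits of 8-bit pixels,
      # and read it back. Illustrative only; not the paper's full scheme.
      def embed_lsb(pixels, bits):
          out = list(pixels)
          for i, b in enumerate(bits):
              out[i] = (out[i] & 0xFE) | b      # clear the LSB, set payload bit
          return out

      def extract_lsb(pixels, n):
          return [p & 1 for p in pixels[:n]]    # read the first n payload bits

      stego = embed_lsb([130, 77, 200, 41], [1, 0, 1, 1])
      assert extract_lsb(stego, 4) == [1, 0, 1, 1]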

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. Site Availability Monitoring (SAM) and Job Robot submission have been instrumental in site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  13. [COMPUTER ASSISTED DESIGN AND ELECTRON BEAM MELTING RAPID PROTOTYPING METAL THREE-DIMENSIONAL PRINTING TECHNOLOGY FOR PREPARATION OF INDIVIDUALIZED FEMORAL PROSTHESIS].

    Science.gov (United States)

    Liu, Hongwei; Weng, Yiping; Zhang, Yunkun; Xu, Nanwei; Tong, Jing; Wang, Caimei

    2015-09-01

    To study the feasibility of preparing an individualized femoral prosthesis through computer assisted design and electron beam melting rapid prototyping (EBM-RP) metal three-dimensional (3D) printing technology. One adult male left femur specimen was scanned with 64-slice spiral CT; the tomographic image data were imported into Mimics 15.0 software to reconstruct a 3D femoral model, and the 3D model of the individualized femoral prosthesis was then designed with UG 8.0 software. Finally, the 3D model data were imported into an EBM-RP metal 3D printer to print the individualized sleeve. According to the 3D model of the individualized prosthesis, a customized sleeve was successfully prepared through EBM-RP metal 3D printing and assembled with the standard stem component of the SR modular femoral prosthesis to make the individualized femoral prosthesis. A customized femoral prosthesis accurately matching the metaphyseal cavity can be designed through thin-slice CT scanning and computer assisted design technology. A personalized titanium alloy prosthesis with a complex 3D shape, porous surface, and good metaphyseal fit can be manufactured by EBM-RP metal 3D printing, a convenient, rapid, and accurate technology.

  14. Proposal for an ad hoc computer network in the military electronic systems department at the military academy applying Bluetooth technology

    Directory of Open Access Journals (Sweden)

    Miroslav R. Terzić

    2011-01-01

    The historical development of the Bluetooth module is given in the introduction of this paper, together with the importance of the Bluetooth standard for short-range wireless connections. The organization of the Department of Military Electronic Systems is presented with its areas of duty, subordinate sections, and deployment. The concept of a local area network for this Department using Bluetooth technology includes the network topology and operating regimes, based on the main characteristics and technical specifications of Bluetooth connections. The Department's dispersed computer network is proposed as a scatternet in which one piconet includes the Head of Department and the Heads of Sections, while other piconets are formed from the Heads of Sections and their subordinates. The security aspect of the presented network deals with the basic categories of computer network attacks, protection methods, and aspects. The paper concludes with some recommendations for the local area network using Bluetooth technology with respect to its economic and security aspects as well as the managing principles of the Department.

  15. Experiments and Computational Theory for Electrical Breakdown in Critical Components: THz Imaging of Electronic Plasmas.

    Energy Technology Data Exchange (ETDEWEB)

    Zutavern, Fred J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hjalmarson, Harold P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bigman, Verle Howard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gallegos, Richard Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-11-01

    This report describes the development of ultra-short pulse laser (USPL) induced terahertz (THz) radiation to image electronic plasmas during electrical breakdown. The technique uses three pulses from two USPLs to (1) trigger the breakdown, (2) create a 2 picosecond (ps, 10⁻¹² s) THz pulse to illuminate the breakdown, and (3) record the THz image of the breakdown. During this three-year internal research program, sub-picosecond jitter timing for the lasers, THz generation, high-bandwidth (BW) diagnostics, and THz image acquisition were demonstrated. High-intensity THz radiation was optically induced in a pulse-charged gallium arsenide photoconductive switch. The radiation was collected, transported, concentrated, and co-propagated through an electro-optic crystal with an 800 nm USPL pulse whose polarization was rotated by the spatially varying electric field of the THz image. The polarization-modulated USPL pulse was then passed through a polarizer, and the resulting spatially varying intensity was detected in a high-resolution digital camera. Single-shot images had a signal-to-noise ratio of ~3:1. The signal-to-noise ratio was improved to ~30:1 with several experimental techniques and by averaging the THz images from ~4000 laser pulses internally and externally with the camera and the acquisition system (40 pulses per readout). THz shadows of metallic films and objects were also recorded with this system to demonstrate free-carrier absorption of the THz radiation and to improve image contrast and resolution. These 2 ps THz pulses were created and resolved with 100 femtosecond (fs, 10⁻¹⁵ s) USPL pulses. Thus this technology has the capability to time-resolve extremely fast repetitive or single-shot phenomena, such as those that occur during the initiation of electrical breakdown. The goal of imaging electrical breakdown was not reached during this three-year project. However, plans to achieve this goal as part of a follow-on project are described in this document.

  16. Electronic Publishing.

    Science.gov (United States)

    Lancaster, F. W.

    1989-01-01

    Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…

  17. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. The occurrence and nature of such phase transitions depend critically on the fitness landscape. The fitness landscape, being tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to pose a major challenge for understanding evolutionary processes. (author)
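
    A concrete instance of this phase transition is the error threshold for a single-peak fitness landscape (standard quasispecies theory, added here for orientation): with per-digit copying fidelity q, sequence length \nu, and superiority \sigma of the master sequence, selection keeps the population localized only while

      q^{\nu}\,\sigma > 1
      \quad\Longleftrightarrow\quad
      \nu < \nu_{\max} \approx \frac{\ln\sigma}{1-q},

    beyond which the population delocalizes over sequence space.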

  18. Evolutionary games under incompetence.

    Science.gov (United States)

    Kleshnina, Maria; Filar, Jerzy A; Ejov, Vladimir; McKerral, Jody C

    2018-02-26

    The adaptation of a species to a new environment is a significant area of study in biology. As part of natural selection, adaptation is a mutation process that improves the survival skills and reproductive functions of species. Here, we investigate this process by combining the idea of incompetence with evolutionary game theory. In the sense of evolution, incompetence and training can be interpreted as a special learning process. Focusing on the social side of the problem, we analyze the influence of incompetence on the behavior of species. We introduce an incompetence parameter into a learning function in a single-population game and analyze its effect on the outcome of the replicator dynamics. Incompetence can change the outcome of the game and its dynamics, indicating its significance within what are inherently imperfect natural systems.
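
    For reference, the replicator dynamics referred to above has the standard form below. The incompetence-modified payoff shown is one natural reading (an execution-error matrix mixing intended and realized strategies), offered as a sketch rather than the paper's exact notation:

      \dot{x}_i = x_i\left[(\tilde{A}x)_i - x^{\top}\tilde{A}x\right],
      \qquad
      \tilde{A}(\lambda) = Q(\lambda)\,A\,Q(\lambda)^{\top},

    where A is the payoff matrix and Q(\lambda) is a row-stochastic matrix whose entry Q_{ij} is the probability that a player intending strategy i executes strategy j, with \lambda interpolating between incompetence and perfect execution.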

  19. Evolutionary economics and industry location

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2003-01-01

    This paper aims to provide the outlines of an evolutionary economic geography of industry location. We discuss two evolutionary explanations of industry location, that is, one that concentrates on spin-offs, and one that focuses attention on knowledge and agglomeration economies. We claim that both

  20. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    These discussions included, among others, the possible consequences of non-DNA-based inheritance (epigenetics and cultural evolution), niche construction, and developmental mechanisms for our understanding of the evolutionary process, speciation, complexity in biology, and the construction of a formal evolutionary theory.

  1. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    We are delighted to bring to the readers a set of peer-reviewed papers on evolutionary biology, published as a special issue of the Journal of Genetics. These papers emanated from ruminations upon and discussions at the Foundations of Evolutionary Theory: the Ongoing Synthesis meeting at Coorg, India, in February ...

  2. Fixation Time for Evolutionary Graphs

    Science.gov (United States)

    Nie, Pu-Yan; Zhang, Pei-Ai

    Evolutionary graph theory (EGT) was proposed by Lieberman et al. in 2005. EGT has been successful in explaining biological evolution and some social phenomena. It is extremely important to consider the time to fixation in EGT for many practical problems, including evolutionary theory and the evolution of cooperation. This study characterizes the time to asymptotically reach fixation.
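
    For orientation, the benchmark against which evolutionary-graph results are usually stated is the Moran process on the complete graph, where a single mutant of relative fitness r in a population of size N fixes with probability

      \rho = \frac{1 - 1/r}{1 - 1/r^{N}};

    evolutionary graph theory asks how graph structure modifies both \rho and the expected time to fixation. The formula is standard background, not a result of this paper.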

  3. Applications of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.; Puranam, Krishna Kishore; Ravi Kumar Jain B., xx

    2008-01-01

    This paper is written as the first chapter of an edited volume on evolutionary economics and economic geography (Frenken, K., editor, Applied Evolutionary Economics and Economic Geography, Cheltenham: Edward Elgar, expected publication date February 2007). The paper reviews empirical applications of

  4. Optimizing research in symptomatic uterine fibroids with development of a computable phenotype for use with electronic health records.

    Science.gov (United States)

    Hoffman, Sarah R; Vines, Anissa I; Halladay, Jacqueline R; Pfaff, Emily; Schiff, Lauren; Westreich, Daniel; Sundaresan, Aditi; Johnson, La-Shell; Nicholson, Wanda K

    2018-06-01

    Women with symptomatic uterine fibroids can report a myriad of symptoms, including pain, bleeding, infertility, and psychosocial sequelae. Optimizing fibroid research requires the ability to enroll populations of women with image-confirmed symptomatic uterine fibroids. Our objective was to develop an electronic health record-based algorithm to identify women with symptomatic uterine fibroids for a comparative effectiveness study of medical or surgical treatments on quality-of-life measures. Using an iterative process and text-mining techniques, an effective computable phenotype algorithm, composed of demographic, clinical, and laboratory characteristics, was developed with reasonable performance. Such algorithms provide a feasible, efficient way to identify populations of women with symptomatic uterine fibroids for the conduct of large traditional or pragmatic trials and observational comparative effectiveness studies. Symptomatic uterine fibroids, due to menorrhagia, pelvic pain, bulk symptoms, or infertility, are a source of substantial morbidity for reproductive-age women. Comparing Treatment Options for Uterine Fibroids is a multisite registry study to compare the effectiveness of hormonal and surgical fibroid treatments on women's perceptions of their quality of life. Electronic health record-based algorithms are able to identify large numbers of women with fibroids, but additional work is needed to develop algorithms that can identify women with symptomatic fibroids and so optimize fibroid research. We sought to develop an efficient electronic health record-based algorithm that can identify women with symptomatic uterine fibroids in a large health care system for recruitment into large-scale observational and interventional research in fibroid management. We developed and assessed the accuracy of three algorithms to identify patients with symptomatic fibroids using an iterative approach. The data source was the Carolina Data Warehouse for

  5. Evolutionary Explanations of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Igor Kardum

    2008-12-01

    This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. The second part describes the selective pressures and evolved adaptations behind the contemporary epidemic of obesity, as well as the differences in dietary regimes and lifestyle between modern humans and their ancestors. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented, together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants derived from kin selection theory and the model of parental manipulation are elaborated. The sexual competition hypothesis of eating disorders, the adapted-to-flee-famine hypothesis, and explanations based on the concept of social attention holding power and the need to belong are also covered. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.

  6. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D [ORNL; Disney, Adam [University of Tennessee (UT); Singh, Susheela [North Carolina State University (NCSU), Raleigh; Bruer, Grant [University of Tennessee (UT); Mitchell, John Parker [University of Tennessee (UT); Klibisz, Aleksander [University of Tennessee (UT); Plank, James [University of Tennessee (UT)

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the United States' largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pis. We also focus on how to improve the EO by evaluating commonality in higher-performing neural networks, and present the results of a study that evaluates the EO performed by Titan.
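
    As a concrete picture of the loop being parallelized, here is a minimal evolutionary-optimization skeleton with a parallel fitness-evaluation step. Everything in it is generic and illustrative; it is not the DANNA representation or its operators:

      # Generic EO loop; the expensive step, fitness evaluation, is farmed out
      # to a process pool, mirroring the parallel structure discussed above.
      import random
      from multiprocessing import Pool

      def fitness(genome):                       # stand-in objective
          return -sum((g - 0.5) ** 2 for g in genome)

      def mutate(genome, rate=0.1):
          return [g + random.gauss(0, rate) for g in genome]

      def evolve(pop_size=64, genome_len=16, generations=50):
          pop = [[random.random() for _ in range(genome_len)]
                 for _ in range(pop_size)]
          with Pool() as pool:
              for _ in range(generations):
                  scores = pool.map(fitness, pop)          # parallel evaluation
                  ranked = [g for _, g in sorted(zip(scores, pop),
                                                 key=lambda t: t[0],
                                                 reverse=True)]
                  elite = ranked[:pop_size // 4]           # truncation selection
                  pop = elite + [mutate(random.choice(elite))
                                 for _ in range(pop_size - len(elite))]
          return max(pop, key=fitness)

      if __name__ == "__main__":
          print(fitness(evolve()))               # best score found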

  7. An Angiotensin II type 1 receptor activation switch patch revealed through Evolutionary Trace analysis

    DEFF Research Database (Denmark)

    Bonde, Marie Mi; Yao, Rong; Ma, Jian-Nong

    2010-01-01

    to be completely resolved. Evolutionary Trace (ET) analysis is a computational method that identifies clusters of functionally important residues by integrating information on evolutionarily important residue variations with receptor structure. Combined with known mutational data, ET predicted a patch of residues......) displayed phenotypes associated with a changed activation state, such as increased agonist affinity or basal activity, promiscuous activation, or constitutive internalization, highlighting the importance of testing different signaling pathways. We conclude that this evolutionarily important patch mediates......

  8. Comparison of electron beam computed tomography and exercise electrocardiography in detecting coronary artery disease in the elderly

    International Nuclear Information System (INIS)

    Inoue, Shinji; Mitsunami, Kenichi; Kinoshita, Masahiko

    1998-01-01

    Although exercise electrocardiography (ECG) is a useful noninvasive screening test for coronary artery disease (CAD), one prerequisite for ECG screening is that the patient be able to exercise enough to evoke myocardial ischemia. Thus, exercise ECG may not be suitable for some elderly people with CAD who cannot exercise sufficiently. We compared electron beam computed tomography (EBCT) with exercise ECG for detecting CAD in 196 patients (mean age, 58.4 ± 12.5 [standard deviation] years) who had undergone coronary angiography. Using the angiographic findings as the "gold standard", we found that the sensitivity, specificity, positive predictive value, and negative predictive value were 88%, 77%, 89%, and 77%, respectively, for EBCT, and 66%, 72%, 83%, and 52%, respectively, for exercise ECG. Although the results were similar when the subjects were divided into different age groups, the negative predictive value of exercise ECG among older patients was very low. These findings suggest that EBCT is superior to exercise ECG in detecting CAD in the elderly. (author)

  9. Tracheomalacia before and after aortosternopexy: dynamic and quantitative assessment by electron-beam computed tomography with clinical correlation

    International Nuclear Information System (INIS)

    Kao, S.C.S.; Kimura, K.; Smith, W.L.; Sato, Y.

    1995-01-01

    To correlate the dynamics of tracheal collapse with clinical upper airway obstruction before and after aortosternopexy, seven boys and three girls (mean age, 10 months) underwent dynamic evaluation of the trachea by electron-beam computed tomography (EBCT). The site, extent, and severity of collapse were correlated with symptomatology and details of the operative procedure. When >50% area collapse was used as the criterion for tracheomalacia, segmental involvement occurred above the aortic arch in all patients, extending to the aortic arch level in only four. Tracheomalacia involved two or fewer 8-mm levels in seven patients and more than two levels in three. Eight patients underwent one aortosternopexy procedure, resulting in clinical improvement in six, which correlated well with EBCT findings. Of the remaining two patients, who showed neither clinical nor radiographic improvement after a single aortosternopexy, one required operative repair of a vascular ring and the other continued to have recurrent respiratory tract infections. On the basis of EBCT findings, two patients required additional innominate arteriopexies: one improved, and the other remained symptomatic, requiring tracheostomy. EBCT is a noninvasive modality that allows preoperative diagnosis of tracheomalacia. More importantly, the operative decision and technique can be guided by an objective and quantitative assessment of tracheal collapse. (orig.)

  10. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, was compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), Athlon MP, and Athlon XP (with the "Palomino" core) systems as well as Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) were each employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library was linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theory, and MP2 calculations were run for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, with both AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance of a single CPU is potentially as good as that of an Alpha 21264A workstation or an SGI supercomputer. The SPECfp2000 floating-point marks show trends similar to the GAUSSIAN 98 results.

  11. ETRAN 1999: Society for Electronics, Telecommunications, Computers, Automation and Nuclear Engineering. Section for Nuclear Techniques and Technology. Proceedings of the XLIII Conference. Vol IV

    International Nuclear Information System (INIS)

    Spasojevic, D.; Smiljanic, M.; Bozic, D.; Stankovic, D.

    1999-01-01

    The XLIII ETRAN Conference of the Society for Electronics, Telecommunications, Computers, Automation and Nuclear Engineering was held on 20-22 September 1999. The Proceedings of the Conference contain 19 papers of the Section for Nuclear Techniques and Technology, presented in three sessions.

  12. ETRAN 2002: Society for Electronics, Telecommunications, Computers, Automation and Nuclear Engineering. Section for Nuclear Techniques and Technology. Proceedings of the XLVI Conference. Vol IV

    International Nuclear Information System (INIS)

    Milosevic, M.; Jaksic, Z.; Bozic, D.; Potkonjak, V.

    2002-01-01

    The XLVI ETRAN Conference of the Society for Electronics, Telecommunications, Computers, Automation and Nuclear Engineering was held on 4-7 June 2002. The Proceedings of the Conference contain 14 papers of the Section for Nuclear Techniques and Technology, presented in the following three sessions: 1. Actual problems in nuclear technologies; 2. Accelerator and reactor systems; and 3. Radiation protection and ionizing radiation uses.

  13. Chemical evolutionary games.

    Science.gov (United States)

    Aristotelous, Andreas C; Durrett, Richard

    2014-05-01

    Inspired by the use of hybrid cellular automata in modeling cancer, we introduce a generalization of evolutionary games in which cells produce and absorb chemicals, and the chemical concentrations dictate the death rates of cells and their fitnesses. Our long-term aim is to understand how the details of the interactions in a system with n species and m chemicals translate into the qualitative behavior of the system. Here, we study two simple 2×2 games with two chemicals and revisit the two- and three-species versions of the one-chemical colicin system studied earlier by Durrett and Levin (1997). We find that in the 2×2 examples, the behavior of our new spatial model can be predicted from that of the mean-field differential equation using ideas of Durrett and Levin (1994). However, in the three-species colicin model, the system with diffusion does not exhibit the coexistence that occurs in the lattice model in which sites interact only with their nearest neighbors.

  14. Evolutionary and developmental modules.

    Science.gov (United States)

    Lacquaniti, Francesco; Ivanenko, Yuri P; d'Avella, Andrea; Zelik, Karl E; Zago, Myrka

    2013-01-01

    The identification of biological modules at the systems level often follows top-down decomposition of a task goal, or bottom-up decomposition of multidimensional data arrays into basic elements or patterns representing shared features. These approaches traditionally have been applied to mature, fully developed systems. Here we review some results from two other perspectives on modularity, namely the developmental and evolutionary perspective. There is growing evidence that modular units of development were highly preserved and recombined during evolution. We first consider a few examples of modules well identifiable from morphology. Next we consider the more difficult issue of identifying functional developmental modules. We dwell especially on modular control of locomotion to argue that the building blocks used to construct different locomotor behaviors are similar across several animal species, presumably related to ancestral neural networks of command. A recurrent theme from comparative studies is that the developmental addition of new premotor modules underlies the postnatal acquisition and refinement of several different motor behaviors in vertebrates.

  15. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  16. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    Science.gov (United States)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The input fed to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm performs the global search of the search space, and the back-propagation algorithm performs the local search. The results were computed by applying this approach to the recognition of 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying results with relatively scarce input data, and the hybrid evolutionary back-propagation algorithm exhibits a promising improvement in convergence.
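
    A minimal illustration of the hybrid idea, global evolutionary search over weight vectors plus local gradient refinement, under a generic differentiable loss (this is not the paper's network, data, or operators):

      # Memetic sketch: evolution proposes weight vectors; a few numerical
      # gradient-descent steps play the role of back-propagation refinement.
      import random

      def loss(w):                                # stand-in for network error
          return sum((wi - 1.0) ** 2 for wi in w)

      def grad(w, eps=1e-5):                      # numerical gradient of loss
          base = loss(w)
          g = []
          for i in range(len(w)):
              wp = list(w)
              wp[i] += eps
              g.append((loss(wp) - base) / eps)
          return g

      def refine(w, steps=5, lr=0.1):             # local search
          for _ in range(steps):
              w = [wi - lr * gi for wi, gi in zip(w, grad(w))]
          return w

      pop = [[random.uniform(-2, 2) for _ in range(8)] for _ in range(20)]
      for _ in range(30):                         # global evolutionary search
          pop = sorted((refine(w) for w in pop), key=loss)[:10]
          pop += [[wi + random.gauss(0, 0.3) for wi in w]
                  for w in random.sample(pop, 10)]
      print(round(loss(min(pop, key=loss)), 6))   # approaches 0.0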

  17. Monte Carlo computation of Bremsstrahlung intensity and energy spectrum from a 15 MV linear electron accelerator tungsten target to optimise LINAC head shielding

    International Nuclear Information System (INIS)

    Biju, K.; Sharma, Amiya; Yadav, R.K.; Kannan, R.; Bhatt, B.C.

    2003-01-01

    Knowledge of the exact photon intensity and energy distributions from the target of an electron accelerator is necessary when designing the shielding for the accelerator head from a radiation safety point of view. The intensity and energy distribution of the photon spectrum from a 0.4 cm thick tungsten target were computed in different angular directions for 15 MeV electrons using the validated Monte Carlo code MCNP4A. Similar results were computed for 30 MeV electrons and found to agree with data available in the literature. These graphs and the tenth-value thickness (TVT) values in lead help suggest an optimum shielding thickness for a 15 MV Linac head. (author)
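
    The shielding arithmetic implied by TVT values is simple: the required barrier is the number of decades of attenuation times the tenth-value thickness. A small sketch follows; the TVT and dose figures are placeholders, not values from the paper:

      # Shield thickness from a tenth-value thickness (TVT); placeholders only.
      import math

      def shield_thickness(dose_unshielded, dose_limit, tvt_cm):
          decades = math.log10(dose_unshielded / dose_limit)   # decades needed
          return decades * tvt_cm                              # thickness, cm

      # e.g. attenuate 1e3 mSv/h to 2e-2 mSv/h with an assumed TVT of 5.7 cm:
      print(round(shield_thickness(1e3, 2e-2, 5.7), 1))        # -> 26.8 cm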

  18. Dose field simulation for products irradiated by electron beams: formulation of the problem and its step by step solution with EGS4 computer code

    International Nuclear Information System (INIS)

    Rakhno, I.L.; Roginets, L.P.

    1999-01-01

    When performing radiation treatment of products using an electron beam, much time and money must be spent on numerous measurements to make an optimal choice of treatment mode. Direct simulation of the radiation treatment by means of the EGS4 computer code fails to describe such measurement results correctly. In this paper a multi-step radiation treatment planning procedure is suggested, which consists in fitting the EGS4 simulation results to reference measurement results and then using the fitted electron beam parameters, among others, in subsequent computer simulations. It is shown that the fitting procedure should be performed separately for each material or product type. The suggested procedure allows measurements to be replaced by computer simulations and therefore significantly reduces the time and money required for such measurements. (author)

  19. Experimental study of matrix carbon field-emission cathodes and computer aided design of electron guns for microwave power devices, exploring these cathodes

    International Nuclear Information System (INIS)

    Grigoriev, Y.A.; Petrosyan, A.I.; Penzyakov, V.V.; Pimenov, V.G.; Rogovin, V.I.; Shesterkin, V.I.; Kudryashov, V.P.; Semyonov, V.C.

    1997-01-01

    The experimental study of matrix carbon field-emission cathodes (MCFECs), which has led to stable operation of the cathodes at emission currents up to 100 mA, is described. A method of computer aided design of TWT electron guns (EGs) with MCFECs, based on the results of the experimental study of MCFEC emission, is presented. The experimental MCFEC emission characteristics are used to define the field gain coefficient K and the cathode effective emission area S_eff. The EG program computes the electric field on the MCFEC surface, multiplies it by K, and uses the Fowler-Nordheim law and the S_eff value to calculate the MCFEC current; the electron trajectories are computed as well.
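
    The current evaluation described above follows directly from the Fowler-Nordheim law. The sketch below uses the standard elementary FN coefficients; the work function and the example values of field, K, and S_eff are placeholders, not the paper's data:

      # Fowler-Nordheim emission current from a macroscopic field E_macro,
      # field gain factor K, and effective emission area S_eff (placeholders).
      import math

      A_FN = 1.541434e-6      # A eV V^-2
      B_FN = 6.830890e9       # V m^-1 eV^-3/2

      def fn_current(E_macro, K, S_eff, phi=4.5):
          """Emission current in A; E_macro in V/m, S_eff in m^2, phi in eV."""
          E = K * E_macro                        # local field at the emitter
          J = (A_FN / phi) * E**2 * math.exp(-B_FN * phi**1.5 / E)   # A/m^2
          return J * S_eff

      # e.g. 5 MV/m applied, K = 700, S_eff = 1e-10 m^2:
      print(fn_current(5e6, 700, 1e-10))         # on the order of microamps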

  20. 1D numerical simulation of charge trapping in an insulator submitted to an electron beam irradiation. Part I: Computation of the initial secondary electron emission yield

    International Nuclear Information System (INIS)

    Aoufi, A.; Damamme, G.

    2011-01-01

    The aim of this work is to study, by numerical simulation, a mathematical model describing charge trapping during initial charge injection in an insulator submitted to electron beam irradiation. A two-flux method described by a set of two stationary transport equations is used to split the electron current j_e(z) into coupled forward j_e+(z) and backward j_e-(z) currents such that j_e(z) = j_e+(z) - j_e-(z). The sparse algebraic linear system resulting from the vertex-centered finite-volume discretization scheme is solved by an iterative decoupled fixed-point method that involves the direct inversion of a bidiagonal matrix. The sensitivity of the initial secondary electron emission yield with respect to the energy of the incident primary electron beam, that is, the penetration depth of the incident beam, and to the electron cross-sections (absorption and diffusion) is investigated by numerical simulations. (authors)
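
    The "direct inversion of a bidiagonal matrix" mentioned above amounts to a single substitution sweep. A minimal sketch (generic linear algebra, not the authors' code):

      # Solve L x = r for lower-bidiagonal L: diagonal d[i], subdiagonal s[i]
      # coupling x[i-1] into row i. One forward sweep, O(n) operations.
      def solve_lower_bidiagonal(d, s, r):
          n = len(d)
          x = [0.0] * n
          x[0] = r[0] / d[0]
          for i in range(1, n):
              x[i] = (r[i] - s[i] * x[i - 1]) / d[i]
          return x

      # Tiny check: L = [[2, 0], [1, 3]], r = [2, 5]  ->  x = [1.0, 1.333...]
      print(solve_lower_bidiagonal([2.0, 3.0], [0.0, 1.0], [2.0, 5.0]))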