WorldWideScience

Sample records for computation in quantitative genetics

  1. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibáñez-Escriche, Noelia; Sorensen, Daniel

    both in size and regarding the inferences concerning the genetic covariance parameters. Section 2 discusses general strategies for obtaining efficient MCMC algorithms, while Section 3 considers these strategies in the specific context of the San Cristobal-Gaudy et al. (1998) model. Section 4 presents...... be implemented relatively straightforwardly. The assumptions of normality, linearity, and variance homogeneity are in many cases not valid. One may then consider generalized linear mixed models where the genetic random effects enter at the level of the linear predictor. San Cristobal-Gaudy et al. (1998) proposed...... likelihood inference is complicated since it is not possible to evaluate the likelihood function explicitly, and conventional Gibbs sampling is difficult since the full conditional distributions are no longer of standard form. The aim of this paper is to discuss strategies to obtain efficient Markov chain...

  2. Quantum Genetic Algorithms for Computer Scientists

    OpenAIRE

    Rafael Lahoz-Beltra

    2016-01-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms, i.e., mutation, crossover, etc. and population dynamical processes such as reproduction, selection, etc. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Geneti...

  3. Quantum Genetic Algorithms for Computer Scientists

    Directory of Open Access Journals (Sweden)

    Rafael Lahoz-Beltra

    2016-10-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms, i.e., mutation, crossover, etc., and population dynamical processes such as reproduction, selection, etc. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Genetic Algorithms” (QGAs). In this review, we present a discussion of the future potential, pros, and cons of this new class of GAs. The review is oriented towards computer scientists interested in QGAs, “avoiding” the possible difficulties of quantum-mechanical phenomena.
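
    For readers coming from outside evolutionary computation, the classical GA loop summarized above (selection, crossover and mutation over a population) can be made concrete with a short sketch. The Python snippet below is a generic illustration only, not code from the reviewed paper; the one-max fitness function and all parameter values are arbitrary choices for demonstration.

```python
import random

random.seed(0)

GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 30, 50, 60, 0.02

def fitness(genome):
    """One-max toy objective: count of 1-bits (stand-in for a real problem)."""
    return sum(genome)

def tournament(pop, k=3):
    """Selection: return the best of k randomly chosen individuals."""
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    """Single-point crossover."""
    cut = random.randrange(1, GENOME_LEN)
    return p1[:cut] + p2[cut:]

def mutate(genome):
    """Independent bit-flip mutation."""
    return [b ^ (random.random() < MUT_RATE) for b in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print("best fitness:", fitness(best), "out of", GENOME_LEN)
```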

  4. GENMAP--A Microbial Genetics Computer Simulation.

    Science.gov (United States)

    Day, M. J.; And Others

    1985-01-01

    An interactive computer program in microbial genetics is described. The simulation allows students to work at their own pace and develop understanding of microbial techniques as they choose donor bacterial strains, specify selective media, and interact with demonstration experiments. Sample questions and outputs are included. (DH)

  5. Genetic algorithms in computer aided inductor design

    OpenAIRE

    Jean Fivaz; Willem A. Cronjé

    2004-01-01

    The goal of this investigation is to determine the advantages of using genetic algorithms in computer-aided design as applied to inductors.  These advantages are exploited in design problems with a number of specifications and constraints, as encountered in power electronics during practical inductor design. The design tool should be able to select components, such as cores and wires, from databases of available components, and evaluate these choices based on the components’ characteristic d...

  6. Systolic array IC for genetic computation

    Science.gov (United States)

    Anderson, D.

    1991-01-01

    Measuring similarities between large sequences of genetic information is a formidable task requiring enormous amounts of computer time. Geneticists claim that nearly two months of CRAY-2 time are required to run a single comparison of the known database against the new bases that will be found this year, and more than a CRAY-2 year for next year's genetic discoveries, and so on. The DNA IC, designed at HP-ICBD in cooperation with the California Institute of Technology and the Jet Propulsion Laboratory, is being implemented in order to move the task of genetic comparison onto workstations and personal computers, while vastly improving performance. The chip is a systolic (pumped) array composed of 16 processors, control logic, and global RAM, totaling 400,000 FETs. At 12 MHz, each chip performs 2.7 billion 16-bit operations per second. Using 35 of these chips in series on one PC board (performing nearly 100 billion operations per second), a sequence of 560 bases can be compared against the eventual total genome of 3 billion bases, in minutes--on a personal computer. While the DNA chip was designed for genetic research, other disciplines requiring similarity measurements between strings of 7-bit encoded data could make use of this chip as well. Cryptography and speech recognition are two examples. A mix of full custom design and standard cells, in CMOS34, was used to achieve these goals. Innovative test methods were developed to enhance controllability and observability in the array. This paper describes these techniques as well as the chip's functionality. This chip was designed in the 1989-90 timeframe.
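
    The summary above does not spell out the similarity algorithm the chip implements. As a point of reference, the serial Python sketch below computes the kind of dynamic-programming similarity score (a Smith-Waterman-style local alignment score, with an assumed scoring scheme) that systolic sequence-comparison arrays are typically built to accelerate; hardware of this kind streams such a recurrence through its processing elements rather than looping as done here.

```python
def similarity(a, b, match=2, mismatch=-1, gap=-2):
    """Smith-Waterman-style local alignment score (serial reference version).

    The scoring parameters are illustrative assumptions; an accelerator would
    evaluate the same recurrence across an array of processing elements.
    """
    prev = [0] * (len(b) + 1)          # previous row of the DP matrix
    best = 0
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            score = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(0,                      # local alignment: never negative
                         prev[j - 1] + score,    # match / mismatch
                         prev[j] + gap,          # gap in b
                         cur[j - 1] + gap)       # gap in a
            best = max(best, cur[j])
        prev = cur
    return best

print(similarity("ACGTGCA", "ACGTCCA"))
```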

  7. Investigating European genetic history through computer simulations.

    Science.gov (United States)

    Currat, Mathias; Silva, Nuno M

    2013-01-01

    The genetic diversity of Europeans has been shaped by various evolutionary forces including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones attribute a greater contribution to pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This result of a substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.

  8. Genetic algorithms in computer aided inductor design

    Directory of Open Access Journals (Sweden)

    Jean Fivaz

    2004-09-01

    The goal of this investigation is to determine the advantages of using genetic algorithms in computer-aided design as applied to inductors. These advantages are exploited in design problems with a number of specifications and constraints, as encountered in power electronics during practical inductor design. The design tool should be able to select components, such as cores and wires, from databases of available components, and evaluate these choices based on the components’ characteristic data read from a database of manufacturers’ data-sheets. The proposed design must always be practically realizable, as close to the desired specifications as possible and within any specified constraints.

  9. Use of Computer Simulations in Microbial and Molecular Genetics.

    Science.gov (United States)

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  10. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibáñez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional...
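
    To make the model concrete: with trait observations y, overall mean mu, genetic values a and residuals e, the basic animal model is y = mu + a + e with a ~ N(0, sigma_a^2 * A) and e ~ N(0, sigma_e^2 * I), where A is the pedigree-derived relationship matrix. The sketch below is a minimal Gibbs sampler for this conjugate Gaussian case, written for illustration only; the simulated half-sib data, the inverse-gamma priors and all parameter values are assumptions, not material from the paper, whose focus is on more efficient MCMC strategies for settings where such simple full conditionals are unavailable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate toy data: 10 half-sib families of 4 (relationship 0.25 within families).
fam = np.full((4, 4), 0.25) + 0.75 * np.eye(4)
A = np.kron(np.eye(10), fam)                   # 40x40 relationship matrix
n = A.shape[0]
true_sa2, true_se2, true_mu = 2.0, 1.0, 10.0
a_true = rng.multivariate_normal(np.zeros(n), true_sa2 * A)
y = true_mu + a_true + rng.normal(0.0, np.sqrt(true_se2), n)

A_inv = np.linalg.inv(A)
mu, a, sa2, se2 = y.mean(), np.zeros(n), 1.0, 1.0
alpha0, beta0 = 2.0, 1.0                       # weak inverse-gamma prior (assumed)

def inv_gamma(shape, scale):
    """Draw from InvGamma(shape, scale) via the reciprocal of a gamma draw."""
    return 1.0 / rng.gamma(shape, 1.0 / scale)

samples = []
for it in range(2000):
    # breeding values: multivariate normal full conditional
    prec = np.eye(n) / se2 + A_inv / sa2
    chol = np.linalg.cholesky(prec)
    mean = np.linalg.solve(prec, (y - mu) / se2)
    a = mean + np.linalg.solve(chol.T, rng.standard_normal(n))
    # overall mean (flat prior)
    mu = rng.normal((y - a).mean(), np.sqrt(se2 / n))
    # variance components: inverse-gamma full conditionals
    sa2 = inv_gamma(alpha0 + n / 2, beta0 + a @ A_inv @ a / 2)
    se2 = inv_gamma(alpha0 + n / 2, beta0 + (y - mu - a) @ (y - mu - a) / 2)
    if it >= 500:                              # discard burn-in
        samples.append((sa2, se2))

print("posterior means (sigma_a^2, sigma_e^2):", np.mean(samples, axis=0))
```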

  11. Teaching Mendelian Genetics with the Computer.

    Science.gov (United States)

    Small, James W., Jr.

    Students in general undergraduate courses in both biology and genetics seem to have great difficulty mastering the basic concepts of Mendelian Genetics and solving even simple problems. In an attempt to correct this situation, students in both courses at Rollins College were introduced to three simulation models of the genetics of the fruit…

  12. Utility of computer simulations in landscape genetics

    Science.gov (United States)

    Bryan K. Epperson; Brad H. McRae; Kim Scribner; Samuel A. Cushman; Michael S. Rosenberg; Marie-Josee Fortin; Patrick M. A. James; Melanie Murphy; Stephanie Manel; Pierre Legendre; Mark R. T. Dale

    2010-01-01

    Population genetics theory is primarily based on mathematical models in which spatial complexity and temporal variability are largely ignored. In contrast, the field of landscape genetics expressly focuses on how population genetic processes are affected by complex spatial and temporal environmental heterogeneity. It is spatially explicit and relates patterns to...

  14. Genetic crossing vs cloning by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, S. [Cologne Univ., Koeln (Germany)]

    1997-06-01

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  15. Genetic Crossing vs Cloning by Computer Simulation

    Science.gov (United States)

    Dasgupta, Subinay

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.
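
    The Penna bit-string model referred to above is compact enough to sketch. In the minimal haploid Python version below (an illustration with assumed parameter values, not necessarily the exact variant used in the paper), each genome bit represents a deleterious mutation expressed at the corresponding age, an individual dies once a threshold number of expressed mutations has accumulated, a Verhulst factor caps the population, and reproduction is either cloning with fresh mutations or single-point crossover between two parent bit strings.

```python
import random

random.seed(1)

GENOME_BITS = 32      # bit b is a deleterious mutation expressed at age b
DEATH_THRESHOLD = 3   # die once this many expressed mutations accumulate
REPRO_AGE = 8         # minimum age for reproduction
NEW_MUTATIONS = 1     # mutations added to each offspring genome
MAX_POP = 5000        # Verhulst-style population ceiling

def expressed(genome, age):
    """Count deleterious bits already switched on at positions 0..age."""
    return bin(genome & ((1 << (age + 1)) - 1)).count("1")

def mutate(genome):
    for _ in range(NEW_MUTATIONS):
        genome |= 1 << random.randrange(GENOME_BITS)
    return genome

def offspring_crossover(p1, p2):
    cut = random.randrange(1, GENOME_BITS)
    mask = (1 << cut) - 1
    return mutate((p1 & mask) | (p2 & ~mask))

def step(pop, sexual):
    """One time step: aging, deaths (mutational + Verhulst), reproduction."""
    survivors = []
    for genome, age in pop:
        age += 1
        if expressed(genome, age) < DEATH_THRESHOLD and \
           random.random() < 1 - len(pop) / MAX_POP:
            survivors.append((genome, age))
    adults = [g for g, a in survivors if a >= REPRO_AGE]
    births = []
    for g in adults:
        child = offspring_crossover(g, random.choice(adults)) if sexual else mutate(g)
        births.append((child, 0))
    return survivors + births

for mode in (False, True):
    pop = [(0, 0) for _ in range(1000)]        # mutation-free newborns
    for _ in range(300):
        pop = step(pop, sexual=mode)
    print("crossover" if mode else "cloning", "population after 300 steps:", len(pop))
```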

  16. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  17. Computational power and generative capacity of genetic systems.

    Science.gov (United States)

    Igamberdiev, Abir U; Shklovskiy-Kordi, Nikita E

    2016-01-01

    Semiotic characteristics of genetic sequences are based on the general principles of linguistics formulated by Ferdinand de Saussure, such as the arbitrariness of sign and the linear nature of the signifier. Besides these semiotic features that are attributable to the basic structure of the genetic code, the principle of generativity of genetic language is important for understanding biological transformations. The problem of generativity in genetic systems arises from the possibility of different interpretations of genetic texts, and corresponds to what Alexander von Humboldt called "the infinite use of finite means". These interpretations appear in individual development as spatiotemporal sequences of realizations of different textual meanings, as well as the emergence of hyper-textual statements about the text itself, which underlies the process of biological evolution. These interpretations are accomplished at the level of the readout of genetic texts by the structures defined by Efim Liberman as "the molecular computer of cell", which includes DNA, RNA and the corresponding enzymes operating with molecular addresses. The molecular computer performs physically manifested mathematical operations and possesses both reading and writing capacities. Generativity paradoxically resides in the biological computational system as a possibility to incorporate meta-statements about the system, and thus establishes the internal capacity for its evolution.

  18. Genetic Algorithm Modeling with GPU Parallel Computing Technology

    CERN Document Server

    Cavuoti, Stefano; Brescia, Massimo; Pescapé, Antonio; Longo, Giuseppe; Ventre, Giorgio

    2012-01-01

    We present a multi-purpose genetic algorithm, designed and implemented with GPGPU / CUDA parallel computing technology. The model was derived from a multi-core CPU serial implementation, named GAME, already scientifically tested and validated on astrophysical massive data classification problems through a web application resource (DAMEWARE), specialized in data mining based on Machine Learning paradigms. Since genetic algorithms are inherently parallel, the GPGPU computing paradigm has made it possible to exploit the internal training features of the model, permitting strong optimization in terms of processing performance and scalability.

  19. High-Throughput Neuroimaging-Genetics Computational Infrastructure

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2014-04-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate and disseminate novel scientific methods, computational resources and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval and aggregation. Computational processing involves the necessary software, hardware and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical and phenotypic data and meta-data. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer’s and Parkinson’s data, we provide several examples of translational applications using this infrastructure.

  20. High-throughput neuroimaging-genetics computational infrastructure.

    Science.gov (United States)

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D; Franco, Joseph; Toga, Arthur W

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize

  1. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Science in Taiwan, Fujian University of Technology in China and VSB-Technical University of Ostrava. ICGEC 2015 was held from 26-28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city steeped in rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning SHWEDAGON Pagoda is the centerpiece of Yangon city, which itself is famous for its British colonial-era architecture. Of particular interest in many shops of Bogyoke Aung San Market,...

  2. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang is the capital of Jiangxi Province in southeastern China, located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains, and on the east by Poyang Lake, it is famous for its scenery, rich history and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in Southern China. The conference is intended as an international forum for the researchers and professionals in all areas of genetic and evolutionary computing.

  3. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Waseda University in Japan, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world, whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to the Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the res...

  4. An Agent Inspired Reconfigurable Computing Implementation of a Genetic Algorithm

    Science.gov (United States)

    Weir, John M.; Wells, B. Earl

    2003-01-01

    Many software systems have been successfully implemented using an agent paradigm which employs a number of independent entities that communicate with one another to achieve a common goal. The distributed nature of such a paradigm makes it an excellent candidate for use in high-speed reconfigurable computing hardware environments such as those present in modern FPGAs. In this paper, a distributed genetic algorithm that can be applied to the agent-based reconfigurable hardware model is introduced. The effectiveness of this new algorithm is evaluated by comparing the quality of the solutions found by the new algorithm with those found by traditional genetic algorithms. The performance of a reconfigurable hardware implementation of the new algorithm on an FPGA is compared to traditional single-processor implementations.

  5. Computational Genetic Regulatory Networks: Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...

  6. Quantum Genetics in terms of Quantum Reversible Automata and Quantum Computation of Genetic Codes and Reverse Transcription

    CERN Document Server

    Baianu,I C

    2004-01-01

    The concepts of quantum automata and quantum computation are studied in the context of quantum genetics and genetic networks with nonlinear dynamics. In previous publications (Baianu,1971a, b) the formal concept of quantum automaton and quantum computation, respectively, were introduced and their possible implications for genetic processes and metabolic activities in living cells and organisms were considered. This was followed by a report on quantum and abstract, symbolic computation based on the theory of categories, functors and natural transformations (Baianu,1971b; 1977; 1987; 2004; Baianu et al, 2004). The notions of topological semigroup, quantum automaton, or quantum computer, were then suggested with a view to their potential applications to the analogous simulation of biological systems, and especially genetic activities and nonlinear dynamics in genetic networks. Further, detailed studies of nonlinear dynamics in genetic networks were carried out in categories of n-valued, Lukasiewicz Logic Algebra...

  7. Genetic braid optimization: A heuristic approach to compute quasiparticle braids

    Science.gov (United States)

    McDonald, Ross B.; Katzgraber, Helmut G.

    2013-02-01

    In topologically protected quantum computation, quantum gates can be carried out by adiabatically braiding two-dimensional quasiparticles, reminiscent of entangled world lines. Bonesteel [Phys. Rev. Lett. 95, 140503 (2005)], as well as Leijnse and Flensberg [Phys. Rev. B 86, 104511 (2012)], recently provided schemes for computing quantum gates from quasiparticle braids. Mathematically, the problem of executing a gate becomes that of finding a product of the generators (matrices) in that set that approximates the gate best, up to an error. To date, efficient methods to compute these gates only strive to optimize for accuracy. We explore the possibility of using a generic approach applicable to a variety of braiding problems based on evolutionary (genetic) algorithms. The method efficiently finds optimal braids while allowing the user to optimize for the relative utilities of accuracy and/or length. Furthermore, when optimizing for error only, the method can quickly produce efficient braids.
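
    The evolutionary search described above can be illustrated with a small sketch: evolve index sequences whose product of generator matrices approximates a target unitary. In the Python code below the generators are arbitrary SU(2) rotations used as stand-ins (an assumption made purely to keep the example runnable); a real application would substitute the braid-group representation matrices for the quasiparticle model at hand, and the paper additionally lets the user trade braid length against accuracy, which this fixed-length sketch does not.

```python
import numpy as np

rng = np.random.default_rng(0)

def su2_rotation(axis, angle):
    """SU(2) rotation about x, y or z; used here only as stand-in generators."""
    paulis = {"x": np.array([[0, 1], [1, 0]], complex),
              "y": np.array([[0, -1j], [1j, 0]], complex),
              "z": np.array([[1, 0], [0, -1]], complex)}
    return np.cos(angle / 2) * np.eye(2) - 1j * np.sin(angle / 2) * paulis[axis]

# Hypothetical generator set (elementary "braids" and their inverses).
GENS = [su2_rotation("x", np.pi / 5), su2_rotation("x", -np.pi / 5),
        su2_rotation("z", np.pi / 5), su2_rotation("z", -np.pi / 5)]
TARGET = su2_rotation("y", np.pi / 2)          # the gate we want to approximate

def gate_error(seq):
    """Phase-invariant distance between the braid's matrix product and the target."""
    m = np.eye(2, dtype=complex)
    for g in seq:
        m = GENS[g] @ m
    return 1 - abs(np.trace(TARGET.conj().T @ m)) / 2

def evolve(pop_size=60, braid_len=40, generations=300, mut_rate=0.05):
    pop = [list(rng.integers(len(GENS), size=braid_len)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=gate_error)                    # lower error is better
        elite = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(elite)):
            i, j = rng.choice(len(elite), 2, replace=False)
            cut = int(rng.integers(1, braid_len))   # single-point crossover
            child = elite[i][:cut] + elite[j][cut:]
            for k in range(braid_len):              # per-gene mutation
                if rng.random() < mut_rate:
                    child[k] = int(rng.integers(len(GENS)))
            children.append(child)
        pop = elite + children
    return min(pop, key=gate_error)

best = evolve()
print("braid length:", len(best), "gate error:", round(gate_error(best), 5))
```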

  8. Noise reduction in selective computational ghost imaging using genetic algorithm

    Science.gov (United States)

    Zafari, Mohammad; Ahmadi-Kandjani, Sohrab; Kheradmand, Reza

    2017-03-01

    Recently, we presented a selective computational ghost imaging (SCGI) method as an advanced technique for enhancing the security level of encrypted ghost images. In this paper, we propose a modified method to improve the quality of ghost images reconstructed by the SCGI technique. The method is based on background subtraction using a genetic algorithm (GA), which eliminates background noise and gives background-free ghost images. Analyzing the universal image quality index using experimental data proves the advantage of this modification. In particular, the calculated value of the image quality index for modified SCGI over 4225 realizations shows an 11-fold improvement with respect to the SCGI technique, and a 20-fold improvement in comparison to the conventional CGI technique.

  9. Genetic and computational identification of a conserved bacterial metabolic module.

    Directory of Open Access Journals (Sweden)

    Cara C Boutte

    2008-12-01

    We have experimentally and computationally defined a set of genes that form a conserved metabolic module in the alpha-proteobacterium Caulobacter crescentus and used this module to illustrate a schema for the propagation of pathway-level annotation across bacterial genera. Applying comprehensive forward and reverse genetic methods and genome-wide transcriptional analysis, we (1) confirmed the presence of genes involved in catabolism of the abundant environmental sugar myo-inositol, (2) defined an operon encoding an ABC-family myo-inositol transmembrane transporter, and (3) identified a novel myo-inositol regulator protein and cis-acting regulatory motif that control expression of genes in this metabolic module. Despite being encoded from non-contiguous loci on the C. crescentus chromosome, these myo-inositol catabolic enzymes and transporter proteins form a tightly linked functional group in a computationally inferred network of protein associations. Primary sequence comparison was not sufficient to confidently extend annotation of all components of this novel metabolic module to related bacterial genera. Consequently, we implemented the Graemlin multiple-network alignment algorithm to generate cross-species predictions of genes involved in myo-inositol transport and catabolism in other alpha-proteobacteria. Although the chromosomal organization of genes in this functional module varied between species, the upstream regions of genes in this aligned network were enriched for the same palindromic cis-regulatory motif identified experimentally in C. crescentus. Transposon disruption of the operon encoding the computationally predicted ABC myo-inositol transporter of Sinorhizobium meliloti abolished growth on myo-inositol as the sole carbon source, confirming our cross-genera functional prediction. Thus, we have defined regulatory, transport, and catabolic genes and a cis-acting regulatory sequence that form a conserved module required for myo

  10. Genetic algorithm in DNA computing: A solution to the maximal clique problem

    Institute of Scientific and Technical Information of China (English)

    LI Yuan; FANG Chen; OUYANG Qi

    2004-01-01

    The genetic algorithm is one of the possible ways to break the limit of the brute-force method in DNA computing. Using the idea of Darwinian evolution, we introduce a genetic DNA computing algorithm to solve the maximal clique problem. All the operations in the algorithm are accessible with today's molecular biotechnology. Our computer simulations show that with this new computing algorithm, it is possible to get a solution from a very small initial data pool, avoiding enumerating all candidate solutions. For randomly generated problems, the genetic algorithm can give the correct solution within a few cycles with high probability. Although the current speed of a DNA computer is slow compared with silicon computers, our simulation indicates that the number of cycles needed in this genetic algorithm is approximately a linear function of the number of vertices in the network. This may make DNA computers more powerful in attacking some hard computational problems.
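
    For comparison with the DNA implementation described above, the same evolutionary idea can be expressed in a few lines of conventional code: candidate vertex subsets are encoded as bit strings, a greedy repair step turns each subset into a clique, and selection, crossover and mutation search for large cliques. The random graph, parameters and repair heuristic below are illustrative assumptions, not the encoding used in the DNA-computing algorithm itself.

```python
import random

random.seed(1)

# Hypothetical random graph instance.
N_VERTICES, EDGE_PROB = 20, 0.5
EDGES = {(i, j) for i in range(N_VERTICES) for j in range(i + 1, N_VERTICES)
         if random.random() < EDGE_PROB}

def connected(i, j):
    return (min(i, j), max(i, j)) in EDGES

def repair(bits):
    """Greedily drop vertices until the selected set is a clique."""
    clique = []
    for v in range(N_VERTICES):
        if bits[v] and all(connected(v, u) for u in clique):
            clique.append(v)
    return clique

def fitness(bits):
    return len(repair(bits))

def evolve(pop_size=40, generations=100, mut_rate=0.03):
    pop = [[random.randint(0, 1) for _ in range(N_VERTICES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, N_VERTICES)
            child = p1[:cut] + p2[cut:]
            children.append([b ^ (random.random() < mut_rate) for b in child])
        pop = parents + children
    return repair(max(pop, key=fitness))

print("largest clique found:", evolve())
```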

  11. Cardio Vascular Detection with Neuro Computing and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    T. John Peter

    2014-09-01

    For humans the most fundamental requirement is a healthy life, which is becoming more difficult to maintain day to day as technology progresses. Among the possible causes of unnatural death, heart disease accounts for a very significant part. The diagnosis of heart disease is a vital and intricate job. Recognizing heart disease from diverse features or signs is a multi-layered problem that is highly sensitive to the diagnostic tests used, and establishing the relationship among multiple parameters is very difficult. As a result, the decision is not free from false assumptions and is frequently accompanied by impulsive effects. This encourages developing a more reliable and cost-effective knowledge-based algorithmic approach to detect heart disease. From an engineering point of view, this study develops a solution for detecting the presence of heart disease using the concept of artificial intelligence in data mining. A feed-forward neural network architecture is taken as the computational platform to generate the intelligence, in association with the well-established field of genetic algorithms (GA). A comparative performance analysis is presented between both learning concepts for various architecture sizes.

  12. Granularity of Knowledge Computed by Genetic Algorithms Based on Rough Sets Theory

    Institute of Scientific and Technical Information of China (English)

    Wenyuan Yang; Xiaoping Ye; Yong Tang; Pingping Wei

    2006-01-01

    Rough set philosophy hinges on the granularity of data, which is used to build all its basic concepts, like approximations, dependencies, reduction, etc. Genetic algorithms provide a general framework for optimizing solutions for complex systems without depending on the problem domain, and are robust to many kinds of problems. The paper combines genetic algorithms and rough set theory to compute the granularity of knowledge through an example of an information table. The combination enables us to compute the granularity of knowledge effectively. It is also useful for automated computing and information processing.

  13. BengaSaVex: A new computational genetic sequence extraction tool ...

    African Journals Online (AJOL)

    BengaSaVex: A new computational genetic sequence extraction tool for DNA repeats. ... This research aimed to develop new tools for extracting DNA repeats from the ...

  14. Genetic Algorithm in the Computation of the Camera External Orientation

    Directory of Open Access Journals (Sweden)

    Rudolf Urban

    2012-12-01

    The article addresses the solution of the external orientation of the camera by means of a genetic algorithm, which replaces complicated calculation models using the matrix inverse. The computation requires the knowledge of four control points in the spatial coordinate system and the image coordinate system. The computational procedure is well suited to computer-based solutions thanks to its simplicity.

  15. GENEVIEW and the DNACE data bus: computational tools for analysis, display and exchange of genetic information.

    OpenAIRE

    1986-01-01

    We describe an interactive computational tool, GENEVIEW, that allows the scientist to retrieve, analyze, display and exchange genetic information. The scientist may request a display of information from a GenBank locus, request that a restriction map be computed, stored and superimposed on GenBank information, and interactively view this information. GENEVIEW provides an interface between the GenBank data base and the programs of the Lilly DNA Computing Environment (DNACE). This interface sto...

  16. Parallel Genetic Algorithms with Dynamic Topology using Cluster Computing

    Directory of Open Access Journals (Sweden)

    ADAR, N.

    2016-08-01

    A parallel genetic algorithm (PGA) conducts a distributed meta-heuristic search by employing genetic algorithms on more than one subpopulation simultaneously. PGAs migrate a number of individuals between subpopulations over generations. The layout that facilitates the interactions of the subpopulations is called the topology. Static migration topologies have been widely incorporated into PGAs. In this article, a PGA with a dynamic migration topology (D-PGA) is proposed. D-PGA generates a new migration topology in every epoch based on the average fitness values of the subpopulations. The D-PGA has been tested against ring and fully connected migration topologies in a Beowulf Cluster. The D-PGA has outperformed the ring migration topology with comparable communication cost and has provided competitive or better results than a fully connected migration topology with significantly lower communication cost. PGA convergence behaviors have been analyzed in terms of the diversities within and between subpopulations. Conventional diversity can be considered as the diversity within a subpopulation. A new concept of permeability has been introduced to measure the diversity between subpopulations. It is shown that the success of the proposed D-PGA can be attributed to maintaining a high level of permeability while preserving diversity within subpopulations.
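
    The island-model mechanics described above can be emulated serially in a short sketch. The migration rule used below is an assumption for demonstration only (islands are ranked by total fitness each epoch and each island passes its best individual to the island ranked immediately above it); the actual topology-construction rule, the permeability measure and the Beowulf-cluster implementation are described in the paper itself.

```python
import random

random.seed(2)

GENOME_LEN, ISLAND_SIZE, N_ISLANDS = 24, 20, 4
EPOCHS, GENS_PER_EPOCH, MUT_RATE = 10, 5, 0.03

def fitness(g):
    return sum(g)                                  # one-max stand-in objective

def one_generation(island):
    """Tournament selection, single-point crossover and bit-flip mutation."""
    def pick():
        return max(random.sample(island, 3), key=fitness)
    nxt = []
    for _ in range(ISLAND_SIZE):
        p1, p2 = pick(), pick()
        cut = random.randrange(1, GENOME_LEN)
        nxt.append([b ^ (random.random() < MUT_RATE) for b in p1[:cut] + p2[cut:]])
    return nxt

islands = [[[random.randint(0, 1) for _ in range(GENOME_LEN)]
            for _ in range(ISLAND_SIZE)] for _ in range(N_ISLANDS)]

for epoch in range(EPOCHS):
    # local evolution on each island (independent; runs in parallel in a real PGA)
    for _ in range(GENS_PER_EPOCH):
        islands = [one_generation(isl) for isl in islands]
    # dynamic topology: rank islands by total fitness, then pass each island's
    # best individual to the island ranked immediately above it (assumed rule)
    order = sorted(range(N_ISLANDS), key=lambda i: sum(map(fitness, islands[i])))
    for lower, upper in zip(order, order[1:]):
        migrant = max(islands[lower], key=fitness)
        worst = min(range(ISLAND_SIZE), key=lambda k: fitness(islands[upper][k]))
        islands[upper][worst] = migrant[:]

best = max((ind for isl in islands for ind in isl), key=fitness)
print("best fitness:", fitness(best), "out of", GENOME_LEN)
```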

  17. Genetic crossovers are predicted accurately by the computed human recombination map.

    Directory of Open Access Journals (Sweden)

    Pavel P Khil

    2010-01-01

    Hotspots of meiotic recombination can change rapidly over time. This instability and the reported high level of inter-individual variation in meiotic recombination call into question the accuracy of the calculated hotspot map, which is based on the summation of past genetic crossovers. To estimate the accuracy of the computed recombination rate map, we have mapped genetic crossovers to a median resolution of 70 Kb in 10 CEPH pedigrees. We then compared the positions of crossovers with the hotspots computed from HapMap data and performed extensive computer simulations to compare the observed distributions of crossovers with the distributions expected from the calculated recombination rate maps. Here we show that a population-averaged hotspot map computed from linkage disequilibrium data predicts present-day genetic crossovers well. We find that computed hotspot maps accurately estimate both the strength and the position of meiotic hotspots. An in-depth examination of crossovers that were not predicted shows that they are preferentially located in regions where hotspots are found in other populations. In summary, we find that by combining several computed population-specific maps we can capture the variation in individual hotspots to generate a hotspot map that can predict almost all present-day genetic crossovers.

  18. Computation of Rolling Stand Parameters by Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    František Ďurovský

    2008-05-01

    A mathematical model of the rolling process is used for cold rolling on tandem mills in metallurgy. The model's goal is to analyse the rolling process according to process data measured on the mill and to obtain immeasurable variables necessary for rolling control and optimal mill pre-set for the next rolled coil. The values obtained by the model are used as references for the superimposed technology controllers (thickness, speed, tension, etc.) as well. Considering the wide steel strip assortment (different initial and final thickness, different hardness) and the fluctuation of tandem mill parameters (change of friction coefficient, work roll abrasion, temperature fluctuation, etc.), the exact analysis of the tandem is complicated. The paper deals with the identification of the friction coefficient on a single rolling mill stand by a genetic algorithm. The mathematical description of the tandem mill stand is based on the modified Bland-Ford model. Results are presented in graphical form.

  19. A genetic algorithm for finding pulse sequences for NMR quantum computing

    CERN Document Server

    Rethinam, M J; Behrman, E C; Steck, J E; Skinner, S R

    2004-01-01

    We present a genetic algorithm for finding a set of pulse sequences, or rotations, for a given quantum logic gate, as implemented by NMR. We demonstrate the utility of the method by showing that shorter sequences than previously published can be found both for a CNOT gate and for the central part of Shor's algorithm (for N=15). Artificial intelligence techniques like the genetic algorithm presented here have an enormous potential for simplifying the implementation of working quantum computers.

  20. Aneesur Rahman Prize for Computational Physics Lecture: Photonic Crystals and Genetic Algorithms: Adventures of a Computational Physicist

    Science.gov (United States)

    Ho, Kai Ming

    2012-02-01

    I will review some of our work on the computation of photonic crystals, focusing on our discovery of the photonic band gap in diamond structures. I will also describe our conception of the cut-and-paste genetic algorithm for structure search in materials discovery and discuss applications of the algorithm, from early studies of atomic cluster geometries to more recent applications to structures of surfaces, interfaces, nanowires, and bulk crystals.

  1. Human-competitive evolution of quantum computing artefacts by Genetic Programming.

    Science.gov (United States)

    Massey, Paul; Clark, John A; Stepney, Susan

    2006-01-01

    We show how Genetic Programming (GP) can be used to evolve useful quantum computing artefacts of increasing sophistication and usefulness: firstly specific quantum circuits, then quantum programs, and finally system-independent quantum algorithms. We conclude the paper by presenting a human-competitive Quantum Fourier Transform (QFT) algorithm evolved by GP.

  2. SAM: The "Search and Match" Computer Program of the Escherichia coli Genetic Stock Center

    Science.gov (United States)

    Bachmann, B. J.; And Others

    1973-01-01

    Describes a computer program used at a genetic stock center to locate particular strains of bacteria. The program can match up to 30 strain descriptions requested by a researcher with the records on file. This particular program can be applied in many fields. (PS)

  3. Computer Simulation of a Microbial Genetics Experiment as a Learning Aid for Undergraduate Teaching.

    Science.gov (United States)

    Day, M. J.; And Others

    1983-01-01

    Reports design of an interactive computer program (FORTRAN) in microbial genetics. The program is divided into three stages: background information, simulation, and data treatment. Results obtained from the simulation allow four genes to be sequenced along the bacterial chromosome. The simulation mimics experimental errors and production of…

  4. GESP: A computer program for modeling genetic effective population size, inbreeding, and divergence in substructured populations.

    Science.gov (United States)

    Olsson, Fredrik; Laikre, Linda; Hössjer, Ola; Ryman, Nils

    2017-03-24

    The genetically effective population size (Ne) is of key importance for quantifying rates of inbreeding and genetic drift, and is often used in conservation management to set targets for genetic viability. The concept was developed for single, isolated populations and the mathematical means for analyzing the expected Ne in complex, subdivided populations have previously not been available. We recently developed such analytical theory and central parts of that work have now been incorporated into a freely available software tool presented here. GESP (Genetic Effective population size, inbreeding, and divergence in Substructured Populations) is R-based and designed to model short and long term patterns of genetic differentiation and effective population size of subdivided populations. The algorithms performed by GESP allow exact computation of global and local inbreeding and eigenvalue effective population size, predictions of genetic divergence among populations (GST) as well as departures from random mating (FIS, FIT) while varying i) subpopulation census and effective size, separately or including trend of the global population size, ii) rate and direction of migration between all pairs of subpopulations, iii) degree of relatedness and divergence among subpopulations, iv) ploidy (haploid or diploid), and v) degree of selfing. Here, we describe GESP and exemplify its use in conservation genetics modeling.
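
    For orientation, the textbook relation that such tools generalize is the single, isolated-population expectation F_t = 1 - (1 - 1/(2*Ne))^t for inbreeding after t generations at effective size Ne. The snippet below computes only this baseline; it is not GESP and does not capture the subdivided-population machinery (migration, GST, FIS/FIT, eigenvalue effective sizes) described above.

```python
def expected_inbreeding(ne, generations):
    """Single, isolated-population expectation: F_t = 1 - (1 - 1/(2*Ne))**t.

    GESP's purpose is to go beyond this idealized case; this helper only
    gives the textbook baseline for comparison.
    """
    return 1.0 - (1.0 - 1.0 / (2.0 * ne)) ** generations

# e.g. a hypothetical conservation target of Ne = 50 tracked over time
for t in (1, 5, 10):
    print(f"Ne=50, t={t}: expected F = {expected_inbreeding(50, t):.4f}")
```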

  5. Application of computational methods in genetic study of inflammatory bowel disease.

    Science.gov (United States)

    Li, Jin; Wei, Zhi; Hakonarson, Hakon

    2016-01-21

    Genetic factors play an important role in the etiology of inflammatory bowel disease (IBD). The launch of genome-wide association studies (GWAS) represents a landmark in the genetic study of human complex disease. Concurrently, computational methods have undergone rapid development during the past few years, which has led to the identification of numerous disease susceptibility loci. IBD is one of the successful examples of GWAS and related analyses. A total of 163 genetic loci and multiple signaling pathways have been identified to be associated with IBD. Pleiotropic effects were found for many of these loci, and risk prediction models were built based on a broad spectrum of genetic variants. Important gene-gene and gene-environment interactions and key contributions of the gut microbiome are being discovered. Here we review the different types of analyses that have been applied to IBD genetic studies, discuss the computational methods for each type of analysis, and summarize the discoveries made in IBD research with the application of these methods.

  6. Cuckoo Genetic Optimization Algorithm for Efficient Job Scheduling with Load Balance in Grid Computing

    Directory of Open Access Journals (Sweden)

    Rachhpal Singh

    2016-08-01

    Grid computing incorporates dispersed resources to work out complex technical, industrial, and business problems, so a capable scheduling method is necessary for achieving the objectives of the grid. The challenges of parallel computing start with matching computing resources to the number and intricacy of jobs, and include starvation, resource shortage, load balancing, and efficiency. This motivates scrutinizing different optimization techniques to complete the tasks without unsafe conditions. Here a Cuckoo Genetic Optimization Algorithm (CGOA) is established, motivated by the cuckoo optimization algorithm (COA) and the genetic algorithm (GA), for task scheduling in a parallel environment (grid computing system). The CGOA is implemented for parallel processing to achieve effective scheduling of multiple tasks with a shorter schedule length and better load balance. Transmission time is evaluated over a number of job sets and is computed with the help of the job-processor relationship. This technique handles the issues well, and the results show that complexity, load balance and resource utilization are finely managed.

  7. Computation of nodal surfaces in fixed-node diffusion Monte Carlo calculations using a genetic algorithm.

    Science.gov (United States)

    Ramilowski, Jordan A; Farrelly, David

    2010-10-21

    The fixed-node diffusion Monte Carlo (DMC) algorithm is a powerful way of computing excited state energies in a remarkably diverse number of contexts in quantum chemistry and physics. The main difficulty in implementing the procedure lies in obtaining a good estimate of the nodal surface of the excited state in question. Although the nodal surface can sometimes be obtained from symmetry or by making approximations this is not always the case. In any event, nodal surfaces are usually obtained in an ad hoc way. In fact, the search for nodal surfaces can be formulated as an optimization problem within the DMC procedure itself. Here we investigate the use of a genetic algorithm to systematically and automatically compute nodal surfaces. Application is made to the computation of excited states of the HCN-(4)He complex and to the computation of tunneling splittings in the hydrogen bonded HCl-HCl complex.

  8. A Resource Scheduling Strategy in Cloud Computing Based on Multi-agent Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Wuxue Jiang

    2013-11-01

    Resource scheduling strategies in cloud computing are used either to improve system operating efficiency or to improve user satisfaction. This paper presents an integrated scheduling strategy considering both resource credibility and user satisfaction. It takes user satisfaction as the objective function and resource credibility as a part of user satisfaction, and realizes optimal scheduling by using a genetic algorithm. We subsequently integrate this scheduling strategy into an Agent and propose a cloud computing system architecture based on Multi-agent. The numerical results show that this scheduling strategy improves not only the system operating efficiency but also the user satisfaction.

  9. OPTIMIZATION DESIGN OF HYDRAULIC MANIFOLD BLOCKS BASED ON HUMAN-COMPUTER COOPERATIVE GENETIC ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Feng Yi; Li Li; Tian Shujun

    2003-01-01

    Optimization design of hydraulic manifold blocks (HMB) is studied as a complex solid spatial layout problem. Based on comprehensive research into the structural features and design rules of HMB, an optimal mathematical model for this problem is presented. Using a human-computer cooperative genetic algorithm (GA) and its hybrid optimization strategies, integrated layout and connection design schemes of HMB can be automatically optimized. An example is given to verify the approach.

  10. Effective diagnosis of genetic disease by computational phenotype analysis of the disease-associated genome

    OpenAIRE

    Zemojtel, T.; Koehler, S; Mackenroth, L; Jaeger, M.; Hecht, J.; Krawitz, P.; Graul-Neumann, L; Doelken, S.; Ehmke, N.; Spielmann, M.; Oien, N.C.; Schweiger, M R; Krueger, U; Frommer, G.; Fischer, B.

    2014-01-01

    Less than half of patients with suspected genetic disease receive a molecular diagnosis. We have therefore integrated next-generation sequencing (NGS), bioinformatics, and clinical data into an effective diagnostic workflow. We used variants in the 2741 established Mendelian disease genes [the disease-associated genome (DAG)] to develop a targeted enrichment DAG panel (7.1 Mb), which achieves a coverage of 20-fold or better for 98% of bases. Furthermore, we established a computational method ...

  11. Power-Aware Resource Reconfiguration Using Genetic Algorithm in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Li Deng

    2016-01-01

    Cloud computing enables scalable computation based on virtualization technology. However, current resource reallocation solutions seldom consider the stability of the virtual machine (VM) placement pattern. Varied application workloads would lead to frequent resource reconfiguration requirements due to the repeated appearance of hot nodes. In this paper, several algorithms for VM placement (multiobjective genetic algorithm (MOGA), power-aware multiobjective genetic algorithm (pMOGA), and enhanced power-aware multiobjective genetic algorithm (EpMOGA)) are presented to improve the stability of the VM placement pattern with less migration overhead. The energy consumption is also considered. A type-matching controller is designed to improve the evolution process. Nondominated sorting genetic algorithm II (NSGAII) is used to select new generations during the evolution process. Our simulation results demonstrate that these algorithms all provide resource reallocation solutions with long stabilization times of nodes. pMOGA and EpMOGA also better balance the relationship between stabilization and energy efficiency by adding the number of active nodes as one of the optimization objectives. The type-matching controller makes EpMOGA superior to pMOGA.

  12. The system of molecular-genetic triggers as self-organizing computing system

    Directory of Open Access Journals (Sweden)

    A. Profir

    2001-05-01

    In this paper it is shown that the system of molecular-genetic triggers can solve the SAT problem. The molecular-genetic trigger represents a self-organizing structure and has attractors. The signal from one attractor is transmitted to another attractor, from the first level to the second level of the system. Molecular-genetic triggers work separately. The system of molecular-genetic triggers represents an example of a parallel computing system. Suppose that the system can receive two types of signals. In the first case, the system switches with the help of signals of a molecular nature (concentrations of activators x1, x2, x3, x4). In the second case, signals of a wave nature at a resonant frequency can be utilized. It is possible to show that the molecular-genetic system can recognize images encoded by 2-dimensional vectors. Thus, the cells can be considered as a parallel self-organizing system producing, receiving and transmitting information.

  13. Computer-aided identification of polymorphism sets diagnostic for groups of bacterial and viral genetic variants

    Directory of Open Access Journals (Sweden)

    Huygens Flavia

    2007-08-01

    Background: Single nucleotide polymorphisms (SNPs) and genes that exhibit presence/absence variation have provided informative marker sets for bacterial and viral genotyping. Identification of marker sets optimised for these purposes has been based on maximal generalized discriminatory power as measured by Simpson's Index of Diversity, or on the ability to identify specific variants. Here we describe the Not-N algorithm, which is designed to identify small sets of genetic markers diagnostic for user-specified subsets of known genetic variants. The algorithm does not treat the user-specified subset and the remaining genetic variants equally. Rather, Not-N analysis is designed to underpin assays that provide 0% false negatives, which is very important for, e.g., diagnostic procedures for clinically significant subgroups within microbial species. Results: The Not-N algorithm has been incorporated into the "Minimum SNPs" computer program and used to derive genetic markers diagnostic for multilocus sequence typing-defined clonal complexes, hepatitis C virus (HCV) subtypes, and phylogenetic clades defined by comparative genome hybridization (CGH) data for Campylobacter jejuni, Yersinia enterocolitica and Clostridium difficile. Conclusion: Not-N analysis is effective for identifying small sets of genetic markers diagnostic for microbial sub-groups. The best results to date have been obtained with CGH data from several bacterial species, and HCV sequence data.
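
    As a rough illustration of the selection problem the Not-N algorithm addresses (the sketch below is not the published algorithm; the toy data, the matching rule and the greedy criterion are all assumptions), one can greedily pick marker positions so that every member of a user-specified target subset still matches, guaranteeing 0% false negatives by construction, while as many non-target variants as possible are excluded.

```python
# Toy data: one SNP-state string per variant at the same loci (hypothetical).
VARIANTS = {
    "st1": "AGGTA", "st2": "AGCTA", "st3": "AGCTG",   # user-specified target subset
    "st4": "TGCAA", "st5": "ACGTA", "st6": "TCGAG",
    "st7": "AGGAA", "st8": "TCCTG",
}
TARGET = {"st1", "st2", "st3"}

def matches(profile, positions, allowed):
    """A variant matches if, at every chosen position, its allele occurs in the target group."""
    return all(profile[p] in allowed[p] for p in positions)

def greedy_diagnostic_markers(variants, target):
    n_loci = len(next(iter(variants.values())))
    allowed = [{variants[t][p] for t in target} for p in range(n_loci)]
    chosen = []
    remaining = {v for v in variants if v not in target}   # non-targets still matching
    while remaining:
        # add the position that newly excludes the most non-target variants
        best = max(range(n_loci),
                   key=lambda p: sum(not matches(variants[v], chosen + [p], allowed)
                                     for v in remaining))
        excluded = {v for v in remaining
                    if not matches(variants[v], chosen + [best], allowed)}
        if not excluded:
            break      # remaining non-targets cannot be separated from the target set
        chosen.append(best)
        remaining -= excluded
    return chosen

print("diagnostic marker positions (0-based):", greedy_diagnostic_markers(VARIANTS, TARGET))
```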

  14. Identifying human disease genes: advances in molecular genetics and computational approaches.

    Science.gov (United States)

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-07-04

    The human genome project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite the impressive advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies used to identify DNA variations have been replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, which can be pedigree-dependent or -independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation for many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms, are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, whereas structural effects can be assessed using methods to predict stability changes in proteins using sequence and/or structural information.

  15. Computational fluid dynamics based bulbous bow optimization using a genetic algorithm

    Science.gov (United States)

    Mahmood, Shahid; Huang, Debo

    2012-09-01

    Computational fluid dynamics (CFD) plays a major role in predicting the flow behavior of a ship. With the development of fast computers and robust CFD software, CFD has become an important tool for designers and engineers in the ship industry. In this paper, the hull form of a ship was optimized for total resistance using CFD as a calculation tool and a genetic algorithm as an optimization tool. CFD based optimization consists of major steps involving automatic generation of geometry based on design parameters, automatic generation of mesh, automatic analysis of fluid flow to calculate the required objective/cost function, and finally an optimization tool to evaluate the cost for optimization. In this paper, integration of a genetic algorithm program, written in MATLAB, was carried out with the geometry and meshing software GAMBIT and CFD analysis software FLUENT. Different geometries of additive bulbous bow were incorporated in the original hull based on design parameters. These design variables were optimized to achieve a minimum cost function of "total resistance". Integration of a genetic algorithm with CFD tools proves to be effective for hull form optimization.
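
    The GA-around-CFD workflow summarised in this abstract can be condensed into a short loop. The sketch below is only illustrative: the published work used a MATLAB GA driving GAMBIT and FLUENT, whereas here the whole geometry-mesh-solve chain is collapsed into a placeholder function, and the design-variable names and bounds are invented.

```python
import random

# Minimal sketch of a GA wrapped around a CFD evaluation. The placeholder
# evaluate_total_resistance() stands in for geometry generation, meshing and
# the flow solver; the bulbous-bow parameter names and bounds are assumptions.

BOUNDS = {"length": (0.01, 0.05), "breadth": (0.05, 0.15), "depth": (0.3, 0.6)}

def evaluate_total_resistance(design):
    """Placeholder for: generate geometry -> mesh -> run solver -> return
    total resistance. Replaced here by a smooth dummy cost function."""
    return sum((v - 0.5 * (lo + hi)) ** 2
               for (lo, hi), v in zip(BOUNDS.values(), design.values()))

def random_design():
    return {k: random.uniform(*b) for k, b in BOUNDS.items()}

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(d, rate=0.2):
    return {k: (random.uniform(*BOUNDS[k]) if random.random() < rate else v)
            for k, v in d.items()}

def optimise(pop_size=20, generations=30):
    pop = [random_design() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate_total_resistance)   # lower resistance is better
        parents = pop[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=evaluate_total_resistance)

print(optimise())
```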

  16. Computational Fluid Dynamics Based Bulbous Bow Optimization Using a Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    Shahid Mahmood; Debo Huang

    2012-01-01

    Computational fluid dynamics (CFD) plays a major role in predicting the flow behavior of a ship. With the development of fast computers and robust CFD software, CFD has become an important tool for designers and engineers in the ship industry. In this paper, the hull form of a ship was optimized for total resistance using CFD as a calculation tool and a genetic algorithm as an optimization tool. CFD based optimization consists of major steps involving automatic generation of geometry based on design parameters, automatic generation of mesh, automatic analysis of fluid flow to calculate the required objective/cost function, and finally an optimization tool to evaluate the cost for optimization. In this paper, integration of a genetic algorithm program, written in MATLAB, was carried out with the geometry and meshing software GAMBIT and CFD analysis software FLUENT. Different geometries of additive bulbous bow were incorporated in the original hull based on design parameters. These design variables were optimized to achieve a minimum cost function of “total resistance”. Integration of a genetic algorithm with CFD tools proves to be effective for hull form optimization.

  17. wisepair: a computer program for individual matching in genetic tracking studies.

    Science.gov (United States)

    Rothstein, Andrew P; McLaughlin, Ryan; Acevedo-Gutiérrez, Alejandro; Schwarz, Dietmar

    2017-03-01

    Individual-based data sets tracking organisms over space and time are fundamental to answering broad questions in ecology and evolution. A 'permanent' genetic tag circumvents the need to invasively mark or tag animals, especially when there are few phenotypic differences among individuals. However, genetic tracking of individuals does not come without its limits; error rates associated with laboratory work can make it difficult to correctly match genotypes and parse out individuals. In addition, defining a sampling design that effectively matches individuals in the wild can be a challenge for researchers. Here, we combine the two objectives of defining sampling design and reducing genotyping error through an efficient Python-based computer-modelling program, wisepair. We describe the methods used to develop the computer program and assess its effectiveness through three empirical data sets, with and without reference genotypes. Our results show that wisepair outperformed similar genotype matching programs when tested on previously published reference genotype data of diurnal poison frogs (Allobates femoralis) and on without-reference (faecal) genotype sample data sets of harbour seals (Phoca vitulina) and Eurasian otters (Lutra lutra). In addition, due to limited sampling effort in the harbour seal data, we present optimal sampling designs for future projects. wisepair sacrifices little relative to available methods, as it incorporates sample rerun error data, allelic pairwise comparisons and probabilistic simulations to determine matching thresholds. Our program is the lone tool available to researchers to define parameters a priori for genetic tracking studies.
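
    The pairwise-comparison step at the heart of genetic tagging can be illustrated with the sketch below. It is not the actual wisepair implementation: the per-locus allele-pair representation, the mismatch threshold and the toy samples are assumptions chosen to show how a small error budget absorbs genotyping mistakes.

```python
from itertools import combinations

# Illustrative sketch of pairwise genotype matching for genetic tracking
# (not the actual wisepair code). A small number of mismatching loci is
# tolerated to absorb genotyping error; the threshold is a user assumption.

def locus_mismatch(g1, g2):
    """1 if the unordered allele pairs differ at a locus, else 0."""
    return 0 if sorted(g1) == sorted(g2) else 1

def match_samples(samples, max_mismatch=1):
    """samples: dict sample_id -> list of (allele, allele) per locus.
    Returns pairs judged to come from the same individual."""
    matches = []
    for a, b in combinations(samples, 2):
        mism = sum(locus_mismatch(x, y)
                   for x, y in zip(samples[a], samples[b]))
        if mism <= max_mismatch:
            matches.append((a, b, mism))
    return matches

scats = {
    "s1": [(120, 124), (88, 90), (201, 201)],
    "s2": [(120, 124), (88, 90), (201, 203)],   # one apparent genotyping error
    "s3": [(118, 126), (92, 92), (205, 207)],
}
print(match_samples(scats))   # s1 and s2 match within the error budget
```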

  18. A multiparametric computational algorithm for comprehensive assessment of genetic mutations in mucopolysaccharidosis type IIIA (Sanfilippo syndrome).

    Science.gov (United States)

    Ugrinov, Krastyu G; Freed, Stefan D; Thomas, Clayton L; Lee, Shaun W

    2015-01-01

    Mucopolysaccharidosis type IIIA (MPS-IIIA, Sanfilippo syndrome) is a Lysosomal Storage Disease caused by cellular deficiency of N-sulfoglucosamine sulfohydrolase (SGSH). Given the large heterogeneity of genetic mutations responsible for the disease, a comprehensive understanding of the mechanisms by which these mutations affect enzyme function is needed to guide effective therapies. We developed a multiparametric computational algorithm to assess how patient genetic mutations in SGSH affect overall enzyme biogenesis, stability, and function. 107 patient mutations for the SGSH gene were obtained from the Human Gene Mutation Database representing all of the clinical mutations documented for Sanfilippo syndrome. We assessed each mutation individually using ten distinct parameters to give a comprehensive predictive score of the stability and misfolding capacity of the SGSH enzyme resulting from each of these mutations. The predictive score generated by our multiparametric algorithm yielded a standardized quantitative assessment of the severity of a given SGSH genetic mutation toward overall enzyme activity. Application of our algorithm has identified SGSH mutations in which enzymatic malfunction of the gene product is specifically due to impairments in protein folding. These scores provide an assessment of the degree to which a particular mutation could be treated using approaches such as chaperone therapies. Our multiparametric protein biogenesis algorithm advances a key understanding in the overall biochemical mechanism underlying Sanfilippo syndrome. Importantly, the design of our multiparametric algorithm can be tailored to many other diseases of genetic heterogeneity for which protein misfolding phenotypes may constitute a major component of disease manifestation.

  19. Impact of computer-assisted data collection, evaluation and management on the cancer genetic counselor's time providing patient care.

    Science.gov (United States)

    Cohen, Stephanie A; McIlvried, Dawn E

    2011-06-01

    Cancer genetic counseling sessions traditionally encompass collecting medical and family history information, evaluating that information for the likelihood of a genetic predisposition for a hereditary cancer syndrome, conveying that information to the patient, offering genetic testing when appropriate, obtaining consent and subsequently documenting the encounter with a clinic note and pedigree. Software programs exist to collect family and medical history information electronically, intending to improve the efficiency and simplicity of collecting, managing and storing these data. This study compares the genetic counselor's time spent on cancer genetic counseling tasks in a traditional model and in one using computer-assisted data collection, which is then used to generate a pedigree, risk assessment and consult note. Genetic counselor time spent collecting family and medical history and providing face-to-face counseling for a new patient session decreased from an average of 85 min to 69 min when using computer-assisted data collection. However, there was no statistically significant change in overall genetic counselor time across all aspects of the genetic counseling process, due to an increased amount of time spent generating an electronic pedigree and consult note. Improvements in the computer program's technical design would potentially minimize data manipulation. Certain aspects of this program, such as electronic collection of family history and risk assessment, appear effective in improving cancer genetic counseling efficiency while others, such as generating an electronic pedigree and consult note, do not.

  20. Singlet-state creation and universal quantum computation in NMR using a genetic algorithm

    Science.gov (United States)

    Manu, V. S.; Kumar, Anil

    2012-08-01

    The experimental implementation of a quantum algorithm requires the decomposition of unitary operators. Here we treat unitary-operator decomposition as an optimization problem, and use a genetic algorithm—a global-optimization method inspired by nature's evolutionary process—for operator decomposition. We apply this method to NMR quantum information processing, and find a probabilistic way of performing universal quantum computation using global hard pulses. We also demonstrate the efficient creation of the singlet state (a special type of Bell state) directly from thermal equilibrium, using an optimum sequence of pulses.
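
    A toy version of the GA-based decomposition described here is sketched below: a short sequence of hard x/y pulses is evolved so that its overall unitary approaches a target single-qubit gate, with gate fidelity |Tr(U_target† U)|/2 as the fitness. The pulse model, target gate (a Hadamard) and GA settings are assumptions for illustration and are not the published NMR scheme.

```python
import numpy as np

# Toy GA for unitary-operator decomposition: evolve the axes and angles of a
# short sequence of hard pulses so that the product unitary approximates a
# target single-qubit gate. Everything here is an illustrative assumption.

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
HADAMARD = (X + np.array([[1, 0], [0, -1]], dtype=complex)) / np.sqrt(2)

def pulse(axis, theta):
    gen = X if axis == 0 else Y
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * gen

def sequence_unitary(params):              # params: [axis0, theta0, axis1, ...]
    u = np.eye(2, dtype=complex)
    for axis, theta in zip(params[0::2], params[1::2]):
        u = pulse(int(round(axis)) % 2, theta) @ u
    return u

def fitness(params, target=HADAMARD):
    # gate fidelity, insensitive to a global phase
    return abs(np.trace(target.conj().T @ sequence_unitary(params))) / 2

def genetic_search(n_pulses=3, pop=60, gens=200, rng=np.random.default_rng(0)):
    genome_len = 2 * n_pulses
    population = rng.uniform(0, 2 * np.pi, size=(pop, genome_len))
    for _ in range(gens):
        scores = np.array([fitness(p) for p in population])
        parents = population[np.argsort(-scores)][: pop // 2]
        cut = genome_len // 2                       # one-point crossover
        kids = np.concatenate(
            [parents[rng.integers(len(parents), size=pop // 2)][:, :cut],
             parents[rng.integers(len(parents), size=pop // 2)][:, cut:]],
            axis=1)
        kids += rng.normal(0, 0.1, size=kids.shape)  # Gaussian mutation
        population = np.vstack([parents, kids])
    best = max(population, key=fitness)
    return best, fitness(best)

print(genetic_search()[1])   # best fidelity found (1.0 is an exact decomposition)
```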

  1. Singlet state creation and Universal quantum computation in NMR using Genetic Algorithm

    CERN Document Server

    Manu, V S

    2012-01-01

    Experimental implementation of a quantum algorithm requires unitary operator decomposition. Here we treat the unitary operator decomposition as an optimization problem and use Genetic Algorithm, a global optimization method inspired by nature's evolutionary process for operator decomposition. As an application, we apply this to NMR Quantum Information Processing and find a probabilistic way of doing universal quantum computation using global hard pulses. We also demonstrate efficient creation of singlet state (as a special case of Bell state) directly from thermal equilibrium using an optimum sequence of pulses.

  2. Optimization Method for Turbine Airfoil Designing Using Genetic Algorithms, CFD and Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    An optimization method to design turbine airfoils using a Genetic Algorithm (GA) design shell coupled directly with a viscous CFD (Computational Fluid Dynamics) analysis code is proposed in this paper. The blade geometry is parameterized and the optimization method is used to search for a blade geometry that will minimize the loss in the turbine cascade passage. The viscous flow prediction code is verified against experimental data for a cascade typical of a gas turbine rotor blade section. A comparative study of the blades designed by the optimization technique and the original blade is presented.

  3. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    DEFF Research Database (Denmark)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michaël C.;

    2015-01-01

    1. In a recent paper, Bradburd et al. (Evolution, 67, 2013, 3258) proposed a model to quantify the relative effect of geographic and environmental distance on genetic differentiation. Here, we enhance this method in several ways. 2. We modify the covariance model so as to fit better with mainstream geostatistical models and ... method allowing us to deal with genomic data sets (several hundred thousand loci). 3. We also illustrate the potential of the method by re-analysing three data sets, namely harbour porpoises in Europe, coyotes in California and herrings in the Baltic Sea. 4. The computer program developed here is freely ...

  4. Computational thermodynamics, Gaussian processes and genetic algorithms: combined tools to design new alloys

    Science.gov (United States)

    Tancret, F.

    2013-06-01

    A new alloy design procedure is proposed, combining in a single computational tool several modelling and predictive techniques that have already been used and assessed in the field of materials science and alloy design: a genetic algorithm is used to optimize the alloy composition for target properties and performance on the basis of the prediction of mechanical properties (estimated by Gaussian process regression of data on existing alloys) and of microstructural constitution, stability and processability (evaluated by computational thermodynamics). These tools are integrated into a single Matlab programme. An example is given in the case of the design of a new nickel-base superalloy for future power plant applications (such as the ultra-supercritical (USC) coal-fired plant, or the high-temperature gas-cooled nuclear reactor (HTGCR or HTGR)), where the selection criteria include cost, oxidation and creep resistance around 750 °C, long-term stability at service temperature, forgeability, weldability, etc.

  5. Genetics

    Science.gov (United States)

    ... Inheritance; Heterozygous; Inheritance patterns; Heredity and disease; Heritable; Genetic markers ... The chromosomes are made up of strands of genetic information called DNA. Each chromosome contains sections of ...

  6. New Genetics

    Science.gov (United States)


  7. From the genetic to the computer program: the historicity of 'data' and 'computation' in the investigations on the nematode worm C. elegans (1963-1998).

    Science.gov (United States)

    García-Sancho, Miguel

    2012-03-01

    This paper argues that the history of the computer, of the practice of computation and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data, which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers transforming the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology, in-house and easily programmable minicomputers, which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution.

  8. Simulation of alternative genetic control systems for Aedes aegypti in outdoor cages and with a computer

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, C.F.; Lorimer, N.; Rai, K.S.; Suguna, S.G.; Uppal, D.K.; Kazmi, S.J.; Hallinan, E.; Dietz, K.

    1976-06-01

    Cycling populations of A. aegypti of wild origin were established in outdoor cages. Releases were then made for 32 to 43 days of either males carrying chromosome translocations or males of the sex-ratio distorter type. The translocation caused a maximum of 50% sterility, but this declined rapidly after termination of releases. The distorter males depressed the proportion of females among the pupae produced in the cage to a minimum of 35%, and the distortion of sex ratio persisted for 13 weeks following termination of releases. It was possible to simulate the effects of the releases with a computer. Simulations were also made of standard release schedules of three types of genetic material. A strain carrying both sex-ratio distortion and a translocation gave the most effective population suppression.

  9. Improving Computational Efficiency of Model Predictive Control Genetic Algorithms for Real-Time Decision Support

    Science.gov (United States)

    Minsker, B. S.; Zimmer, A. L.; Ostfeld, A.; Schmidt, A.

    2014-12-01

    Enabling real-time decision support, particularly under conditions of uncertainty, requires computationally efficient algorithms that can rapidly generate recommendations. In this paper, a suite of model predictive control (MPC) genetic algorithms are developed and tested offline to explore their value for reducing CSOs during real-time use in a deep-tunnel sewer system. MPC approaches include the micro-GA, the probability-based compact GA, and domain-specific GA methods that reduce the number of decision variable values analyzed within the sewer hydraulic model, thus reducing algorithm search space. Minimum fitness and constraint values achieved by all GA approaches, as well as computational times required to reach the minimum values, are compared with those of GAs using large population sizes and long convergence times. Optimization results for a subset of the Chicago combined sewer system indicate that genetic algorithm variations with coarse decision variable representation, eventually transitioning to the entire range of decision variable values, are most efficient at addressing the CSO control problem. Although diversity-enhancing micro-GAs evaluate a larger search space and exhibit shorter convergence times, these representations do not reach minimum fitness and constraint values. The domain-specific GAs prove to be the most efficient and are used to test CSO sensitivity to energy costs, CSO penalties, and pressurization constraint values. The results show that CSO volumes are highly dependent on the tunnel pressurization constraint, with reductions of 13% to 77% possible with less conservative operational strategies. Because current management practices may not account for varying costs at CSO locations and electricity rate changes in the summer and winter, the sensitivity of the results is evaluated for variable seasonal and diurnal CSO penalty costs and electricity-related system maintenance costs, as well as different sluice gate constraint levels. These findings indicate

  10. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

  11. The genetic architecture of heterochrony as a quantitative trait: lessons from a computational model.

    Science.gov (United States)

    Sun, Lidan; Sang, Mengmeng; Zheng, Chenfei; Wang, Dongyang; Shi, Hexin; Liu, Kaiyue; Guo, Yanfang; Cheng, Tangren; Zhang, Qixiang; Wu, Rongling

    2017-05-30

    Heterochrony is known as a developmental change in the timing or rate of ontogenetic events across phylogenetic lineages. It is a key concept synthesizing development into ecology and evolution to explore the mechanisms of how developmental processes impact on phenotypic novelties. A number of molecular experiments using contrasting organisms in developmental timing have identified specific genes involved in heterochronic variation. Beyond these classic approaches that can only identify single genes or pathways, quantitative models derived from current next-generation sequencing data serve as a more powerful tool to precisely capture heterochronic variation and systematically map a complete set of genes that contribute to heterochronic processes. In this opinion note, we discuss a computational framework of genetic mapping that can characterize heterochronic quantitative trait loci that determine the pattern and process of development. We propose a unifying model that charts the genetic architecture of heterochrony that perceives and responds to environmental perturbations and evolves over geologic time. The new model may potentially enhance our understanding of the adaptive value of heterochrony and its evolutionary origins, providing a useful context for designing new organisms that can best use future resources.

  12. Computer simulation is an undervalued tool for genetic analysis: a historical view and presentation of SHIMSHON--a Web-based genetic simulation package.

    Science.gov (United States)

    Greenberg, David A

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist.

  13. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    Science.gov (United States)

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  14. A comparison of strategies for Markov chain Monte Carlo computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez-Escriche, Noelia; Sorensen, Daniel

    2008-01-01

    In quantitative genetics, Markov chain Monte Carlo (MCMC) methods are indispensable for statistical inference in non-standard models like generalized linear models with genetic random effects or models with genetically structured variance heterogeneity. A particular challenge for MCMC applications...

  15. Coverage planning in computer-assisted ablation based on Genetic Algorithm.

    Science.gov (United States)

    Ren, Hongliang; Guo, Weian; Sam Ge, Shuzhi; Lim, Wancheng

    2014-06-01

    An ablation planning system plays a pivotal role in tumor ablation procedures, as it provides a dry run to guide the surgeons in a complicated anatomical environment. Over-ablation, over-perforation or under-ablation may result in complications during treatment. An optimal solution is desired to have complete tumor coverage with minimal invasiveness, including a minimal number of ablations and a minimal number of perforation trajectories. As the planning of tumor ablation is a multi-objective problem, it is challenging to obtain optimal covering solutions based on the clinician's experience alone. Meanwhile, computer-assisted systems are effective at determining a set of optimal plans. This paper proposes a novel approach of integrating a computational optimization algorithm into the ablation planning system. The proposed ablation planning system is designed based on the following objectives: to achieve complete tumor coverage and to minimize the number of ablations, the number of needle trajectories and over-ablation of healthy tissue. These objectives are taken into account using a Genetic Algorithm, which is capable of generating feasible solutions within a constrained search space. The candidate ablation plans can be encoded in generations of chromosomes, which subsequently evolve based on a fitness function. In this paper, an exponential weight-criterion fitness function has been designed by incorporating constraint parameters reflective of the different objectives. According to the test results, the proposed planner is able to generate the set of optimal solutions for the tumor ablation problem, thereby fulfilling the aforementioned multiple objectives.
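
    One way to read "exponential weight-criterion fitness function" is sketched below. The weights, the voxel bookkeeping and the scaling constants are all assumptions invented for illustration, not the authors' actual formulation; the only point carried over from the abstract is that incomplete tumor coverage is penalised far more heavily than extra ablations, extra needle trajectories or collateral ablation of healthy tissue.

```python
import math

# Sketch of an exponential weight-criterion fitness for ablation planning.
# The candidate-plan summary statistics and all weights are illustrative.

def plan_fitness(covered_tumor_voxels, total_tumor_voxels,
                 ablated_healthy_voxels, n_ablations, n_trajectories,
                 weights=(8.0, 2.0, 1.5, 1.0)):
    w_cov, w_abl, w_traj, w_healthy = weights
    coverage = covered_tumor_voxels / total_tumor_voxels
    # Exponential terms: uncovered tumor is penalised much harder than
    # extra ablations, extra trajectories or collateral ablation.
    return (math.exp(w_cov * coverage)
            - math.exp(w_abl * n_ablations / 10.0)
            - math.exp(w_traj * n_trajectories / 5.0)
            - math.exp(w_healthy * ablated_healthy_voxels / total_tumor_voxels))

# A fully covering plan with a few needles beats an incomplete, sparser plan.
print(plan_fitness(1000, 1000, 120, 4, 2) > plan_fitness(900, 1000, 40, 3, 1))
```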

  16. A genetic-algorithm-based method to find the unitary transformations for any desired quantum computation and application to a one-bit oracle decision problem

    OpenAIRE

    Bang, Jeongho; Yoo, Seokwon

    2014-01-01

    We propose a genetic-algorithm-based method to find the unitary transformations for any desired quantum computation. We formulate a simple genetic algorithm by introducing the "genetic parameter vector" of the unitary transformations to be found. In the genetic algorithm process, all components of the genetic parameter vectors are supposed to evolve to the solution parameters of the unitary transformations. We apply our method to find the optimal unitary transformations and to generalize the ...

  17. Inference of Tumor Evolution during Chemotherapy by Computational Modeling and In Situ Analysis of Genetic and Phenotypic Cellular Diversity

    Directory of Open Access Journals (Sweden)

    Vanessa Almendro

    2014-02-01

    Full Text Available Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and posttreatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

  18. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity.

    Science.gov (United States)

    Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G; Helland, Aslaug; Rye, Inga H; Borresen-Dale, Anne-Lise; Maruyama, Reo; van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

    2014-02-13

    Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and posttreatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

  19. A comparison of strategies for Markov chain Monte Carlo computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez-Escriche, Noelia; Sorensen, Daniel

    2008-01-01

    In quantitative genetics, Markov chain Monte Carlo (MCMC) methods are indispensable for statistical inference in non-standard models like generalized linear models with genetic random effects or models with genetically structured variance heterogeneity. A particular challenge for MCMC applications in quantitative genetics is to obtain efficient updates of the high-dimensional vectors of genetic random effects and the associated covariance parameters. We discuss various strategies to approach this problem including reparameterization, Langevin-Hastings updates, and updates based on normal approximations. The methods are compared in applications to Bayesian inference for three data sets using a model with genetically structured variance heterogeneity...
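
    For readers unfamiliar with the update strategies named in this abstract, the sketch below shows a bare-bones Langevin-Hastings (MALA) update of the vector of genetic random effects in a toy Poisson generalized linear mixed model with a known relationship matrix. The data, relationship matrix, step size and fixed variance components are invented assumptions; this illustrates the type of update discussed, not the authors' implementation or parameterization.

```python
import numpy as np

# Minimal Langevin-Hastings (MALA) update for genetic random effects a in a
# toy Poisson GLMM:  y_i ~ Poisson(exp(mu + a_i)),  a ~ N(0, sigma2 * A),
# with A a known additive relationship matrix (here an invented toy matrix).

rng = np.random.default_rng(1)
n = 50
A = 0.5 * np.eye(n) + 0.5                      # toy relationship matrix
mu, sigma2 = 0.3, 0.4
a_true = rng.multivariate_normal(np.zeros(n), sigma2 * A)
y = rng.poisson(np.exp(mu + a_true))
A_inv = np.linalg.inv(A)

def log_target(a):
    return np.sum(y * (mu + a) - np.exp(mu + a)) - a @ A_inv @ a / (2 * sigma2)

def grad_log_target(a):
    return y - np.exp(mu + a) - A_inv @ a / sigma2

def mala(n_iter=5000, h=0.01):
    a, accepted = np.zeros(n), 0
    for _ in range(n_iter):
        grad = grad_log_target(a)
        prop = a + 0.5 * h * grad + np.sqrt(h) * rng.standard_normal(n)
        grad_p = grad_log_target(prop)
        # log proposal densities q(prop|a) and q(a|prop) for the MH ratio
        log_q_fwd = -np.sum((prop - a - 0.5 * h * grad) ** 2) / (2 * h)
        log_q_bwd = -np.sum((a - prop - 0.5 * h * grad_p) ** 2) / (2 * h)
        log_alpha = log_target(prop) - log_target(a) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            a, accepted = prop, accepted + 1
    return a, accepted / n_iter

sample, acc_rate = mala()
print(f"acceptance rate {acc_rate:.2f}")
```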

  20. Rapid genetic algorithm optimization of a mouse computational model: Benefits for anthropomorphization of neonatal mouse cardiomyocytes

    Directory of Open Access Journals (Sweden)

    Corina Teodora Bot

    2012-11-01

    Full Text Available While the mouse presents an invaluable experimental model organism in biology, its usefulness in cardiac arrhythmia research is limited in some aspects due to major electrophysiological differences between murine and human action potentials (APs. As previously described, these species-specific traits can be partly overcome by application of a cell-type transforming clamp (CTC to anthropomorphize the murine cardiac AP. CTC is a hybrid experimental-computational dynamic clamp technique, in which a computationally calculated time-dependent current is inserted into a cell in real time, to compensate for the differences between sarcolemmal currents of that cell (e.g., murine and the desired species (e.g., human. For effective CTC performance, mismatch between the measured cell and a mathematical model used to mimic the measured AP must be minimal. We have developed a genetic algorithm (GA approach that rapidly tunes a mathematical model to reproduce the AP of the murine cardiac myocyte under study. Compared to a prior implementation that used a template-based model selection approach, we show that GA optimization to a cell-specific model results in a much better recapitulation of the desired AP morphology with CTC. This improvement was more pronounced when anthropomorphizing neonatal mouse cardiomyocytes to human-like APs than to guinea pig APs. CTC may be useful for a wide range of applications, from screening effects of pharmaceutical compounds on ion channel activity, to exploring variations in the mouse or human genome. Rapid GA optimization of a cell-specific mathematical model improves CTC performance and may therefore expand the applicability and usage of the CTC technique.

  1. Neural networks with multiple general neuron models: a hybrid computational intelligence approach using Genetic Programming.

    Science.gov (United States)

    Barton, Alan J; Valdés, Julio J; Orchard, Robert

    2009-01-01

    Classical neural networks are composed of neurons whose nature is determined by a certain function (the neuron model), usually pre-specified. In this paper, a type of neural network (NN-GP) is presented in which: (i) each neuron may have its own neuron model in the form of a general function, (ii) any layout (i.e. network interconnection) is possible, and (iii) no bias nodes or weights are associated with the connections, neurons or layers. The general functions associated with a neuron are learned by searching a function space. They are not provided a priori, but are rather built as part of an Evolutionary Computation process based on Genetic Programming. The resulting network solutions are evaluated based on a fitness measure, which may, for example, be based on classification or regression errors. Two real-world examples are presented to illustrate the promising behaviour on classification problems via construction of a low-dimensional representation of a high-dimensional parameter space associated with the set of all network solutions.

  2. Computational optimization for S-type biological systems: cockroach genetic algorithm.

    Science.gov (United States)

    Wu, Shinq-Jen; Wu, Cheng-Tao

    2013-10-01

    S-type biological systems (S-systems) are demonstrated to be universal approximators of continuous biological systems. S-systems are easily generalized to large systems. The systems are identified through data-driven identification techniques (cluster-based algorithms or computational methods). However, S-system identification is challenging because multiple attractors exist in such highly nonlinear systems. Moreover, in some biological systems the interactive effect cannot be neglected even when the interaction order is small. Therefore, learning should be focused on increasing the gap between the true and redundant interactions. In addition, a wide search space is necessary because no prior information is provided. The technologies used should have the ability to achieve convergence enhancement and diversity preservation. Cockroaches live in nearly all habitats and have survived for more than 300 million years. In this paper, we mimic cockroaches' competitive swarm behavior and integrate it with advanced evolutionary operations. The proposed cockroach genetic algorithm (CGA) possesses a strong food-snatching ability to rush toward a target and a high migration ability to escape from local minima. CGA was tested with three small-scale systems, a twenty-state medium-scale system and a thirty-state large-scale system. A wide search space ([0,100] for rate constants and [-100,100] for kinetic orders) with random or poor initial starts is used to show the high exploration performance.

  3. MLGA: A SAS Macro to Compute Maximum Likelihood Estimators via Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Francisco Juretig

    2015-08-01

    Full Text Available Nonlinear regression is usually implemented in SAS either by using PROC NLIN or PROC NLMIXED. Apart from the model structure, initial values need to be specified for each parameter. And after some convergence criteria are fulfilled, the second order conditions need to be analyzed. But numerical problems are expected to appear in case the likelihood is nearly discontinuous, has plateaus, multiple maxima, or the initial values are distant from the true parameter estimates. The usual solution consists of using a grid, and then choosing the set of parameters reporting the highest log-likelihood. However, if the amount of parameters or grid points is large, the computational burden will be excessive. Furthermore, there is no guarantee that, as the number of grid points increases, an equal or better set of points will be found. Genetic algorithms can overcome these problems by replicating how nature optimizes its processes. The MLGA macro is presented; it solves a maximum likelihood estimation problem under normality through PROC GA, and the resulting values can later be used as the starting values in SAS nonlinear procedures. As will be demonstrated, this macro can avoid the usual trial and error approach that is needed when convergence problems arise. Finally, it will be shown how this macro can deal with complicated restrictions involving multiple parameters.
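
    The "GA for starting values, then local maximum-likelihood refinement" idea behind the macro can be sketched outside SAS as well. The version below uses Python instead of PROC GA and PROC NLIN/NLMIXED, and the logistic-growth model, parameter box and GA settings are all illustrative assumptions rather than anything taken from the macro itself.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch: a crude GA searches a wide parameter box for good starting values
# of a nonlinear Gaussian likelihood; a local optimizer then polishes them,
# mirroring the PROC GA -> PROC NLIN/NLMIXED workflow described above.

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 60)
y = 5.0 / (1.0 + np.exp(-1.2 * (t - 4.0))) + rng.normal(0, 0.1, t.size)

def neg_log_lik(theta):
    asym, rate, mid, log_sigma = theta
    mu = asym / (1.0 + np.exp(-rate * (t - mid)))
    return (0.5 * np.sum(((y - mu) / np.exp(log_sigma)) ** 2)
            + t.size * log_sigma)

BOX = np.array([[0.1, 20.0], [0.01, 5.0], [0.0, 10.0], [-5.0, 2.0]])

def ga_start(pop=40, gens=60):
    population = rng.uniform(BOX[:, 0], BOX[:, 1], size=(pop, 4))
    for _ in range(gens):
        scores = np.array([neg_log_lik(p) for p in population])
        parents = population[np.argsort(scores)][: pop // 2]
        kids = parents[rng.integers(len(parents), size=pop - len(parents))]
        kids = kids + rng.normal(0, 0.1, kids.shape) * (BOX[:, 1] - BOX[:, 0])
        population = np.vstack([parents, np.clip(kids, BOX[:, 0], BOX[:, 1])])
    return min(population, key=neg_log_lik)

start = ga_start()                     # crude, but inside the right basin
fit = minimize(neg_log_lik, start)     # local polish of the GA starting values
print(start.round(2), fit.x.round(2))
```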

  4. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. broadly understood data processing and data storage, and biostatistical calculations. Moreover, in the case of terrorist attacks and mass natural disasters, the ability to identify victims by searching for related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to versions 1.1 and 1.2. There were, however, only slight differences between those versions and the original one. DNAStat version 2.0 was launched in 2007, and the major program improvement was the introduction of group calculation options with potential application to the personal identification of mass disaster and terrorism victims. The latest version, 2.1, has the option of language selection (Polish or English), which will enhance the usage and application of the program also in other countries.

  5. Linking a genetic defect in migraine to spreading depression in a computational model

    Directory of Open Access Journals (Sweden)

    Markus A. Dahlem

    2014-05-01

    Full Text Available Familial hemiplegic migraine (FHM) is a rare subtype of migraine with aura. A mutation causing FHM type 3 (FHM3) has been identified in SCN1A, encoding the Nav1.1 Na+ channel. This genetic defect affects the inactivation gate. While the Na+ tail currents following voltage steps are consistent with both hyperexcitability and hypoexcitability, in this computational study we investigate functional consequences beyond these isolated events. Our extended Hodgkin–Huxley framework establishes a connection between genotype and cellular phenotype, i.e., the pathophysiological dynamics that span multiple time scales and are relevant to migraine with aura. In particular, we investigate the dynamical repertoire from normal spiking (milliseconds) to spreading depression and anoxic depolarization (tens of seconds) and show that FHM3 mutations render gray matter tissue more vulnerable to spreading depression despite opposing effects associated with action potential generation. We conclude that the classification in terms of hypoexcitability vs. hyperexcitability is too simple a scheme. Our mathematical analysis provides further basic insight into previously discussed criticisms of this scheme based on psychophysical and clinical data.

  6. Developmental system at the crossroads of system theory, computer science, and genetic engineering

    CERN Document Server

    Węgrzyn, Stefan; Vidal, Pierre

    1990-01-01

    Many facts were at the origin of the present monograph. The first is the beauty of maple leaves in Quebec forests in Fall. It raised the question: how does nature create and reproduce such beautiful patterns? The second was the reading of A. Lindenmayer's works on L systems. Finally came the discovery of "the secrets of DNA" together with many stimulating exchanges with biologists. Looking at such facts from the viewpoint of recursive numerical systems led to devising a simple model based on six elementary operations organized in a generating word, the analog of the program of a computer and of the genetic code of DNA in the cells of a living organism. It turned out that such a model, despite its simplicity, can account for a great number of properties of living organisms, e.g. their hierarchical structure, their ability to regenerate after a trauma, the possibility of cloning, their sensitivity to mutation, their growth, decay and reproduction. The model lends itself to analysis: the knowledge of the genera...
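
    The "generating word as program" analogy referenced here is easiest to grasp through a Lindenmayer-style rewriting example. The alphabet and production rules below are invented for illustration; they are not the six elementary operations of the monograph's model.

```python
# A tiny Lindenmayer-style rewriting sketch, to make the "generating word as
# program / genetic code" analogy concrete. Rules and alphabet are invented.

RULES = {
    "A": "AB",   # an "A" cell divides into an A and a B
    "B": "A",    # a "B" cell matures into an A
}

def develop(word, generations):
    """Apply the production rules in parallel, one generation at a time."""
    history = [word]
    for _ in range(generations):
        word = "".join(RULES.get(symbol, symbol) for symbol in word)
        history.append(word)
    return history

for gen, organism in enumerate(develop("A", 6)):
    print(gen, organism)        # word lengths follow the Fibonacci numbers
```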

  7. A Computational Study of Genetic Crossover Operators for Multi-Objective Vehicle Routing Problem with Soft Time Windows

    CERN Document Server

    Geiger, Martin Josef

    2008-01-01

    The article describes an investigation of the effectiveness of genetic algorithms for multi-objective combinatorial optimization (MOCO) by presenting an application for the vehicle routing problem with soft time windows. The work is motivated by the question of whether and how the problem structure influences the effectiveness of different configurations of the genetic algorithm. Computational results are presented for different classes of vehicle routing problems, varying in their coverage with time windows, time window size, distribution and number of customers. The results are compared with a simple but effective local search approach for multi-objective combinatorial optimization problems.
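
    As a concrete illustration of the kind of crossover operator such a study compares, below is a simple order-preserving crossover (a variant of the classic OX operator) for permutation-encoded routes. The single-giant-tour encoding is a simplification; a full chromosome for this problem would also carry route delimiters and time-window information.

```python
import random

# A simple order-preserving crossover for permutation-encoded routes: copy a
# slice from one parent, then fill the remaining positions in the order the
# missing customers appear in the other parent. Encoding is a simplification.

def order_crossover(parent_a, parent_b, rng=random):
    n = len(parent_a)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = parent_a[i:j + 1]                 # inherited slice
    fill = [c for c in parent_b if c not in child]     # remaining customers
    positions = [k for k in range(n) if child[k] is None]
    for pos, customer in zip(positions, fill):
        child[pos] = customer
    return child

random.seed(3)
p1 = [1, 2, 3, 4, 5, 6, 7, 8]
p2 = [8, 6, 4, 2, 7, 5, 3, 1]
print(order_crossover(p1, p2))    # a valid permutation mixing both parents
```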

  8. Forecasting the EMU inflation rate: Linear econometric vs. non-linear computational models using genetic neural fuzzy systems

    DEFF Research Database (Denmark)

    Kooths, Stefan; Mitze, Timo Friedel; Ringhut, Eric

    2004-01-01

    This paper compares the predictive power of linear econometric and non-linear computational models for forecasting the inflation rate in the European Monetary Union (EMU). Various models of both types are developed using different monetary and real activity indicators. They are compared according to a battery of parametric and non-parametric test statistics to measure their performance in one- and four-step ahead forecasts of quarterly data. Using genetic-neural fuzzy systems we find the computational approach superior to some degree and show how to combine both techniques successfully.

  9. Computer-Assisted Drug Design: Genetic Algorithms and Structures of Molecular Clusters of Aromatic Hydrocarbons and Actinomycin D-Deoxyguanosine

    Science.gov (United States)

    Xiao, Yong Liang

    Molecular packing, clustering, and docking computations have been performed by empirical intermolecular energy minimization methods. The main focus of this study is finding a robust global search algorithm to solve intermolecular interaction problems, especially applying an efficient algorithm to large-scale complex molecular systems such as drug-DNA binding or site selectivity, which has increasing importance in drug design and drug discovery. Molecular packing in benzene, naphthalene, and anthracene crystals is analyzed in terms of molecular dimer interaction. Intermolecular energies of the gas dimer molecules are calculated for various intermolecular distances and orientations using empirical potential energy functions. The gas dimers are compared to pairs of molecules extracted from the observed crystal structures. Net atomic charges are obtained by the potential-derived method from 6-31G and 6-31G** level ab initio wavefunctions. A new approach using a genetic algorithm is applied to predict structures of benzene, naphthalene, and anthracene molecular clusters. The computer program GAME (genetic algorithm for minimization of energy) has been developed to obtain the global energy minimum of clusters of dimer, trimer, and tetramer molecules. This test model has been further extended to molecular docking applications. Docking calculations of deoxyguanosine molecules to actinomycin D were performed successfully to identify the binding sites of the drug molecule, as revealed by the solved x-ray crystal structure of the actinomycin D-deoxyguanosine complex. The comparison between the evolutionary computing method and conventional local optimization methods concluded that genetic algorithms are very competitive when it comes to complex, large-scale optimization. The full power of genetic algorithms can be unveiled in computer-assisted drug design only when the difficulties of including optimized molecular conformation in the algorithm are overcome. These
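
    The global-search idea behind a program like GAME can be illustrated with a much smaller problem. The sketch below evolves the geometry of a four-atom Lennard-Jones cluster toward its minimum energy; the original work used empirical intermolecular potentials for rigid aromatic molecules, so the single-atom LJ stand-in and all GA settings here are illustrative simplifications.

```python
import numpy as np

# Compact GA that searches for a low-energy geometry of a small Lennard-Jones
# cluster (reduced units), in the spirit of a genetic algorithm for
# minimization of energy. Everything here is an illustrative simplification.

rng = np.random.default_rng(2)
N_ATOMS = 4                               # the LJ4 global minimum is a tetrahedron

def lj_energy(flat_coords):
    x = flat_coords.reshape(N_ATOMS, 3)
    e = 0.0
    for i in range(N_ATOMS):
        for j in range(i + 1, N_ATOMS):
            r = np.linalg.norm(x[i] - x[j])
            e += 4.0 * (r ** -12 - r ** -6)
    return e

def ga_minimize(pop=80, gens=400, box=2.0, sigma=0.05):
    population = rng.uniform(-box, box, size=(pop, 3 * N_ATOMS))
    for _ in range(gens):
        energies = np.array([lj_energy(p) for p in population])
        parents = population[np.argsort(energies)][: pop // 2]
        idx_a = rng.integers(len(parents), size=pop - len(parents))
        idx_b = rng.integers(len(parents), size=pop - len(parents))
        mask = rng.random((pop - len(parents), 3 * N_ATOMS)) < 0.5
        kids = np.where(mask, parents[idx_a], parents[idx_b])   # uniform crossover
        kids = kids + rng.normal(0.0, sigma, kids.shape)        # Gaussian mutation
        population = np.vstack([parents, kids])
    best = min(population, key=lj_energy)
    return best, lj_energy(best)

coords, energy = ga_minimize()
print(f"best LJ4 energy found: {energy:.3f} (exact minimum is -6.000)")
```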

  10. Genetics

    DEFF Research Database (Denmark)

    Christensen, Kaare; McGue, Matt

    2016-01-01

    The sequenced genomes of individuals aged ≥80 years, who were highly educated, self-referred volunteers and with no self-reported chronic diseases were compared to young controls. In these data, healthy ageing is a distinct phenotype from exceptional longevity and genetic factors that protect...

  11. Literary drafts, genetic criticism and computational technology. The Beckett Digital Manuscript Project

    NARCIS (Netherlands)

    Sichani, Anna-Maria

    2017-01-01

    This article addresses the Beckett Digital Manuscript Project, an evolving project, currently comprising a series of digital genetic editions of Samuel Beckett’s bilingual literary drafts and a digital library. Following the genetic school of editing, the project’s goal is to explore and represent

  12. Computers in Biological Education: Simulation Approaches. Genetics and Evolution. CAL Research Group Technical Report No. 13.

    Science.gov (United States)

    Murphy, P. J.

    Three examples of genetics and evolution simulation concerning Mendelian inheritance, genetic mapping, and natural selection are used to illustrate the use of simulations in modeling scientific/natural processes. First described is the HERED series, which illustrates such phenomena as incomplete dominance, multiple alleles, lethal alleles,…

  13. Applied Computational Electromagnetics Society Journal, Special Issue on Genetic Algorithms / Volume 15, Number 2

    OpenAIRE

    Haupt, Randy; Johnson, J. Michael

    2000-01-01

    The Applied Computational Electromagnetics Society Journal hereinafter known as the ACES Journal is devoted to the exchange of information in computational electromagnetics, to the advancement of the state-of-the-art, and to the promotion of related technical activities.

  14. Computing strong metric dimension of some special classes of graphs by genetic algorithms

    Directory of Open Access Journals (Sweden)

    Kratica Jozef

    2008-01-01

    Full Text Available In this paper we consider the NP-hard problem of determining the strong metric dimension of graphs. The problem is solved by a genetic algorithm that uses binary encoding and standard genetic operators adapted to the problem. This represents the first attempt to solve this problem heuristically. We report experimental results for the two special classes of ORLIB test instances: crew scheduling and graph coloring.

  15. Identifying shared genetic structure patterns among Pacific Northwest forest taxa: insights from use of visualization tools and computer simulations.

    Directory of Open Access Journals (Sweden)

    Mark P Miller

    Full Text Available BACKGROUND: Identifying causal relationships in phylogeographic and landscape genetic investigations is notoriously difficult, but can be facilitated by use of multispecies comparisons. METHODOLOGY/PRINCIPAL FINDINGS: We used data visualizations to identify common spatial patterns within single lineages of four taxa inhabiting Pacific Northwest forests (northern spotted owl: Strix occidentalis caurina; red tree vole: Arborimus longicaudus; southern torrent salamander: Rhyacotriton variegatus; and western white pine: Pinus monticola). Visualizations suggested that, despite occupying the same geographical region and habitats, species responded differently to prevailing historical processes. S. o. caurina and P. monticola demonstrated directional patterns of spatial genetic structure where genetic distances and diversity were greater in southern versus northern locales. A. longicaudus and R. variegatus displayed opposite patterns where genetic distances were greater in northern versus southern regions. Statistical analyses of directional patterns subsequently confirmed observations from visualizations. Based upon regional climatological history, we hypothesized that observed latitudinal patterns may have been produced by range expansions. Subsequent computer simulations confirmed that directional patterns can be produced by expansion events. CONCLUSIONS/SIGNIFICANCE: We discuss phylogeographic hypotheses regarding historical processes that may have produced observed patterns. Inferential methods used here may become increasingly powerful as detailed simulations of organisms and historical scenarios become plausible. We further suggest that inter-specific comparisons of historical patterns take place prior to drawing conclusions regarding effects of current anthropogenic change within landscapes.

  16. Development of E-Info geneca: a website providing computer-tailored information and question prompt prior to breast cancer genetic counseling.

    NARCIS (Netherlands)

    Albada, A.; Dulmen, S. van; Otten, R.; Bensing, J.M.; Ausems, M.G.E.M.

    2009-01-01

    This article describes the stepwise development of the website ‘E-info geneca’. The website provides counselees in breast cancer genetic counseling with computer-tailored information and a question prompt prior to their first consultation. Counselees generally do not know what to expect from genetic

  17. Factors influencing QTL mapping accuracy under complicated genetic models by computer simulation.

    Science.gov (United States)

    Su, C F; Wang, W; Gong, S L; Zuo, J H; Li, S J

    2016-12-19

    The accuracy of quantitative trait loci (QTLs) identified using different sample sizes and marker densities was evaluated in different genetic models. Model I assumed one additive QTL; Model II assumed three additive QTLs plus one pair of epistatic QTLs; and Model III assumed two additive QTLs with opposite genetic effects plus two pairs of epistatic QTLs. Recombinant inbred lines (RILs) (50-1500 samples) were simulated according to the Models to study the influence of different sample sizes under different genetic models on QTL mapping accuracy. RILs with 10-100 target chromosome markers were simulated according to Models I and II to evaluate the influence of marker density on QTL mapping accuracy. Different marker densities did not significantly influence accurate estimation of genetic effects with simple additive models, but influenced QTL mapping accuracy in the additive and epistatic models. The optimum marker density was approximately 20 markers when the recombination fraction between two adjacent markers was 0.056 in the additive and epistatic models. A sample size of 150 was sufficient for detecting simple additive QTLs. Thus, a sample size of approximately 450 is needed to detect QTLs with additive and epistatic models. Sample size must be approximately 750 to detect QTLs with additive, epistatic, and combined effects between QTLs. The sample size should be increased to >750 if the genetic models of the data set become more complicated than Model III. Our results provide a theoretical basis for marker-assisted selection breeding and molecular design breeding.

  18. Exploring prospective secondary science teachers' understandings of scientific inquiry and Mendelian genetics concepts using computer simulation

    Science.gov (United States)

    Cakir, Mustafa

    The primary objective of this case study was to examine prospective secondary science teachers' developing understanding of scientific inquiry and Mendelian genetics. A computer simulation of basic Mendelian inheritance processes (Catlab) was used in combination with small-group discussions and other instructional scaffolds to enhance prospective science teachers' understandings. The theoretical background for this research is derived from a social constructivist perspective. Structuring scientific inquiry as investigation to develop explanations presents meaningful context for the enhancement of inquiry abilities and understanding of the science content. The context of the study was a teaching and learning course focused on inquiry and technology. Twelve prospective science teachers participated in this study. Multiple data sources included pre- and post-module questionnaires of participants' view of scientific inquiry, pre-posttests of understandings of Mendelian concepts, inquiry project reports, class presentations, process videotapes of participants interacting with the simulation, and semi-structured interviews. Seven selected prospective science teachers participated in in-depth interviews. Findings suggest that while studying important concepts in science, carefully designed inquiry experiences can help prospective science teachers to develop an understanding about the types of questions scientists in that field ask, the methodological and epistemological issues that constrain their pursuit of answers to those questions, and the ways in which they construct and share their explanations. Key findings included prospective teachers' initial limited abilities to create evidence-based arguments, their hesitancy to include inquiry in their future teaching, and the impact of collaboration on thinking. Prior to this experience the prospective teachers held uninformed views of scientific inquiry. After the module, participants demonstrated extended expertise in

  19. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in the recent years, the time has come for p...

  20. Interactive computer program for learning genetic principles of segregation and independent assortment through meiosis.

    Science.gov (United States)

    Yang, Xiaoli; Ge, Rong; Yang, Yufei; Shen, Hao; Li, Yingjie; Tseng, Charles C

    2009-01-01

    Teaching fundamental principles of genetics such as segregation and independent assortment of genes can be challenging for high school and college biology instructors. Students without a thorough knowledge of meiosis often end up frustrated and failing in genetics courses. Although all textbooks and laboratory manuals have excellent graphic demonstrations and photographs of the meiotic process, students may not always master the concept due to the lack of hands-on exercise. In response to the need for an effective lab exercise for understanding the segregation of allelic genes and the independent assortment of unlinked genes, we developed an interactive program that lets students manually manipulate chromosome models and visualize each major step of meiosis so that these two genetic principles can be thoroughly understood.
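
    The two principles the program targets are easy to state in code. The sketch below is only an illustration (not part of the published program): it simulates gametes from a dihybrid AaBb x AaBb cross, where segregation puts one randomly chosen allele of each gene into every gamete and independent assortment makes that choice independently for the two unlinked genes, so the offspring phenotypes approach the familiar 9:3:3:1 ratio.

```python
import random
from collections import Counter

random.seed(1)

def gamete(parent):
    """Segregation: each gamete receives one randomly chosen allele per gene.
    Independent assortment: the choice is made independently for unlinked genes."""
    return tuple(random.choice(alleles) for alleles in parent)

def phenotype(offspring):
    """A dominant allele (upper case) masks the recessive one for each gene."""
    return tuple("dom" if any(a.isupper() for a in pair) else "rec" for pair in offspring)

# Dihybrid cross AaBb x AaBb for two unlinked genes.
parent = [("A", "a"), ("B", "b")]
counts = Counter()
for _ in range(16000):
    child = list(zip(gamete(parent), gamete(parent)))   # pair the alleles gene by gene
    counts[phenotype(child)] += 1

for pheno, n_obs in counts.most_common():
    print(pheno, round(n_obs / 1000, 2))   # expect roughly 9 : 3 : 3 : 1
```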

  1. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    NARCIS (Netherlands)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michael Christophe; Guillot, Gilles

    2015-01-01

    In a recent paper, Bradburd et al. (2013) proposed a model to quantify the relative effect of geographic and environmental distance on genetic differentiation. Here, we enhance this method in several ways. 1. We modify the covariance model so as to fit better with mainstream geostatistical models and

  2. EvoluZion: A Computer Simulator for Teaching Genetic and Evolutionary Concepts

    Science.gov (United States)

    Zurita, Adolfo R.

    2017-01-01

    EvoluZion is a forward-in-time genetic simulator developed in Java and designed to perform real time simulations on the evolutionary history of virtual organisms. These model organisms harbour a set of 13 genes that codify an equal number of phenotypic features. These genes change randomly during replication, and mutant genes can have null,…

  3. Genetic algorithms and Markov Chain Monte Carlo: Differential Evolution Markov Chain makes Bayesian computing easy

    NARCIS (Netherlands)

    Braak, ter C.J.F.

    2004-01-01

    Differential Evolution (DE) is a simple genetic algorithm for numerical optimization in real parameter spaces. In a statistical context one would not just want the optimum but also its uncertainty. The uncertainty distribution can be obtained by a Bayesian analysis (after specifying prior and likeli
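
    The core update that the abstract alludes to is compact. In Differential Evolution Markov Chain, each chain proposes a jump along the difference of two other randomly chosen chains and accepts it with a Metropolis step. The sketch below is a minimal illustration against a toy 2-D Gaussian posterior; the scaling factor 2.38/sqrt(2d) follows the DE-MC literature, and all other settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(x):
    """Toy target: an isotropic 2-D Gaussian posterior (stand-in for a real model)."""
    return -0.5 * np.sum(x ** 2)

def demc(n_chains=10, d=2, n_iter=5000, eps=1e-4):
    """Differential Evolution Markov Chain: proposal = x_i + gamma*(x_a - x_b) + noise."""
    gamma = 2.38 / np.sqrt(2 * d)                  # recommended scaling from the DE-MC literature
    X = rng.normal(size=(n_chains, d))             # one current state per chain
    logp = np.array([log_post(x) for x in X])
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            a, b = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
            prop = X[i] + gamma * (X[a] - X[b]) + rng.normal(scale=eps, size=d)
            lp = log_post(prop)
            if np.log(rng.random()) < lp - logp[i]:   # Metropolis acceptance
                X[i], logp[i] = prop, lp
        samples.append(X.copy())
    return np.concatenate(samples[n_iter // 2:])      # drop burn-in

draws = demc()
print("posterior mean ~", draws.mean(axis=0), " posterior sd ~", draws.std(axis=0))
```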

  4. Recent developments in computer modeling add ecological realism to landscape genetics

    Science.gov (United States)

    Background / Question / Methods A factor limiting the rate of progress in landscape genetics has been the shortage of spatial models capable of linking life history attributes such as dispersal behavior to complex dynamic landscape features. The recent development of new models...

  5. BengaSaVex: A new computational genetic sequence extraction tool ...

    African Journals Online (AJOL)

    SAM

    2014-05-21

    ... analysis. Computational biology aids this pruning process by providing computerized tools to ... developed and applied to different gene sequences. They ... sequences with potential to form complex secondary structures.

  6. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, performing 2 × 10^5 permutations for a 2D QTL problem in 15 hours using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
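
    The sketch below is not the PruneDIRECT algorithm itself; it only illustrates the permutation-testing idea that makes the computation so demanding: shuffle the phenotypes, redo the genome scan, record the maximum test statistic, and use an upper quantile of that null distribution as the genome-wide significance threshold. The marker data, sample sizes and single-marker statistic are illustrative assumptions; in the paper each permutation involves a full multi-dimensional QTL scan distributed via map-reduce.

```python
import numpy as np

rng = np.random.default_rng(3)

def max_scan_statistic(geno, pheno):
    """Largest squared marker-phenotype correlation across a single-marker genome scan."""
    g = (geno - geno.mean(0)) / geno.std(0)
    p = (pheno - pheno.mean()) / pheno.std()
    r = g.T @ p / len(pheno)
    return np.max(r ** 2)

def permutation_threshold(geno, pheno, n_perm=1000, alpha=0.05):
    """Genome-wide threshold: (1 - alpha)-quantile of the null max-statistic distribution."""
    null = np.array([max_scan_statistic(geno, rng.permutation(pheno)) for _ in range(n_perm)])
    return np.quantile(null, 1 - alpha)

# Toy data: 200 individuals, 500 markers, one real QTL at marker 42.
geno = rng.choice([-1.0, 1.0], size=(200, 500))
pheno = 0.6 * geno[:, 42] + rng.normal(size=200)
print("observed max statistic:", round(max_scan_statistic(geno, pheno), 3))
print("5% permutation threshold:", round(permutation_threshold(geno, pheno), 3))
```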

  7. Computational and genetic reduction of a cell cycle to its simplest, primordial components.

    Directory of Open Access Journals (Sweden)

    Seán M Murray

    2013-12-01

    Full Text Available What are the minimal requirements to sustain an asymmetric cell cycle? Here we use mathematical modelling and forward genetics to reduce an asymmetric cell cycle to its simplest, primordial components. In the Alphaproteobacterium Caulobacter crescentus, cell cycle progression is believed to be controlled by a cyclical genetic circuit comprising four essential master regulators. Unexpectedly, our in silico modelling predicted that one of these regulators, GcrA, is in fact dispensable. We confirmed this experimentally, finding that ΔgcrA cells are viable, but slow-growing and elongated, with the latter mostly due to an insufficiency of a key cell division protein. Furthermore, suppressor analysis showed that another cell cycle regulator, the methyltransferase CcrM, is similarly dispensable with simultaneous gcrA/ccrM disruption ameliorating the cytokinetic and growth defect of ΔgcrA cells. Within the Alphaproteobacteria, gcrA and ccrM are consistently present or absent together, rather than either gene being present alone, suggesting that gcrA/ccrM constitutes an independent, dispensable genetic module. Together our approaches unveil the essential elements of a primordial asymmetric cell cycle that should help illuminate more complex cell cycles.

  8. The application of computer-based tools in obtaining the genetic family history.

    Science.gov (United States)

    Giovanni, Monica A; Murray, Michael F

    2010-07-01

    Family health history is both an adjunct to and a focus of current genetic research, having long been known to be a powerful predictor of individual disease risk. As such, it has been primarily used as a proxy for genetic information. Over the past decade, new roles for family history have emerged, perhaps most importantly as a primary tool for guiding decision-making on the use of expensive genetic testing. The collection of family history information is an important but time-consuming process. Efforts to engage the patient or research subject in preliminary data collection have the potential to improve data accuracy and allow clinicians and researchers more time for analytic tasks. The U.S. Surgeon General, the Centers for Disease Control and Prevention (CDC), and others have developed tools for electronic family history collection. This unit describes the utility of the Web-based My Family Health Portrait (https://familyhistory.hhs.gov) as the prototype for patient-entered family history.

  9. Computational Simulation and Analysis of Mutations: Nucleotide Fixation, Allelic Age and Rare Genetic Variations in Population

    Science.gov (United States)

    Qiu, Shuhao

    2015-01-01

    In order to investigate the complexity of mutations, a computational approach named Genome Evolution by Matrix Algorithms ("GEMA") has been implemented. GEMA models genomic changes, taking into account hundreds of mutations within each individual in a population. By modeling of entire human chromosomes, GEMA precisely mimics real…
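
    GEMA itself models whole chromosomes with matrix algorithms, which is beyond a few lines. The sketch below only illustrates the underlying forward-in-time idea: a new neutral mutation is followed generation by generation under Wright-Fisher resampling until it is lost or fixed, which is the kind of bookkeeping from which fixation probabilities and allelic ages can be estimated. Population size and replicate counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def track_new_mutation(pop_size=1000, max_gen=20000):
    """Introduce a single neutral mutation and follow its copy number each generation
    until it is lost or fixed (Wright-Fisher binomial resampling of 2N gene copies)."""
    copies, n_copies = 1, 2 * pop_size
    for gen in range(1, max_gen + 1):
        copies = rng.binomial(n_copies, copies / n_copies)
        if copies == 0:
            return "lost", gen
        if copies == n_copies:
            return "fixed", gen
    return "segregating", max_gen

outcomes = [track_new_mutation() for _ in range(20000)]
n_fixed = sum(1 for fate, _ in outcomes if fate == "fixed")
fix_ages = [gen for fate, gen in outcomes if fate == "fixed"]
print("fixation probability ~", n_fixed / len(outcomes), "(theory: 1/2N = 0.0005)")
if fix_ages:
    print("mean generations to fixation ~", int(np.mean(fix_ages)), "(theory: ~4N)")
```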

  10. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    DEFF Research Database (Denmark)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michaël C.

    2015-01-01

    that allows users to assess which model (e.g. with or without an environment effect) is most suited, (iv) we extend the program to handle several environmental variables jointly, (v) we code all our MCMC algorithms in a mix of compiled languages which allows us to decrease computing time by at least one order...

  11. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    DEFF Research Database (Denmark)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michaël C.

    2015-01-01

    procedure that allows users to assess which model (e.g. with or without an environment effect) is most suited. We code all our MCMC algorithms in a mix of compiled languages which allows us to decrease computing time by at least one order of magnitude. We propose an approximate inference and model selection...

  12. Computational Simulation and Analysis of Mutations: Nucleotide Fixation, Allelic Age and Rare Genetic Variations in Population

    Science.gov (United States)

    Qiu, Shuhao

    2015-01-01

    In order to investigate the complexity of mutations, a computational approach named Genome Evolution by Matrix Algorithms ("GEMA") has been implemented. GEMA models genomic changes, taking into account hundreds of mutations within each individual in a population. By modeling of entire human chromosomes, GEMA precisely mimics real…

  13. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly refle

  14. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  15. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    Science.gov (United States)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D as if we did not know any a priori information on its trajectory. The method is based on a genetic algorithm coupled to an analytical propagator of the trajectory, which is used over a couple of days and fits a whole set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.

  16. A Robust Computational Technique for Model Order Reduction of Two-Time-Scale Discrete Systems via Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Othman M. K. Alsmadi

    2015-01-01

    Full Text Available A robust computational technique for model order reduction (MOR of multi-time-scale discrete systems (single input single output (SISO and multi-input multioutput (MIMO is presented in this paper. This work is motivated by the singular perturbation of multi-time-scale systems where some specific dynamics may not have significant influence on the overall system behavior. The new approach is proposed using genetic algorithms (GA with the advantage of obtaining a reduced order model, maintaining the exact dominant dynamics in the reduced order, and minimizing the steady state error. The reduction process is performed by obtaining an upper triangular transformed matrix of the system state matrix defined in state space representation along with the elements of B, C, and D matrices. The GA computational procedure is based on maximizing the fitness function corresponding to the response deviation between the full and reduced order models. The proposed computational intelligence MOR method is compared to recently published work on MOR techniques where simulation results show the potential and advantages of the new approach.

  17. A robust computational technique for model order reduction of two-time-scale discrete systems via genetic algorithms.

    Science.gov (United States)

    Alsmadi, Othman M K; Abo-Hammour, Zaer S

    2015-01-01

    A robust computational technique for model order reduction (MOR) of multi-time-scale discrete systems (single input single output (SISO) and multi-input multioutput (MIMO)) is presented in this paper. This work is motivated by the singular perturbation of multi-time-scale systems where some specific dynamics may not have significant influence on the overall system behavior. The new approach is proposed using genetic algorithms (GA) with the advantage of obtaining a reduced order model, maintaining the exact dominant dynamics in the reduced order, and minimizing the steady state error. The reduction process is performed by obtaining an upper triangular transformed matrix of the system state matrix defined in state space representation along with the elements of B, C, and D matrices. The GA computational procedure is based on maximizing the fitness function corresponding to the response deviation between the full and reduced order models. The proposed computational intelligence MOR method is compared to recently published work on MOR techniques where simulation results show the potential and advantages of the new approach.
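
    The published method builds the reduced model from an upper triangular transformation of the state matrix; the sketch below does not reproduce that construction. It only illustrates the fitness idea the abstract describes: a real-coded GA searches for a low-order discrete-time model whose step response deviates as little as possible from that of the full system. The full system, chromosome encoding and GA settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def step_response(A, B, C, D, n=60):
    """Discrete-time step response y_k for x_{k+1} = A x_k + B u_k, y_k = C x_k + D u_k."""
    x = np.zeros(A.shape[0])
    y = []
    for _ in range(n):
        y.append(float(C @ x + D))
        x = A @ x + B.ravel()
    return np.array(y)

# Full 4th-order SISO system (illustrative, stable).
A_full = np.diag([0.9, 0.7, 0.3, 0.1]); B_full = np.ones((4, 1))
C_full = np.array([1.0, 0.5, 0.2, 0.1]); D_full = 0.0
y_full = step_response(A_full, B_full, C_full, D_full)

def decode(chrom):
    """Chromosome -> 2nd-order reduced model; tanh keeps the poles inside the unit circle."""
    a1, a2, b1, b2, c1, c2 = chrom
    return np.diag([np.tanh(a1), np.tanh(a2)]), np.array([[b1], [b2]]), np.array([c1, c2]), 0.0

def fitness(chrom):
    """Negative squared deviation between the full and reduced step responses."""
    return -np.sum((y_full - step_response(*decode(chrom))) ** 2)

pop = rng.normal(size=(60, 6))
for gen in range(200):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-20:]]                      # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        p1, p2 = parents[rng.integers(20, size=2)]
        w = rng.random()
        children.append(w * p1 + (1 - w) * p2 + rng.normal(scale=0.05, size=6))  # blend + mutation
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(c) for c in pop])]
print("best reduced-model step-response error:", round(-fitness(best), 4))
```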

  18. A GENETIC ALGORITHM FOR CONSTRUCTING BROADCAST TREES WITH COST AND DELAY CONSTRAINTS IN COMPUTER NETWORKS

    Directory of Open Access Journals (Sweden)

    Ahmed Y. Hamed

    2015-01-01

    Full Text Available We refer to the problem of constructing broadcast trees with cost and delay constraints in networks as a delay-constrained minimum spanning tree problem in directed networks. It is therefore necessary to determine a spanning tree of minimal cost that connects the source node to all nodes, subject to delay constraints on broadcast routing. In this paper, we propose a genetic algorithm that solves broadcast routing by finding a broadcast tree of minimum cost that satisfies the delay constraints. The algorithm searches for the broadcast routing tree of a given network in terms of its links: it uses the connection matrix of the network to find spanning trees and considers the weights of the links to obtain the minimum spanning tree. The proposed algorithm finds better solutions with fast convergence and high reliability, and its scalability and performance with an increasing number of network nodes are also encouraging.
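
    Whatever the encoding, such a GA needs a way to score a candidate broadcast tree. The sketch below is an illustration, not the paper's implementation: it evaluates a tree encoded as a parent vector by summing the link costs and adding a penalty for every destination whose accumulated delay from the source exceeds the bound. The toy network, bound and penalty are assumptions.

```python
# Candidate broadcast tree encoded as a parent vector: parent[v] is the node that
# forwards the message to v; the source points to itself.  Per-link cost and delay
# values are illustrative.
INF = float("inf")
cost = {(0, 1): 2, (0, 2): 4, (1, 3): 3, (2, 3): 1, (1, 4): 5, (3, 4): 2}
delay = {(0, 1): 1, (0, 2): 2, (1, 3): 2, (2, 3): 1, (1, 4): 4, (3, 4): 1}

def evaluate(parent, source=0, delay_bound=5, penalty=100):
    """Fitness of a broadcast tree: total link cost plus a penalty for every node whose
    accumulated delay from the source violates the bound (lower is better)."""
    total = 0
    for v in parent:
        if v == source:
            continue
        # accumulate delay by walking up the tree towards the source
        d, node = 0, v
        while node != source:
            link = (parent[node], node)
            if link not in cost:
                return INF                      # infeasible tree: link does not exist
            d += delay[link]
            node = parent[node]
        total += cost[(parent[v], v)]
        total += penalty if d > delay_bound else 0
    return total

# Two candidate trees rooted at node 0.
tree_a = {0: 0, 1: 0, 2: 0, 3: 1, 4: 3}
tree_b = {0: 0, 1: 0, 2: 0, 3: 2, 4: 1}
print("tree_a fitness:", evaluate(tree_a))
print("tree_b fitness:", evaluate(tree_b))
```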

  19. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan;

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in the recent years, the time has come...... diagnoses. The final result is a short but robust rule based classification scheme, achieving high degree of classification accuracy (exceeding 90% of accuracy for most classes) in a meaningful and user-friendly representation form for the medical expert. The domain of application analyzed through the paper...... is the well-known Pap-Test problem, corresponding to a numerical database, which consists of 450 medical records, 25 diagnostic attributes and 5 different diagnostic classes. Experimental data are divided in two equal parts for the training and testing phase, and 8 mutually dependent rules for diagnosis...

  20. Optimization of beam angles for intensity modulated radiation therapy treatment planning using genetic algorithm on a distributed computing platform.

    Science.gov (United States)

    Nazareth, Daryl P; Brunner, Stephen; Jones, Matthew D; Malhotra, Harish K; Bakhtiari, Mohammad

    2009-07-01

    Planning intensity modulated radiation therapy (IMRT) treatment involves selection of several angle parameters as well as specification of structures and constraints employed in the optimization process. Including these parameters in the combinatorial search space vastly increases the computational burden, and therefore the parameter selection is normally performed manually by a clinician, based on clinical experience. We have investigated the use of a genetic algorithm (GA) and distributed-computing platform to optimize the gantry angle parameters and provide insight into additional structures, which may be necessary, in the dose optimization process to produce optimal IMRT treatment plans. For an IMRT prostate patient, we produced the first generation of 40 samples, each of five gantry angles, by selecting from a uniform random distribution, subject to certain adjacency and opposition constraints. Dose optimization was performed by distributing the 40-plan workload over several machines running a commercial treatment planning system. A score was assigned to each resulting plan, based on how well it satisfied clinically-relevant constraints. The second generation of 40 samples was produced by combining the highest-scoring samples using techniques of crossover and mutation. The process was repeated until the sixth generation, and the results compared with a clinical (equally-spaced) gantry angle configuration. In the sixth generation, 34 of the 40 GA samples achieved better scores than the clinical plan, with the best plan showing an improvement of 84%. Moreover, the resulting configuration of beam angles tended to cluster toward the patient's sides, indicating where the inclusion of additional structures in the dose optimization process may avoid dose hot spots. Additional parameter selection in IMRT leads to a large-scale computational problem. We have demonstrated that the GA combined with a distributed-computing platform can be applied to optimize gantry angle
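
    In the study, each candidate set of five gantry angles is scored by running a full dose optimization on a commercial planning system distributed over a cluster, which cannot be reproduced here. The sketch below therefore mocks the score with a placeholder function and only illustrates the GA bookkeeping the abstract describes: random five-angle plans subject to a minimum-separation constraint, truncation selection, crossover and mutation over a handful of generations. All numeric settings are illustrative.

```python
import random

random.seed(6)
N_ANGLES, MIN_SEP, POP = 5, 20, 40

def random_plan():
    """Five gantry angles (degrees) respecting a minimum circular angular separation."""
    while True:
        angles = sorted(random.randrange(0, 360, 5) for _ in range(N_ANGLES))
        gaps = [(angles[(i + 1) % N_ANGLES] - angles[i]) % 360 for i in range(N_ANGLES)]
        if min(gaps) >= MIN_SEP:
            return angles

def score(plan):
    """Placeholder for the distributed dose-optimization score: in the real workflow each
    plan is optimized on a treatment planning system and scored on clinical constraints."""
    return -sum(abs(((a - 255) + 180) % 360 - 180) for a in plan)   # toy objective only

def crossover(p1, p2):
    """Combine angles drawn from both parent plans."""
    return sorted(random.sample(p1, 3) + random.sample(p2, 2))

def mutate(plan):
    """Perturb one angle by a small random amount."""
    plan = plan[:]
    i = random.randrange(N_ANGLES)
    plan[i] = (plan[i] + random.choice([-15, -10, 10, 15])) % 360
    return sorted(plan)

population = [random_plan() for _ in range(POP)]
for generation in range(6):                         # six generations, as in the study
    population.sort(key=score, reverse=True)
    parents = population[:10]
    offspring = [mutate(crossover(*random.sample(parents, 2))) for _ in range(POP - 10)]
    population = parents + offspring

best = max(population, key=score)
print("best plan after 6 generations:", best, " score:", score(best))
```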

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. A Quantitative Volumetric Micro-Computed Tomography Method to Analyze Lung Tumors in Genetically Engineered Mouse Models

    Directory of Open Access Journals (Sweden)

    Brian B. Haines

    2009-01-01

    Full Text Available Two genetically engineered, conditional mouse models of lung tumor formation, K-rasLSL-G12D and K-rasLSL-G12D/p53LSL-R270H, are commonly used to model human lung cancer. Developed by Tyler Jacks and colleagues, these models have been invaluable to study in vivo lung cancer initiation and progression in a genetically and physiologically relevant context. However, heterogeneity, multiplicity and complexity of tumor formation in these models make it challenging to monitor tumor growth in vivo and have limited the application of these models in oncology drug discovery. Here, we describe a novel analytical method to quantitatively measure total lung tumor burden in live animals using micro-computed tomography imaging. Applying this methodology, we studied the kinetics of tumor development and response to targeted therapy in vivo in K-ras and K-ras/p53 mice. Consistent with previous reports, lung tumors in both models developed in a time- and dose (Cre recombinase-dependent manner. Furthermore, the compound K-rasLSL-G12D/p53LSL-R270H mice developed tumors faster and more robustly than mice harboring a single K-rasLSL-G12D oncogene, as expected. Erlotinib, a small molecule inhibitor of the epidermal growth factor receptor, significantly inhibited tumor growth in K-rasLSL-G12D/p53LSL-R270H mice. These results demonstrate that this novel imaging technique can be used to monitor both tumor progression and response to treatment and therefore supports a broader application of these genetically engineered mouse models in oncology drug discovery and development.

  3. Computational approaches for the genetic and phenotypic characterization of a Saccharomyces cerevisiae wine yeast collection.

    Science.gov (United States)

    Franco-Duarte, R; Umek, L; Zupan, B; Schuller, D

    2009-12-01

    Within this study, we have used a set of computational techniques to relate the genotypes and phenotypes of natural populations of Saccharomyces cerevisiae, using allelic information from 11 microsatellite loci and results from 24 phenotypic tests. A group of 103 strains was obtained from a larger S. cerevisiae winemaking strain collection by clustering with self-organizing maps. These strains were further characterized regarding their allelic combinations for 11 microsatellites and analysed in phenotypic screens that included taxonomic criteria (carbon and nitrogen assimilation tests, growth at different temperatures) and tests with biotechnological relevance (ethanol resistance, H2S or aromatic precursor formation). Phenotypic variability was rather high and each strain showed a unique phenotypic profile. The results, expressed as optical density (A640) after 22 h of growth, were in agreement with taxonomic data, although with some exceptions, since few strains were capable of consuming arabinose and ribose to a small extent. Based on microsatellite allelic information, a naïve Bayesian classifier correctly assigned (AUC = 0.81, p 0.75). Subgroups were found for strains with low ethanol resistance, growth at 30 °C and growth in media containing galactose, raffinose or urea. The results demonstrate that computational approaches can be used to establish genotype-phenotype relations and to make predictions about a strain's biotechnological potential.
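
    The classification step can be illustrated compactly: a naive Bayes classifier over categorical microsatellite alleles, here with scikit-learn on synthetic stand-in data. The real study used 103 strains typed at 11 loci; the allele frequencies, class labels and accuracy below are purely illustrative.

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic stand-in for 103 strains x 11 microsatellite loci, alleles as strings.
n_strains, n_loci = 103, 11
alleles = np.array([f"a{k}" for k in range(6)])
X = rng.choice(alleles, size=(n_strains, n_loci))
# Make the phenotypic class (e.g. high vs. low ethanol resistance) depend on locus 3,
# with a little label noise on top.
y = np.where(np.isin(X[:, 3], ["a0", "a1"]), "resistant", "sensitive")
y = np.where(rng.random(n_strains) < 0.1, "resistant", y)

X_enc = OrdinalEncoder().fit_transform(X)            # alleles -> integer categories
clf = CategoricalNB(min_categories=len(alleles))     # reserve room for all allele categories
scores = cross_val_score(clf, X_enc, y, cv=5, scoring="accuracy")
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```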

  4. Transformation of personal computers and mobile phones into genetic diagnostic systems.

    Science.gov (United States)

    Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom

    2014-09-16

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.

  5. A genetic-algorithm-based method to find unitary transformations for any desired quantum computation and application to a one-bit oracle decision problem

    Science.gov (United States)

    Bang, Jeongho; Yoo, Seokwon

    2014-12-01

    We propose a genetic-algorithm-based method to find the unitary transformations for any desired quantum computation. We formulate a simple genetic algorithm by introducing the "genetic parameter vector" of the unitary transformations to be found. In the genetic algorithm process, all components of the genetic parameter vectors are supposed to evolve to the solution parameters of the unitary transformations. We apply our method to find the optimal unitary transformations and to generalize the corresponding quantum algorithms for a realistic problem, the one-bit oracle decision problem, or the often-called Deutsch problem. By numerical simulations, we can faithfully find the appropriate unitary transformations to solve the problem by using our method. We analyze the quantum algorithms identified by the found unitary transformations and generalize the variant models of the original Deutsch's algorithm.
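
    The sketch below is not the authors' construction, but it illustrates the "genetic parameter vector" idea on the smallest possible case: a single-qubit unitary is parameterized by three angles, and a GA evolves those angles to maximize the gate fidelity with a fixed target (the Hadamard gate, chosen here only because it is the single-qubit gate applied around the oracle call in Deutsch's algorithm). All GA settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
H_TARGET = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard as the target unitary

def unitary(theta, phi, lam):
    """Generic single-qubit unitary built from a three-component parameter vector."""
    return np.array([
        [np.cos(theta / 2),                    -np.exp(1j * lam) * np.sin(theta / 2)],
        [np.exp(1j * phi) * np.sin(theta / 2),  np.exp(1j * (phi + lam)) * np.cos(theta / 2)],
    ])

def fitness(params):
    """Gate fidelity |Tr(target^dagger U)| / 2; equals 1 only when U matches the target
    up to a global phase."""
    return abs(np.trace(H_TARGET.conj().T @ unitary(*params))) / 2

pop = rng.uniform(0, 2 * np.pi, size=(40, 3))            # population of genetic parameter vectors
for gen in range(150):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]               # keep the fittest vectors
    children = []
    while len(children) < 30:
        p1, p2 = parents[rng.integers(10, size=2)]
        mask = rng.random(3) < 0.5                         # uniform crossover
        child = np.where(mask, p1, p2) + rng.normal(scale=0.1, size=3)   # plus mutation
        children.append(child % (2 * np.pi))
    pop = np.vstack([parents, np.array(children)])

best = max(pop, key=fitness)
print("best parameters:", np.round(best, 3), " fidelity:", round(fitness(best), 4))
```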

  6. A genetic-algorithm-based method to find unitary transformations for any desired quantum computation and application to a one-bit oracle decision problem

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Jeongho [Seoul National University, Seoul (Korea, Republic of); Hanyang University, Seoul (Korea, Republic of); Yoo, Seokwon [Hanyang University, Seoul (Korea, Republic of)

    2014-12-15

    We propose a genetic-algorithm-based method to find the unitary transformations for any desired quantum computation. We formulate a simple genetic algorithm by introducing the 'genetic parameter vector' of the unitary transformations to be found. In the genetic algorithm process, all components of the genetic parameter vectors are supposed to evolve to the solution parameters of the unitary transformations. We apply our method to find the optimal unitary transformations and to generalize the corresponding quantum algorithms for a realistic problem, the one-bit oracle decision problem, or the often-called Deutsch problem. By numerical simulations, we can faithfully find the appropriate unitary transformations to solve the problem by using our method. We analyze the quantum algorithms identified by the found unitary transformations and generalize the variant models of the original Deutsch's algorithm.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact and to address issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  10. Computer aided decision making for heart disease detection using hybrid neural network-Genetic algorithm.

    Science.gov (United States)

    Arabasadi, Zeinab; Alizadehsani, Roohallah; Roshanzamir, Mohamad; Moosaei, Hossein; Yarifard, Ali Asghar

    2017-04-01

    Cardiovascular disease is one of the most rampant causes of death around the world and is deemed a major illness in middle and old age. Coronary artery disease, in particular, is a widespread cardiovascular malady entailing high mortality rates. Angiography is generally regarded as the best method for the diagnosis of coronary artery disease; on the other hand, it is associated with high costs and major side effects. Much research has therefore been conducted using machine learning and data mining to seek alternative modalities. Accordingly, we propose a highly accurate hybrid method for the diagnosis of coronary artery disease. The proposed method increases the performance of a neural network by approximately 10% by enhancing its initial weights with a genetic algorithm that suggests better starting weights for the network. Using this methodology, we achieved accuracy, sensitivity and specificity rates of 93.85%, 97% and 92%, respectively, on the Z-Alizadeh Sani dataset.
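
    A compact way to see the idea of GA-enhanced initial weights is sketched below: a genetic algorithm searches over starting weight vectors for a tiny one-hidden-layer network, scoring each candidate by the validation accuracy reached after a short gradient-descent run from that starting point. The network size, synthetic data and GA settings are illustrative assumptions and bear no relation to the Z-Alizadeh Sani dataset.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic stand-in for a tabular diagnosis dataset (n samples, d features, binary label).
n, d, h = 400, 10, 8
X = rng.normal(size=(n, d))
y = (X[:, 0] * X[:, 1] + 0.5 * X[:, 2] + 0.3 * rng.normal(size=n) > 0).astype(float)
X_tr, y_tr, X_va, y_va = X[:300], y[:300], X[300:], y[300:]

def unpack(w):
    """Flat chromosome -> weights of a one-hidden-layer network."""
    W1 = w[:d * h].reshape(d, h); b1 = w[d * h:d * h + h]
    W2 = w[d * h + h:d * h + 2 * h]; b2 = w[-1]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    Hid = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(Hid @ W2 + b2))), Hid

def train(w, epochs, lr=0.1):
    """Plain batch gradient descent on binary cross-entropy, starting from weights w."""
    w = w.copy()
    for _ in range(epochs):
        W1, b1, W2, b2 = unpack(w)
        P, Hid = forward(w, X_tr)
        dz2 = (P - y_tr) / len(y_tr)
        gW2 = Hid.T @ dz2; gb2 = dz2.sum()
        dz1 = np.outer(dz2, W2) * (1 - Hid ** 2)
        gW1 = X_tr.T @ dz1; gb1 = dz1.sum(axis=0)
        w -= lr * np.concatenate([gW1.ravel(), gb1, gW2, [gb2]])
    return w

def accuracy(w, X, y):
    return float(((forward(w, X)[0] > 0.5) == y).mean())

n_weights = d * h + 2 * h + 1
# GA over *initial* weight vectors: fitness = validation accuracy after a short training run.
pop = rng.normal(scale=0.5, size=(30, n_weights))
for gen in range(10):
    fit = np.array([accuracy(train(w, epochs=30), X_va, y_va) for w in pop])
    parents = pop[np.argsort(fit)[-10:]]
    children = []
    while len(children) < 20:
        p1, p2 = parents[rng.integers(10, size=2)]
        mask = rng.random(n_weights) < 0.5
        children.append(np.where(mask, p1, p2) + rng.normal(scale=0.05, size=n_weights))
    pop = np.vstack([parents, np.array(children)])

best_start = max(pop, key=lambda w: accuracy(train(w, epochs=30), X_va, y_va))
print("random init accuracy :", accuracy(train(rng.normal(scale=0.5, size=n_weights), 300), X_va, y_va))
print("GA-seeded accuracy   :", accuracy(train(best_start, 300), X_va, y_va))
```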

  11. Computing patient data in the cloud: practical and legal considerations for genetics and genomics research in Europe and internationally.

    Science.gov (United States)

    Molnár-Gábor, Fruzsina; Lueck, Rupert; Yakneen, Sergei; Korbel, Jan O

    2017-06-20

    Biomedical research is becoming increasingly large-scale and international. Cloud computing enables the comprehensive integration of genomic and clinical data, and the global sharing and collaborative processing of these data within a flexibly scalable infrastructure. Clouds offer novel research opportunities in genomics, as they facilitate cohort studies to be carried out at unprecedented scale, and they enable computer processing with superior pace and throughput, allowing researchers to address questions that could not be addressed by studies using limited cohorts. A well-developed example of such research is the Pan-Cancer Analysis of Whole Genomes project, which involves the analysis of petabyte-scale genomic datasets from research centers in different locations or countries and different jurisdictions. Aside from the tremendous opportunities, there are also concerns regarding the utilization of clouds; these concerns pertain to perceived limitations in data security and protection, and the need for due consideration of the rights of patient donors and research participants. Furthermore, the increased outsourcing of information technology impedes the ability of researchers to act within the realm of existing local regulations owing to fundamental differences in the understanding of the right to data protection in various legal systems. In this Opinion article, we address the current opportunities and limitations of cloud computing and highlight the responsible use of federated and hybrid clouds that are set up between public and private partners as an adequate solution for genetics and genomics research in Europe, and under certain conditions between Europe and international partners. This approach could represent a sensible middle ground between fragmented individual solutions and a "one-size-fits-all" approach.

  12. Semi-automatic classification of skeletal morphology in genetically altered mice using flat-panel volume computed tomography.

    Directory of Open Access Journals (Sweden)

    Christian Dullin

    2007-07-01

    Full Text Available Rapid progress in exploring the human and mouse genome has resulted in the generation of a multitude of mouse models to study gene functions in their biological context. However, effective screening methods that allow rapid noninvasive phenotyping of transgenic and knockout mice are still lacking. To identify murine models with bone alterations in vivo, we used flat-panel volume computed tomography (fpVCT) for high-resolution 3-D imaging and developed an algorithm with a computational intelligence system. First, we tested the accuracy and reliability of this approach by imaging discoidin domain receptor 2 (DDR2)-deficient mice, which display distinct skull abnormalities as shown by comparative landmark-based analysis. High-contrast fpVCT data of the skull with 200 µm isotropic resolution and 8-s scan time allowed segmentation and computation of significant shape features as well as visualization of morphological differences. The application of a trained artificial neuronal network to these datasets permitted a semi-automatic and highly accurate phenotype classification of DDR2-deficient compared to C57BL/6 wild-type mice. Even heterozygous DDR2 mice with only subtle phenotypic alterations were correctly determined by fpVCT imaging and identified as a new class. In addition, we successfully applied the algorithm to classify knockout mice lacking the DDR1 gene with no apparent skull deformities. Thus, this new method seems to be a potential tool to identify novel mouse phenotypes with skull changes from transgenic and knockout mice on the basis of random mutagenesis as well as from genetic models. However, for this purpose new neuronal networks have to be created and trained. In summary, the combination of fpVCT images with artificial neuronal networks provides a reliable, rapid, cost-effective, and noninvasive primary screening tool to detect skeletal phenotypes in mice.

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  14. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point, the computer is so common we hardly notice it in our view. It is difficult to envision that not so long ago it was a gigantic, room-sized structure accessed by only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  4. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  6. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  9. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. Site Availability Monitoring (SAM) and Job Robot submission have been instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation on the CMS twiki. The visualization, presentation and summarising of SAM tests for sites has recently been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission, and keep exercised, all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  13. MixFit: Methodology for Computing Ancestry-Related Genetic Scores at the Individual Level and Its Application to the Estonian and Finnish Population Studies

    Science.gov (United States)

    Leitsalu, Liis; Fischer, Krista; Nuotio, Marja-Liisa; Esko, Tõnu; Boomsma, Dorothea Irene; Kyvik, Kirsten Ohm; Spector, Tim D.; Perola, Markus; Metspalu, Andres

    2017-01-01

    Ancestry information at the individual level can be a valuable resource for personalized medicine, medical, demographical and history research, as well as for tracing back personal history. We report a new method for quantitatively determining personal genetic ancestry based on genome-wide data. Numerical ancestry component scores are assigned to individuals based on comparisons with reference populations. These comparisons are conducted with an existing analytical pipeline making use of genotype phasing, similarity matrix computation and our addition—multidimensional best fitting by MixFit. The method is demonstrated by studying Estonian and Finnish populations in geographical context. We show the main differences in the genetic composition of these otherwise close European populations and how they have influenced each other. The components of our analytical pipeline are freely available computer programs and scripts one of which was developed in house (available at: www.geenivaramu.ee/en/tools/mixfit). PMID:28107396
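    The MixFit program itself is the one distributed at the URL above; purely as an illustrative sketch of the general idea (not the published pipeline), an individual's coordinates in a reduced genotype space can be expressed as a non-negative mixture of reference-population centroids with a non-negative least-squares fit. All population labels and numbers below are hypothetical.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical reference-population centroids in a reduced genotype space
        # (e.g. the first three axes of a principal component analysis).
        refs = {
            "EST": np.array([0.12, -0.30, 0.05]),
            "FIN": np.array([0.45, 0.10, -0.20]),
            "CEU": np.array([-0.25, 0.22, 0.15]),
        }
        A = np.column_stack([refs[name] for name in refs])  # columns = populations
        b = np.array([0.20, -0.05, -0.02])                  # one individual's coordinates

        weights, residual = nnls(A, b)                      # non-negative mixture weights
        if weights.sum() > 0:
            weights = weights / weights.sum()               # normalise to ancestry proportions

        for name, w in zip(refs, weights):
            print(f"{name}: {w:.2f}")
        print(f"fit residual: {residual:.3f}")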

  14. Genetic similarity of polyploids - A new version of the computer program POPDIST (ver. 1.2.0) considers intraspecific genetic differentiation

    DEFF Research Database (Denmark)

    Tomiuk, Jürgen; Guldbrandtsen, Bernt; Loeschcke, Volker

    2009-01-01

    For evolutionary studies of polyploid species estimates of the genetic identity between species with different degrees of ploidy are particularly required because gene counting in samples of polyploid individuals often cannot be done, e.g., in triploids the phenotype AB can be genotypically eithe...

  15. The perfect neuroimaging-genetics-computation storm: collision of petabytes of data, millions of hardware devices and thousands of software tools.

    Science.gov (United States)

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M; Van Horn, John D; Toga, Arthur W

    2014-06-01

    The volume, diversity and velocity of biomedical data are exponentially increasing providing petabytes of new neuroimaging and genetics data every year. At the same time, tens-of-thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data.

  16. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Contents (abridged): Introduction and Biological Background; Biological Computation; The Influence of Biology on Mathematics - Historical Examples; Biological Introduction; Models and Simulations. Cellular Automata: Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code. Evolutionary Computation: Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet...

  17. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

    NARCIS (Netherlands)

    Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G; Helland, Aslaug; Rye, Inga H; Borresen-Dale, Anne-Lise; Maruyama, Reo; van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

    2014-01-01

    Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-

  18. An Examination of Problem-Based Teaching and Learning in Population Genetics and Evolution Using EVOLVE, a Computer Simulation.

    Science.gov (United States)

    Soderberg, Patti; Price, Frank

    2003-01-01

    Examines a lesson in which students are engaged in inquiry in evolutionary biology to develop better understanding of concepts and reasoning skills necessary to support knowledge claims about changes in the genetic structure of populations known as microevolution. Explains how a software simulation, EVOLVE, can be used to foster discussions about…

  19. Computational discovery of pathway-level genetic vulnerabilities in non-small-cell lung cancer | Office of Cancer Genomics

    Science.gov (United States)

    Novel approaches are needed for discovery of targeted therapies for non-small-cell lung cancer (NSCLC) that are specific to certain patients. Whole genome RNAi screening of lung cancer cell lines provides an ideal source for determining candidate drug targets. Unsupervised learning algorithms uncovered patterns of differential vulnerability across lung cancer cell lines to loss of functionally related genes. Such genetic vulnerabilities represent candidate targets for therapy and are found to be involved in splicing, translation and protein folding.

  20. Software For Genetic Algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.

  1. Applying genetic algorithms in a parallel computing environment for optimising parameters of complex cellular automata models: the case of SCIDDICA S3hex

    Science.gov (United States)

    D'Ambrosio, D.; Iovine, G.

    2003-04-01

    Cellular Automata (CA) offer a valid alternative to the classic approach based on partial differential equations for simulating complex phenomena, when the latter can be described in terms of local interactions among their constituent parts. SCIDDICA S3hex is a two-dimensional hexagonal CA model developed for simulating debris flows: it has recently been applied to several real cases of landslides that occurred in Campania (Southern Italy). The release S3hex was derived by progressively improving an initial simplified CA model, originally designed for simulating simple cases of flow-type landslides. The model requires information on topography, the thickness of erodible regolith overlying the bedrock, and the location and extent of landslide sources. Performance depends on a set of global parameters used in the transition function of the model: their values affect the elementary processes of the transition function and thus the overall results. A fine calibration is therefore an essential phase in evaluating the reliability of the model for subsequent applications to debris-flow susceptibility zonation. The complexity of both the model and the phenomena to be simulated suggested employing an automated evaluation technique to determine the best set of global parameters. Genetic Algorithms (GA) are a powerful optimization tool inspired by natural selection. In recent decades, in spite of their intrinsic simplicity, they have been successfully applied to a wide range of highly complex problems. The calibration of the model could therefore be performed through such an optimisation technique, by considering several real cases of study. Owing to the large number of simulations generally needed for GA experiments on complex phenomena, which imply long-lasting tests on sequential computational architectures, the adoption of a parallel computational environment seemed appropriate: the original source code
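    As a rough, hypothetical sketch of the parallel calibration strategy described above (the real fitness evaluation runs a full SCIDDICA debris-flow simulation and compares it with the mapped event; here a cheap stand-in function is used), the expensive fitness evaluations of a genetic algorithm can be distributed over several cores:

        import random
        from multiprocessing import Pool

        N_PARAMS = 4      # stand-in for the CA global parameters being calibrated
        POP_SIZE = 20

        def fitness(params):
            # Placeholder for an expensive simulation run: in the real setting this
            # would launch the CA model and score simulated vs. observed flow areas.
            return -sum((p - 0.5) ** 2 for p in params)

        def random_individual():
            return [random.random() for _ in range(N_PARAMS)]

        def mutate(ind, rate=0.2):
            return [min(1.0, max(0.0, p + random.gauss(0, 0.1)))
                    if random.random() < rate else p for p in ind]

        if __name__ == "__main__":
            population = [random_individual() for _ in range(POP_SIZE)]
            with Pool() as pool:
                for _ in range(30):
                    scores = pool.map(fitness, population)       # evaluated in parallel
                    ranked = [ind for _, ind in sorted(zip(scores, population), reverse=True)]
                    parents = ranked[:POP_SIZE // 2]
                    population = parents + [mutate(random.choice(parents))
                                            for _ in range(POP_SIZE - len(parents))]
            best = max(population, key=fitness)
            print("best parameter set:", [round(p, 3) for p in best])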

  2. A genetic engineering approach to genetic algorithms.

    Science.gov (United States)

    Gero, J S; Kazakov, V

    2001-01-01

    We present an extension to the standard genetic algorithm (GA), which is based on concepts of genetic engineering. The motivation is to discover useful and harmful genetic materials and then execute an evolutionary process in such a way that the population becomes increasingly composed of useful genetic material and increasingly free of the harmful genetic material. Compared to the standard GA, it provides some computational advantages as well as a tool for automatic generation of hierarchical genetic representations specifically tailored to suit certain classes of problems.

  3. On the way to building an integrated computational environment for the study of developmental patterns and genetic diseases.

    Science.gov (United States)

    Turinsky, Andrei L; Sensen, Christoph W

    2006-01-01

    Genetic diseases and developmental patterns should be studied on several levels: from macroscale (organs and tissues) to nanoscale (cells, genes, proteins). Due to the overwhelming complexity of the life science data, it is common that disparate data pieces are meticulously stored but never fully analyzed or correlated. We have begun to develop a novel methodology based on virtual reality techniques for the study of these phenomena. Our key approach to knowledge integration is a top-down mapping of data onto visual contexts. For each organism that we want to study, a structural model is created and used as a visual "wireframe" onto which other data types are superimposed in a top-down assembly. Data analysis tools, visual controls, and queries are enabled so that users can interactively explore data. Our visualization technology gives users an opportunity to map disparate data onto a common model, and search visually for hitherto unknown patterns and correlations contained within the data. It is our goal to eventually transform genomics research from measuring various data pieces analytically into a fully interactive exploration of combined data in a 4D immersive visual environment that best matches a researcher's intuition.

  4. Computational design of synthetic regulatory networks from a genetic library to characterize the designability of dynamical behaviors.

    Science.gov (United States)

    Rodrigo, Guillermo; Carrera, Javier; Jaramillo, Alfonso

    2011-11-01

    The engineering of synthetic gene networks has mostly relied on the assembly of a few characterized regulatory elements using rational design principles. It is of utmost importance to analyze the scalability and limits of such a design workflow. To analyze the design capabilities of libraries of regulatory elements, we have developed the first automated design approach that combines such elements to search the genotype space associated with a given phenotypic behavior. Herein, we calculated the designability of dynamical functions obtained from circuits assembled with a given genetic library. By designing circuits working as amplitude filters, pulse counters and oscillators, we could infer new mechanisms for such behaviors. We also highlighted the hierarchical design and the optimization of the interface between devices. We dissected the functional diversity of a constrained library and we found that even such libraries can provide a rich variety of behaviors. We also found that intrinsic noise slightly reduces the designability of digital circuits, but it increases the designability of oscillators. Finally, we analyzed robust design as a strategy to counteract the evolvability and noise in gene expression of the engineered circuits within a cellular background, obtaining mechanisms for robustness through non-linear negative feedback loops.

  5. Genetic parameters between slaughter pig efficiency and growth rate of different body tissues estimated by computed tomography in live boars of Landrace and Duroc.

    Science.gov (United States)

    Gjerlaug-Enger, E; Kongsro, J; Odegård, J; Aass, L; Vangen, O

    2012-01-01

    In this study, computed tomography (CT) technology was used to measure body composition on live pigs for breeding purposes. Norwegian Landrace (L; n = 3835) and Duroc (D; n = 3139) boars, selection candidates to be elite boars in a breeding programme, were CT-scanned between August 2008 and August 2010 as part of an ongoing testing programme at Norsvin's boar test station. Genetic parameters in the growth rate of muscle (MG), carcass fat (FG), bone (BG) and non-carcass tissue (NCG), from birth to ∼100 kg live weight, were calculated from CT data. Genetic correlations between growth of different body tissues scanned using CT, lean meat percentage (LMP) calculated from CT and more traditional production traits such as the average daily gain (ADG) from birth to 25 kg (ADG1), the ADG from 25 kg to 100 kg (ADG2) and the feed conversion ratio (FCR) from 25 kg to 100 kg were also estimated from data on the same boars. Genetic parameters were estimated based on multi-trait animal models using the average information-restricted maximum likelihood (AI-REML) methodology. The heritability estimates (s.e. = 0.04 to 0.05) for the various traits for Landrace and Duroc were as follows: MG (0.19 and 0.43), FG (0.53 and 0.59), BG (0.37 and 0.58), NCG (0.38 and 0.50), LMP (0.50 and 0.57), ADG1 (0.25 and 0.48), ADG2 (0.41 and 0.42) and FCR (0.29 and 0.42). Genetic correlations for MG with LMP were 0.55 and 0.68, and genetic correlations between MG and ADG2 were -0.06 and 0.07 for Landrace and Duroc, respectively. LMP and ADG2 were clearly unfavourably genetically correlated (L: -0.75 and D: -0.54). These results showed the difficulty in jointly improving LMP and ADG2. ADG2 was unfavourably correlated with FG (L: 0.84 and D: 0.72), thus indicating to a large extent that selection for increased growth implies selection for fatness under an ad libitum feeding regime. Selection for MG is not expected to increase ADG2, but will yield faster growth of the desired tissues and a better

  6. The Binding Interface between Human APOBEC3F and HIV-1 Vif Elucidated by Genetic and Computational Approaches.

    Science.gov (United States)

    Richards, Christopher; Albin, John S; Demir, Özlem; Shaban, Nadine M; Luengas, Elizabeth M; Land, Allison M; Anderson, Brett D; Holten, John R; Anderson, John S; Harki, Daniel A; Amaro, Rommie E; Harris, Reuben S

    2015-12-01

    APOBEC3 family DNA cytosine deaminases provide overlapping defenses against pathogen infections. However, most viruses have elaborate evasion mechanisms such as the HIV-1 Vif protein, which subverts cellular CBF-β and a polyubiquitin ligase complex to neutralize these enzymes. Despite advances in APOBEC3 and Vif biology, a full understanding of this direct host-pathogen conflict has been elusive. We combine virus adaptation and computational studies to interrogate the APOBEC3F-Vif interface and build a robust structural model. A recurring compensatory amino acid substitution from adaptation experiments provided an initial docking constraint, and microsecond molecular dynamic simulations optimized interface contacts. Virus infectivity experiments validated a long-lasting electrostatic interaction between APOBEC3F E289 and HIV-1 Vif R15. Taken together with mutagenesis results, we propose a wobble model to explain how HIV-1 Vif has evolved to bind different APOBEC3 enzymes and, more generally, how pathogens may evolve to escape innate host defenses.

  7. The Binding Interface between Human APOBEC3F and HIV-1 Vif Elucidated by Genetic and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Christopher Richards

    2015-12-01

    Full Text Available APOBEC3 family DNA cytosine deaminases provide overlapping defenses against pathogen infections. However, most viruses have elaborate evasion mechanisms such as the HIV-1 Vif protein, which subverts cellular CBF-β and a polyubiquitin ligase complex to neutralize these enzymes. Despite advances in APOBEC3 and Vif biology, a full understanding of this direct host-pathogen conflict has been elusive. We combine virus adaptation and computational studies to interrogate the APOBEC3F-Vif interface and build a robust structural model. A recurring compensatory amino acid substitution from adaptation experiments provided an initial docking constraint, and microsecond molecular dynamic simulations optimized interface contacts. Virus infectivity experiments validated a long-lasting electrostatic interaction between APOBEC3F E289 and HIV-1 Vif R15. Taken together with mutagenesis results, we propose a wobble model to explain how HIV-1 Vif has evolved to bind different APOBEC3 enzymes and, more generally, how pathogens may evolve to escape innate host defenses.

  8. Genetic association of glutathione peroxidase-1 with coronary artery calcification in type 2 diabetes: a case control study with multi-slice computed tomography

    Directory of Open Access Journals (Sweden)

    Fujimoto Kei

    2007-09-01

    Full Text Available Abstract Background Although oxidative stress caused by the accumulation of reactive oxygen species (ROS) in diabetes has become evident, it remains unclear which genes involved in redox balance determine susceptibility to the development of atherosclerosis in diabetes. This study evaluated the effect of genetic polymorphisms of enzymes producing, or responsible for reducing, ROS on coronary artery calcification in type 2 diabetes (T2D). Methods An index of coronary arteriosclerosis, the coronary artery calcium score (CACS), was evaluated in 91 T2D patients using multi-slice computed tomography. Patients were genotyped for ROS-scavenging enzymes (Glutathione peroxidase-1 (GPx-1), Catalase, Mn-SOD, Cu/Zn-SOD), as well as for SNPs of NADPH oxidase as ROS-promoting elements and for genes related to the onset of T2D (CAPN10, ADRB3, PPAR gamma, FATP4). Age, blood pressure, BMI, HbA1c, lipids and duration of diabetes were included in a multivariate regression analysis. Results CACS in patients with the Pro/Leu genotype of the GPx-1 gene was significantly higher than in those with Pro/Pro (744 ± 1,291 vs. 245 ± 399, respectively; p = 0.006). In addition, the genotype frequency of Pro/Leu in those with CACS ≥ 1000 was significantly higher than in those with CACS < 1000 (OR = 3.61, CI = 0.97–13.42; p = 0.045) when tested for deviation from Hardy-Weinberg equilibrium. Multivariate regression analyses revealed that CACS correlated significantly with GPx-1 genotype and age. Conclusion The presence of the Pro197Leu substitution in the GPx-1 gene may play a crucial role in determining genetic susceptibility to coronary arteriosclerosis in T2D. The mechanism may be associated with a decreased ability of the variant GPx-1 to scavenge ROS.

  9. Hunter disease eClinic: interactive, computer-assisted, problem-based approach to independent learning about a rare genetic disease

    Directory of Open Access Journals (Sweden)

    Moldovan Laura

    2010-10-01

    Full Text Available Abstract Background: Computer-based teaching (CBT) is a well-known educational device, but it has never been applied systematically to the teaching of a complex, rare, genetic disease, such as Hunter disease (MPS II). Aim: To develop interactive teaching software functioning as a virtual clinic for the management of MPS II. Implementation and Results: The Hunter disease eClinic, a self-training, user-friendly educational software program, available at the Lysosomal Storage Research Group (http://www.lysosomalstorageresearch.ca), was developed using the Adobe Flash multimedia platform. It was designed both to provide a realistic, interactive virtual clinic and to give instantaneous access to supporting literature on Hunter disease. The Hunter disease eClinic consists of an eBook and an eClinic. The eClinic is the interactive virtual clinic component of the software. Within an environment resembling a real clinic, the trainee is instructed to perform a medical history, to examine the patient, and to order appropriate investigations. The program provides clinical data derived from the management of actual patients with Hunter disease. The eBook provides instantaneous, electronic access to a vast collection of reference information covering detailed background clinical and basic science, including relevant biochemistry, physiology, and genetics. In the eClinic, the trainee is presented with quizzes designed to provide immediate feedback on both trainee effectiveness and efficiency. User feedback on the merits of the program was collected at several seminars and formal clinical rounds at several medical centres, primarily in Canada. In addition, online usage statistics were documented for a 2-year period. Feedback was consistently positive and confirmed the practical benefit of the program. The online English-language version is accessed daily by users from all over the world; a Japanese translation of the program is also available. Conclusions: The

  10. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    Science.gov (United States)

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-01

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier. PMID:28124985
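    As a reduced, hypothetical sketch of the idea (the paper itself scores feature subsets with artificial neural network classifiers on real EEG features; here the data are synthetic and the classifier is a simple nearest-centroid rule), a genetic algorithm can evolve binary feature masks and use the leave-one-out error as the fitness to minimise:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 20))        # 60 synthetic trials, 20 candidate features
        y = rng.integers(0, 2, size=60)      # left/right labels
        X[y == 1, :3] += 1.5                 # only the first three features are informative

        def loo_error(mask):
            """Leave-one-out error of a nearest-centroid classifier on the selected features."""
            cols = np.flatnonzero(mask)
            if cols.size == 0:
                return 1.0
            errors = 0
            for i in range(len(y)):
                keep = np.arange(len(y)) != i
                c0 = X[np.ix_(keep & (y == 0), cols)].mean(axis=0)
                c1 = X[np.ix_(keep & (y == 1), cols)].mean(axis=0)
                xi = X[i, cols]
                pred = 0 if np.linalg.norm(xi - c0) < np.linalg.norm(xi - c1) else 1
                errors += int(pred != y[i])
            return errors / len(y)

        def evolve(pop_size=30, generations=40, p_mut=0.05):
            pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
            for _ in range(generations):
                order = np.argsort([loo_error(ind) for ind in pop])       # lower error = better
                parents = pop[order[:pop_size // 2]]
                mates = parents.copy()
                rng.shuffle(mates, axis=0)
                cut = X.shape[1] // 2
                children = np.hstack([parents[:, :cut], mates[:, cut:]])  # one-point crossover
                flips = rng.random(children.shape) < p_mut
                children = np.where(flips, 1 - children, children)        # bit-flip mutation
                pop = np.vstack([parents, children])
            return min(pop, key=loo_error)

        best_mask = evolve()
        print("selected features:", np.flatnonzero(best_mask), "error:", loo_error(best_mask))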

  11. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Charles Yaacoub

    2017-01-01

    Full Text Available Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.

  12. Optimal Topology Design of a Computer Communication Network Based on Genetic Algorithms

    Institute of Scientific and Technical Information of China (English)

    陈国龙

    2002-01-01

    The optimal design of a computer communication network is an NP-complete problem, and it is hard to obtain the global optimum using traditional algorithms. Genetic algorithms are a heuristic search method based on natural evolution that has been successfully applied to a variety of problems. The difficulties in using the algorithm are how a particular problem is to be modeled to fit into the genetic algorithm framework, and how the operators (selection, crossover, mutation) act on the coded strings. In this paper, the authors establish a model for the optimal design of networks, namely the maximization of network reliability subject to a given cost constraint, and offer a corresponding modified genetic algorithm. Two examples are provided. The numerical results show that the algorithm has good solution speed, finds the optimal solution easily, and is also feasible for large-scale problems.
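    The paper's own model and modified genetic algorithm are not reproduced here; the following is only a hypothetical sketch of the kind of fitness function such a formulation implies: a candidate topology is a bitstring over possible links, its all-terminal reliability is estimated by Monte Carlo sampling of link failures with a union-find connectivity check, and designs exceeding the cost budget are penalised. Link costs, availabilities and the budget are invented.

        import itertools
        import random

        NODES = 6
        LINKS = list(itertools.combinations(range(NODES), 2))   # all candidate links
        COST = {link: random.randint(1, 5) for link in LINKS}   # hypothetical link costs
        BUDGET = 30
        P_LINK_UP = 0.9                                         # per-link availability

        def connected(active_links):
            """True if every node can reach every other node via the active links."""
            parent = list(range(NODES))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for a, b in active_links:
                parent[find(a)] = find(b)
            return len({find(n) for n in range(NODES)}) == 1

        def reliability(design, trials=500):
            """Monte Carlo estimate of the probability that the network stays connected."""
            chosen = [link for link, bit in zip(LINKS, design) if bit]
            hits = sum(connected([l for l in chosen if random.random() < P_LINK_UP])
                       for _ in range(trials))
            return hits / trials

        def fitness(design):
            cost = sum(COST[link] for link, bit in zip(LINKS, design) if bit)
            r = reliability(design)
            return r if cost <= BUDGET else r - 0.1 * (cost - BUDGET)   # budget penalty

        design = [random.randint(0, 1) for _ in LINKS]   # one random candidate topology;
        print("fitness of a random design:", round(fitness(design), 3))   # a GA would evolve these bitstrings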

  13. mPed: a computer program for converting pedigree data to a format used by the PMx software for conservation genetic analysis

    OpenAIRE

    Jansson, Mija; Ståhl, Ingvar; Laikre, Linda

    2013-01-01

    There is a growing need for conservation genetic management of animal populations when individual relatedness data (pedigrees) are available. Such data can be used to monitor rates of inbreeding and loss of genetic diversity. Traditionally, pedigree analysis for conservation management has focused on zoo populations of threatened wild animals; available software has been developed in that context. Population Management x (PMx) is a free software for estimating genetic parameters including inbr...

  14. Computer program for allocation of generators in isolated systems of direct current using genetic algorithm; Programa computacional para alocacao de geradores em sistemas isolados de corrente continua utilizando algoritmo genetico

    Energy Technology Data Exchange (ETDEWEB)

    Gewehr, Diego N.; Vargas, Ricardo B.; Melo, Eduardo D. de; Paschoareli Junior, Dionizio [Universidade Estadual Paulista (DEE/UNESP), Ilha Solteira, SP (Brazil). Dept. de Engenharia Eletrica. Grupo de Pesquisa em Fontes Alternativas e Aproveitamento de Energia

    2008-07-01

    This paper presents a methodology for locating electric power sources in isolated direct-current microgrids using a genetic algorithm. In this work, photovoltaic panels are considered, although the methodology can be extended to any kind of DC source. A computational tool is developed using the Matlab simulator to obtain the best DC system configuration, reducing the number of panels and the costs and improving system performance. (author)

  15. Genetic Discrimination

    Science.gov (United States)


  16. Research on Task Scheduling Based on Genetic Algorithm Technology in a Cloud Computing Environment

    Institute of Scientific and Technical Information of China (English)

    马俊涛; 陈业恩; 胡国杰; 严丽丽

    2016-01-01

    Genetic algorithms search for optimal solutions by simulating the process of natural evolution. At present, using genetic algorithms as the task scheduling algorithm in cloud computing environments has gradually become a research hotspot. Cloud computing is a new distributed computing model in which large numbers of dispersed resources are allocated over the network according to user demand, so the techniques used for resource allocation and task scheduling directly determine the performance of a cloud platform. This paper discusses the current state of research on genetic-algorithm-based task scheduling in cloud computing, the problems that remain to be solved, and directions for further research.
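    As a minimal, hypothetical illustration of the encoding such schedulers typically use (not a specific algorithm from the surveyed literature), a chromosome can assign each task to a node and be scored by the resulting makespan, which the genetic algorithm then minimises:

        import random

        task_lengths = [4, 7, 2, 9, 5, 3, 8]     # hypothetical task costs
        node_speeds = [1.0, 1.5, 2.0]            # hypothetical node capacities

        def makespan(assignment):
            """Finish time of the busiest node; lower is better."""
            load = [0.0] * len(node_speeds)
            for task, node in enumerate(assignment):
                load[node] += task_lengths[task] / node_speeds[node]
            return max(load)

        # A chromosome is simply one node index per task; selection, crossover and
        # mutation operate on this list while the GA minimises makespan(chromosome).
        chromosome = [random.randrange(len(node_speeds)) for _ in task_lengths]
        print(chromosome, "->", round(makespan(chromosome), 2))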

  17. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  18. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structures, electrical circuits, and big data structures. In Turing machines, input and output states form a system – when the system is in one state, the input is transformed into output. This computation is always deterministic, without any possible contradiction or defect. In natural computation there are defects and contradictions that have to be resolved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms over time. The genetic process is the prototype of morphogenetic computing. For the Boolean logic truth value, we substitute a set of truth values (active sets) with...

  19. Inferring the origin of populations introduced from a genetically structured native range by approximate Bayesian computation: case study of the invasive ladybird Harmonia axyridis

    NARCIS (Netherlands)

    Lombaert, E.; Guillemaud, T.; Thomas, C.E.; Handley, L.J.L.; Li, J.; Wang, S.; Pang, H.; Goryacheva, I.; Zakharov, I.A.; Jousselin, E.; Poland, R.L.; Migeon, A.; Lenteren, van J.C.; Clercq, de P.; Berkvens, N.; Jones, W.; Estoup, A.

    2011-01-01

    Correct identification of the source population of an invasive species is a prerequisite for testing hypotheses concerning the factors responsible for biological invasions. The native area of invasive species may be large, poorly known and/or genetically structured. Because the actual source populat

  20. Inferring the origin of populations introduced from a genetically structured native range by approximate Bayesian computation: case study of the invasive ladybird Harmonia axyridis

    NARCIS (Netherlands)

    Lombaert, E.; Guillemaud, T.; Thomas, C.E.; Handley, L.J.L.; Li, J.; Wang, S.; Pang, H.; Goryacheva, I.; Zakharov, I.A.; Jousselin, E.; Poland, R.L.; Migeon, A.; Lenteren, van J.C.; Clercq, de P.; Berkvens, N.; Jones, W.; Estoup, A.

    2011-01-01

    Correct identification of the source population of an invasive species is a prerequisite for testing hypotheses concerning the factors responsible for biological invasions. The native area of invasive species may be large, poorly known and/or genetically structured. Because the actual source populat

  1. Nested Genetic Algorithm for Resolving Overlapped Spectral Bands

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A nested genetic algorithm, comprising a genetic parameter level and a genetic implementation level for peak parameters, was proposed and applied to resolving overlapped spectral bands. At the genetic parameter level, the parameters of the genetic algorithm were optimized and the number of overlapped peaks was determined simultaneously. The parameters of the individual peaks were then computed at the genetic implementation level.
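    A hedged sketch of the inner (implementation) level only, with synthetic data: a chromosome encodes the centre, width and height of each assumed peak, and fitness is the squared residual between the measured spectrum and the sum of Gaussian bands. The outer level, which tunes the GA settings and the number of peaks, is not shown.

        import numpy as np

        x = np.linspace(0, 100, 400)

        def gaussian(x, centre, width, height):
            return height * np.exp(-0.5 * ((x - centre) / width) ** 2)

        # Synthetic "measured" spectrum made of two overlapped bands plus noise.
        rng = np.random.default_rng(1)
        measured = gaussian(x, 45, 6, 1.0) + gaussian(x, 55, 5, 0.7) + rng.normal(0, 0.01, x.size)

        def fitness(chromosome):
            """chromosome = [c1, w1, h1, c2, w2, h2, ...]; a lower residual is a better fit."""
            model = np.zeros_like(x)
            for centre, width, height in np.reshape(chromosome, (-1, 3)):
                model += gaussian(x, centre, width, height)
            return float(np.sum((measured - model) ** 2))

        # Near-true peak parameters give a small residual; a GA would evolve these values.
        print(fitness([44, 6, 1.0, 56, 5, 0.7]))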

  2. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.

  3. Genetic Mapping

    Science.gov (United States)

    What is genetic mapping? How do researchers create a genetic map? What are genetic markers? Among the main goals of the Human Genome ...

  4. Genetic Counseling

    Science.gov (United States)

    Genetic counseling provides information and support to people who have, or may be at risk for, genetic disorders. A genetic counselor meets with you to discuss genetic risks. The counseling may be for yourself or a family member.

  5. Genetics of ischaemic stroke.

    Science.gov (United States)

    Sharma, Pankaj; Yadav, Sunaina; Meschia, James F

    2013-12-01

    Recent advances in genomics and statistical computation have allowed us to begin addressing the genetic basis of stroke at a molecular level. These advances are at the cusp of making important changes to clinical practice of some monogenic forms of stroke and, in the future, are likely to revolutionise the care provided to these patients. In this review we summarise the state of knowledge in ischaemic stroke genetics particularly in the context of how a practicing clinician can best use this knowledge.

  6. Genetic Algorithm for Solving Simple Mathematical Equality Problem

    OpenAIRE

    Hermawanto, Denny

    2013-01-01

    This paper explains the genetic algorithm for novices in the field. The basic philosophy of the genetic algorithm and its flowchart are described, and a step-by-step numerical computation of a genetic algorithm solving a simple mathematical equality problem is briefly explained.
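    A minimal sketch of such a computation in Python (the equality a + 2b + 3c + 4d = 30 is used here as a representative target, and the operator choices are illustrative rather than the paper's exact settings):

        import random

        TARGET = 30   # find non-negative integers with a + 2b + 3c + 4d = 30

        def fitness(chrom):
            a, b, c, d = chrom
            return -abs(a + 2 * b + 3 * c + 4 * d - TARGET)   # 0 means a perfect solution

        def random_chrom():
            return [random.randint(0, 30) for _ in range(4)]

        def crossover(p1, p2):
            cut = random.randint(1, 3)          # one-point crossover
            return p1[:cut] + p2[cut:]

        def mutate(chrom, rate=0.1):
            return [random.randint(0, 30) if random.random() < rate else g for g in chrom]

        population = [random_chrom() for _ in range(50)]
        for generation in range(200):
            population.sort(key=fitness, reverse=True)
            if fitness(population[0]) == 0:
                break
            parents = population[:25]           # truncation selection
            population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                                    for _ in range(25)]

        best = max(population, key=fitness)
        print("generation", generation, "best chromosome", best, "fitness", fitness(best))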

  7. Genetic toxicology: web resources.

    Science.gov (United States)

    Young, Robert R

    2002-04-25

    available online in the field of genetic toxicology. As molecular biology and computational tools improve, new areas within genetic toxicology such as structural activity relationship analysis, mutational spectra databases and toxicogenomics, now have resources online as well.

  8. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  9. Pedigree-based estimation of covariance between dominance deviations and additive genetic effects in closed rabbit lines considering inbreeding and using a computationally simpler equivalent model.

    Science.gov (United States)

    Fernández, E N; Legarra, A; Martínez, R; Sánchez, J P; Baselga, M

    2017-06-01

    Inbreeding generates covariances between additive and dominance effects (breeding values and dominance deviations). In this work, we developed and applied models for estimation of dominance and additive genetic variances and their covariance, a model that we call "full dominance," from pedigree and phenotypic data. Estimates with this model such as presented here are very scarce both in livestock and in wild genetics. First, we estimated pedigree-based condensed probabilities of identity using recursion. Second, we developed an equivalent linear model in which variance components can be estimated using closed-form algorithms such as REML or Gibbs sampling and existing software. Third, we present a new method to refer the estimated variance components to meaningful parameters in a particular population, i.e., final partially inbred generations as opposed to outbred base populations. We applied these developments to three closed rabbit lines (A, V and H) selected for number of weaned at the Polytechnic University of Valencia. Pedigree and phenotypes are complete and span 43, 39 and 14 generations, respectively. Estimates of broad-sense heritability are 0.07, 0.07 and 0.05 at the base versus 0.07, 0.07 and 0.09 in the final generations. Narrow-sense heritability estimates are 0.06, 0.06 and 0.02 at the base versus 0.04, 0.04 and 0.01 at the final generations. There is also a reduction in the genotypic variance due to the negative additive-dominance correlation. Thus, the contribution of dominance variation is fairly large and increases with inbreeding and (over)compensates for the loss in additive variation. In addition, estimates of the additive-dominance correlation are -0.37, -0.31 and 0.00, in agreement with the few published estimates and theoretical considerations. © 2017 Blackwell Verlag GmbH.
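    The authors' model involves condensed identity coefficients and REML/Gibbs estimation, which are not reproduced here; as a much simpler, standard building block of pedigree-based analysis, the additive relationship matrix and inbreeding coefficients can be computed from a pedigree with the classical tabular recursion (toy pedigree invented for illustration):

        # Pedigree as (animal, sire, dam); parents are listed before offspring, 0 = unknown.
        pedigree = [
            (1, 0, 0),
            (2, 0, 0),
            (3, 1, 2),
            (4, 1, 2),
            (5, 3, 4),   # offspring of two full sibs, hence inbred
        ]

        n = len(pedigree)
        A = [[0.0] * (n + 1) for _ in range(n + 1)]   # 1-based; index 0 is the unknown parent

        for animal, sire, dam in pedigree:
            A[animal][animal] = 1.0 + 0.5 * A[sire][dam]              # diagonal: 1 + F
            for j in range(1, animal):
                A[animal][j] = A[j][animal] = 0.5 * (A[j][sire] + A[j][dam])

        for animal, _, _ in pedigree:
            print(f"animal {animal}: inbreeding F = {A[animal][animal] - 1.0:.3f}")

    For animal 5 the recursion gives F = 0.25, as expected for the offspring of full sibs.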

  10. Dissecting genomic imprinting and genetic conflict from a game theory prospective. Comment on: ;Epigenetic game theory: How to compute the epigenetic control of maternal-to-zygotic transition; by Qian Wang et al.

    Science.gov (United States)

    Cui, Yuehua; Yang, Haitao

    2017-03-01

    Epigenetics typically refers to changes in the structure of a chromosome that affect gene activity and expression. Genomic imprinting is a special type of epigenetic phenomenon in which the expression of an allele depends on its parental origin. When an allele inherited from the mother (or father) is imprinted (i.e., silent), it is termed maternal (or paternal) imprinting. Imprinting often results from DNA methylation and tends to cluster in the genome [1]. It has been shown to play a key role in many genetic disorders in humans [2]. Imprinting is heritable and undergoes a reprogramming process in gametes before and after fertilization [1]. Sometimes the reprogramming process is not reversible, leading to the loss of imprinting [3]. Although efforts have been made to experimentally or computationally infer imprinted genes, the underlying molecular mechanism that leads to unbalanced allelic expression is still largely unknown.

  11. Genetic algorithm optimization of entanglement

    CERN Document Server

    Navarro-Munoz, J C; Rosu, H C; Navarro-Munoz, Jorge C.

    2006-01-01

    We present an application of a genetic algorithmic computational method to the optimization of the concurrence measure of entanglement for one-dimensional chains, as well as square and triangular lattices, in a simple tight-binding approach.

  12. Identical twins in forensic genetics

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Morling, Niels

    2015-01-01

    The increase in the number of forensic genetic loci used for identification purposes results in infinitesimal random match probabilities. These probabilities are computed under assumptions made for rather simple population genetic models. Often, the forensic expert reports likelihood ratios, where...... published results accounting for close familial relationships. However, we revisit the discussion to increase the awareness among forensic genetic practitioners and include new information on medical and societal factors to assess the risk of not considering a monozygotic twin as the true perpetrator...
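    As a hedged, toy illustration of the simple-model computation referred to above (allele frequencies and the profile are invented), the random match probability of a multi-locus profile under Hardy-Weinberg assumptions is the product of the per-locus genotype frequencies; it says nothing about monozygotic twins, who share the profile entirely:

        # Invented allele frequencies and one observed profile at three STR loci.
        freqs = {
            "D3S1358": {"15": 0.25, "16": 0.22},
            "vWA":     {"17": 0.28, "18": 0.20},
            "FGA":     {"22": 0.17, "22.2": 0.01},
        }
        profile = {
            "D3S1358": ("15", "16"),
            "vWA":     ("17", "17"),
            "FGA":     ("22", "22.2"),
        }

        rmp = 1.0
        for locus, (a1, a2) in profile.items():
            p, q = freqs[locus][a1], freqs[locus][a2]
            rmp *= p * p if a1 == a2 else 2 * p * q   # Hardy-Weinberg genotype frequency

        print(f"random match probability is about 1 in {1 / rmp:,.0f}")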

  13. Graphical models for genetic analyses

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt; Sheehan, Nuala A.

    2003-01-01

    This paper introduces graphical models as a natural environment in which to formulate and solve problems in genetics and related areas. Particular emphasis is given to the relationships among various local computation algorithms which have been developed within the hitherto mostly separate areas...... of graphical models and genetics. The potential of graphical models is explored and illustrated through a number of example applications where the genetic element is substantial or dominating....

  14. Consumer satisfaction genetic algorithm in cloud computing

    Institute of Scientific and Technical Information of China (English)

    邹伟明; 于炯

    2014-01-01

    Addressing the new characteristics of cloud computing platforms, the existing adaptive genetic algorithm is improved and a consumer satisfaction genetic algorithm (CSGA) is proposed. While guaranteeing fairness among users, the algorithm schedules tasks onto the compute nodes that hold their input data in order to reduce network transfer overhead, and it optimizes performance with the goals of shortening the total task completion time and improving user satisfaction. Simulation experiments comparing CSGA with the AGA algorithm show that CSGA outperforms AGA in response time, fairness and user satisfaction, and is better suited to cloud computing environments.

  15. Modeling and Multi-Objective Optimization of Engine Performance and Hydrocarbon Emissions via the Use of a Computer Aided Engineering Code and the NSGA-II Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Richard Fiifi Turkson

    2016-01-01

    Full Text Available It is feared that the increasing population of vehicles in the world and the depletion of fossil-based fuel reserves could render transportation and other activities that rely on fossil fuels unsustainable in the long term. Concerns over environmental pollution issues, the high cost of fossil-based fuels and the increasing demand for fossil fuels have led to the search for environmentally friendly, cheaper and efficient fuels. In the search for these alternatives, liquefied petroleum gas (LPG) has been identified as one of the viable alternatives that could be used in place of gasoline in spark-ignition engines. The objective of the study was to present the modeling and multi-objective optimization of brake mean effective pressure and hydrocarbon emissions for a spark-ignition engine retrofitted to run on LPG. The use of a one-dimensional (1D) GT-Power™ model, together with Group Method of Data Handling (GMDH) neural networks, has been presented. The multi-objective optimization was implemented in MATLAB® using the non-dominated sorting genetic algorithm (NSGA-II). The modeling process generally achieved low mean squared errors (0.0000032 in the case of the hydrocarbon emissions model) for the models developed, which was attributed to the collection of a larger training data sample using the 1D engine model. The multi-objective optimization and subsequent decisions for optimal performance have also been presented.
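    The optimisation itself was run in MATLAB®; purely as a sketch of the central step of NSGA-II (with invented candidate points), the following extracts successive Pareto fronts for two objectives, maximising brake mean effective pressure and minimising hydrocarbon emissions:

        def dominates(a, b):
            """a dominates b if it is no worse in both objectives and strictly better in one.
            Each point is (BMEP to maximise, HC emissions to minimise)."""
            no_worse = a[0] >= b[0] and a[1] <= b[1]
            strictly_better = a[0] > b[0] or a[1] < b[1]
            return no_worse and strictly_better

        def non_dominated_sort(points):
            """Return indices of points grouped into successive Pareto fronts."""
            remaining = list(range(len(points)))
            fronts = []
            while remaining:
                front = [i for i in remaining
                         if not any(dominates(points[j], points[i]) for j in remaining if j != i)]
                fronts.append(front)
                remaining = [i for i in remaining if i not in front]
            return fronts

        # Invented (BMEP in bar, HC in ppm) pairs for candidate operating points.
        candidates = [(9.8, 120), (10.5, 180), (9.2, 95), (10.5, 140), (8.9, 200)]
        for rank, front in enumerate(non_dominated_sort(candidates), start=1):
            print("front", rank, [candidates[i] for i in front])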

  16. A computer model allowing maintenance of large amounts of genetic variability in Mendelian populations. II. The balance of forces between linkage and random assortment.

    Science.gov (United States)

    Wills, C; Miller, C

    1976-02-01

    It is shown, through theory and computer simulations of outbreeding Mendelian populations, that there may be conditions under which a balance is struck between two factors. The first is the advantage of random assortment, which will, when multilocus selection is for intermediate equilibrium values, lead to higher average heterozygosity than when linkage is introduced. There is some indication that random assortment is also advantageous when selection is toward a uniform distribution of equilibrium values. The second factor is the advantage of linkage between loci having positive epistatic interactions. When multilocus selection is for a bimodal distribution of equilibrium values, an early advantage of random assortment is replaced by a later disadvantage. Linkage disequilibrium, which in finite populations is increased only by random or selective sampling, may hinder the movement of alleles to their selective equilibria, thus leading to the advantage of random assortment. Some consequences of this approach to the structure of natural populations are discussed.

  17. Genetic Disorders

    Science.gov (United States)

    Gene mutations can cause a medical condition called a genetic disorder. You can inherit a gene mutation from one or both parents, or a mutation can occur during your lifetime. There are three types of genetic disorders: single-gene disorders, where a mutation affects ...

  18. Genetic modification and genetic determinism.

    Science.gov (United States)

    Resnik, David B; Vorhaus, Daniel B

    2006-06-26

    In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions.

  19. Imaging Genetics

    Science.gov (United States)

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  20. Genetic principles.

    Science.gov (United States)

    Abuelo, D

    1987-01-01

    The author discusses the basic principles of genetics, including the classification of genetic disorders and a consideration of the rules and mechanisms of inheritance. The most common pitfalls in clinical genetic diagnosis are described, with emphasis on the problem of the negative or misleading family history.

  1. Imaging Genetics

    Science.gov (United States)

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  2. Genetic modification and genetic determinism

    OpenAIRE

    Vorhaus Daniel B; Resnik David B

    2006-01-01

    Abstract In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound....

  3. Cartesian genetic programming

    CERN Document Server

    Miller, Julian F

    2011-01-01

    Cartesian Genetic Programming (CGP) is a highly effective and increasingly popular form of genetic programming. It represents programs in the form of directed graphs, and a particular characteristic is that it has a highly redundant genotype - phenotype mapping, in that genes can be noncoding. It has spawned a number of new forms, each improving on the efficiency, among them modular, or embedded, CGP, and self-modifying CGP. It has been applied to many problems in both computer science and applied sciences. This book contains chapters written by the leading figures in the development and appli

  4. Microsatellite data analysis for population genetics.

    Science.gov (United States)

    Kim, Kyung Seok; Sappington, Thomas W

    2013-01-01

    Theories and analytical tools of population genetics have been widely applied for addressing various questions in the fields of ecological genetics, conservation biology, and any context where the role of dispersal or gene flow is important. Underlying much of population genetics is the analysis of variation at selectively neutral marker loci, and microsatellites continue to be a popular choice of marker. In recent decades, software programs to estimate population genetics parameters have been developed at an increasing pace as computational science and theoretical knowledge advance. Numerous population genetics software programs are presently available to analyze microsatellite genotype data, but only a handful are commonly employed for calculating parameters such as genetic variation, genetic structure, patterns of spatial and temporal gene flow, population demography, individual population assignment, and genetic relationships within and between populations. In this chapter, we introduce statistical analyses and relevant population genetic software programs that are commonly employed in the field of population genetics and molecular ecology.
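    A hedged, minimal example of the kind of summary statistic such programs report (genotypes invented): observed and expected heterozygosity (gene diversity) at one microsatellite locus, computed from allele counts.

        from collections import Counter

        # Invented diploid genotypes at one microsatellite locus (alleles = repeat counts).
        genotypes = [(12, 14), (12, 12), (14, 16), (12, 16), (16, 16), (12, 14)]

        alleles = [allele for genotype in genotypes for allele in genotype]
        counts = Counter(alleles)
        n = len(alleles)
        freqs = {allele: count / n for allele, count in counts.items()}

        expected_het = 1.0 - sum(p ** 2 for p in freqs.values())          # gene diversity
        observed_het = sum(a != b for a, b in genotypes) / len(genotypes)

        print("allele frequencies:", {a: round(p, 2) for a, p in freqs.items()})
        print(f"observed Ho = {observed_het:.2f}, expected He = {expected_het:.2f}")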

  5. Genetic barcodes

    Energy Technology Data Exchange (ETDEWEB)

    Weier, Heinz -Ulrich G

    2015-08-04

    Herein are described multicolor FISH probe sets termed "genetic barcodes" targeting several cancer or disease-related loci to assess gene rearrangements and copy number changes in tumor cells. Two, three or more different fluorophores are used to detect the genetic barcode sections thus permitting unique labeling and multilocus analysis in individual cell nuclei. Gene specific barcodes can be generated and combined to provide both numerical and structural genetic information for these and other pertinent disease associated genes.

  6. Handbook of technology law. General fundamentals, environment law, genetic engineering act, energy act, telecommunication act and media act, patent act, computer act. 2. ed.; Handbuch des Technikrechts. Allgemeine Grundlagen Umweltrecht, Gentechnikrecht, Energierecht, Telekommunikations- und Medienrecht, Patentrecht, Computerrecht

    Energy Technology Data Exchange (ETDEWEB)

    Schulte, Martin; Schroeder, Rainer (eds.) [Technische Univ. Dresden (Germany). Juristische Fakultaet

    2011-07-01

    On the boundary between the technical sciences, jurisprudence, the social sciences and economics, technology law proves to be a cross-cutting field par excellence. The foundations of technology law are presented, and individual, particularly important areas of technology law (appliance safety law, technology and environmental law, genetic engineering law, energy law, telecommunications and media law, patent law, computer law, data security, legally binding telecooperation) are analyzed in detail. The handbook is aimed at all lawyers in academia and practice concerned with technology law who want to gain a first in-depth insight into this new field of law. (orig.)

  7. Genetic modification and genetic determinism

    Directory of Open Access Journals (Sweden)

    Vorhaus Daniel B

    2006-06-01

    Full Text Available Abstract In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions.

  8. Genetic Engineering

    Science.gov (United States)

    Phillips, John

    1973-01-01

    Presents a review of genetic engineering, in which the genotypes of plants and animals (including human genotypes) may be manipulated for the benefit of the human species. Discusses associated problems and solutions and provides an extensive bibliography of literature relating to genetic engineering. (JR)

  9. Genetic Counseling

    Science.gov (United States)

    ... for certain types of genetic conditions (such as Down syndrome) in the baby if mother-to-be is 35 years of age or more, or is concerned at any age about her chances of having a child with a genetic condition To learn about the ...

  10. Genetic Romanticism

    DEFF Research Database (Denmark)

    Tupasela, Aaro

    2016-01-01

    . This article compares and contrasts the work of two doctors in Finland, Elias Lönnrot and Reijo Norio, working over a century and a half apart, to examine the ways in which they have contributed to the formation of national identity and unity. The notion of genetic romanticism is introduced as a term...... to complement the notion of national romanticism that has been used to describe the ways in which nineteenth-century scholars sought to create and deploy common traditions for national-romantic purposes. Unlike national romanticism, however, strategies of genetic romanticism rely on the study of genetic...... inheritance as a way to unify populations within politically and geographically bounded areas. Thus, new genetics have contributed to the development of genetic romanticisms, whereby populations (human, plant, and animal) can be delineated and mobilized through scientific and medical practices to represent...

  11. [Genetic aspects of genealogy].

    Science.gov (United States)

    Tetushkin, E Iu

    2011-11-01

    The supplementary historical discipline of genealogy is also a supplementary discipline of genetics. In its formation, genetics borrowed from genealogy some methods of pedigree analysis. In the 21st century, it started receiving contributions from computer-aided genealogy and genetic (molecular) genealogy. The former provides novel tools for genetics, while the latter, which employs genetic methods, enriches genetics with new evidence. Genealogists formulated three main laws of genealogy: the law of three generations, the law of doubling the ancestry number, and the law of declining ancestry. The significance and meaning of these laws can be fully understood only in light of genetics. For instance, the contradiction between the exponential growth of the number of ancestors of an individual, i.e., the law of doubling the ancestry number, and the limited size of humankind is explained by the presence of weak inbreeding caused by sibs' interference; the latter causes pedigree collapse, i.e., it also explains the law of declining ancestry. Mathematical modeling of pedigree collapse presented in a number of studies showed that the number of ancestors of each individual attains a maximum in a particular generation, termed the ancestry-saturated generation. All representatives of this and preceding generations that left progeny are common ancestors of all current members of the population. In subdivided populations, these generations are more ancient than in panmictic ones, whereas in small isolates and social strata with limited numbers of partners, they are younger. The genealogical law of three generations, according to which each hundred years contain on average three generation intervals, holds for generation lengths for Y-chromosomal DNA, typically equal to 31-32 years; for autosomal and mtDNA, this time is somewhat shorter. Moving along ascending lines, the number of genetically effective ancestors transmitting their DNA fragment to descendants increases far
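
    The doubling law and the three-generation law lend themselves to a quick numerical check. The short script below is an illustrative sketch, not part of the record above; the 32-year generation interval is taken from the abstract, while the rough historical population ceiling is an assumed order of magnitude. It shows where the nominal ancestor count would outgrow any plausible population, which is why real pedigrees must collapse.

```python
# Illustrative sketch: the "law of doubling the ancestry number" versus a finite population.
# Assumptions (not from the source record): a 32-year generation interval and a rough
# population ceiling of 500 million for the relevant historical period.

GENERATION_YEARS = 32          # ~3 generations per century (law of three generations)
POPULATION_CEILING = 5e8       # assumed historical population size, order of magnitude only

for g in range(1, 31):
    nominal_ancestors = 2 ** g             # doubling law: 2 parents, 4 grandparents, ...
    years_back = g * GENERATION_YEARS
    if nominal_ancestors > POPULATION_CEILING:
        print(f"generation {g} (~{years_back} years back): 2^{g} = {nominal_ancestors} "
              f"exceeds the assumed population -> lineages must overlap (pedigree collapse)")
        break
    print(f"generation {g} (~{years_back} years back): {nominal_ancestors} nominal ancestors")
```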

  12. A Strategy of Energy-Efficency in Heterogeneous Cloud Computing Based on Classic Genetic Algorithm%一种经典遗传算法下的异构云环境能效优化策略

    Institute of Scientific and Technical Information of China (English)

    周航; 朱海; 齐迎春

    2013-01-01

    By analyzing some general energy consumption models from the Energy-Efficiency (EE) perspective, an EE model is presented that applies to the actual scenario of a cloud-computing data centre. Based on an analysis of the EE model's mathematical formulation, a corresponding EE optimization strategy is presented and verified on the CloudSim platform. The experimental results showed that the strategy based on the classic genetic algorithm can significantly improve the overall EE value of the server cluster. Furthermore, the robustness of the strategy was also validated by modifying the relevant parameters of the experiment.%从高能效的角度分析目前一些通用能耗模型,提出了一种适用于云数据中心实际应用场景的能效模型。根据此能效模型数学公式的分析,提出了相应的能效优化策略,并通过CloudSim仿真平台进行验证。结果表明,相比于Hadoop等云平台下传统的任务均分策略,应用经典遗传算法的能效优化策略能明显提高服务器集群的整体能效值。通过修改实验的相关参数,进一步验证了该能效优化策略的健壮性。

  13. Genetic Breakthrough

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A new calf breeding technique shows promise for treating malignant tumors Chinese scientists have successfully bred a genetically altered cow capable of producing cancer-curing proteins for human beings.

  14. Mitochondrial genetics

    OpenAIRE

    Chinnery, Patrick Francis; Hudson, Gavin

    2013-01-01

    Introduction In the last 10 years the field of mitochondrial genetics has widened, shifting the focus from rare sporadic, metabolic disease to the effects of mitochondrial DNA (mtDNA) variation in a growing spectrum of human disease. The aim of this review is to guide the reader through some key concepts regarding mitochondria before introducing both classic and emerging mitochondrial disorders. Sources of data In this article, a review of the current mitochondrial genetics literature was con...

  15. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  16. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  17. Routine Discovery of Complex Genetic Models using Genetic Algorithms.

    Science.gov (United States)

    Moore, Jason H; Hahn, Lance W; Ritchie, Marylyn D; Thornton, Tricia A; White, Bill C

    2004-02-01

    Simulation studies are useful in various disciplines for a number of reasons including the development and evaluation of new computational and statistical methods. This is particularly true in human genetics and genetic epidemiology where new analytical methods are needed for the detection and characterization of disease susceptibility genes whose effects are complex, nonlinear, and partially or solely dependent on the effects of other genes (i.e. epistasis or gene-gene interaction). Despite this need, the development of complex genetic models that can be used to simulate data is not always intuitive. In fact, only a few such models have been published. We have previously developed a genetic algorithm approach to discovering complex genetic models in which two single nucleotide polymorphisms (SNPs) influence disease risk solely through nonlinear interactions. In this paper, we extend this approach for the discovery of high-order epistasis models involving three to five SNPs. We demonstrate that the genetic algorithm is capable of routinely discovering interesting high-order epistasis models in which each SNP influences risk of disease only through interactions with the other SNPs in the model. This study opens the door for routine simulation of complex gene-gene interactions among SNPs for the development and evaluation of new statistical and computational approaches for identifying common, complex multifactorial disease susceptibility genes.
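
    To make concrete what a "purely epistatic" model of this kind looks like, the sketch below builds a hypothetical XOR-like two-SNP penetrance table (an invented example, not one of the models published in the paper) and verifies that each SNP's marginal penetrance is flat, so neither locus shows a main effect on its own.

```python
import numpy as np

# Hypothetical two-SNP penetrance table (rows/cols: genotypes AA, Aa, aa and BB, Bb, bb).
# Disease risk is elevated only for "mixed" genotype combinations, an XOR-like pattern,
# so each SNP influences risk only through its interaction with the other.
penetrance = np.array([
    [0.00, 0.10, 0.00],
    [0.10, 0.00, 0.10],
    [0.00, 0.10, 0.00],
])

# Hardy-Weinberg genotype frequencies for allele frequency p = 0.5 at both SNPs.
p = 0.5
geno_freq = np.array([p**2, 2 * p * (1 - p), (1 - p)**2])

# Marginal penetrance of each SNP: average over the other SNP's genotypes, weighted by frequency.
marginal_A = penetrance @ geno_freq
marginal_B = penetrance.T @ geno_freq
print("marginal penetrance by SNP A genotype:", marginal_A)
print("marginal penetrance by SNP B genotype:", marginal_B)
# All entries are equal -> no marginal (main) effect; risk arises purely from the interaction.
```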

  18. Measuring Financial Gains from Genetically Superior Trees

    Science.gov (United States)

    George Dutrow; Clark Row

    1976-01-01

    Planting genetically superior loblolly pines will probably yield high profits. Forest economists have made computer simulations that predict financial gains expected from a tree improvement program under actual field conditions.

  19. Genetic GIScience

    DEFF Research Database (Denmark)

    Jacquez, Geoffrey; Sabel, Clive E; Shi, Chen

    2015-01-01

    The exposome, defined as the totality of an individual's exposures over the life course, is a seminal concept in the environmental health sciences. Although inherently geographic, the exposome as yet is unfamiliar to many geographers. This article proposes a place-based synthesis, genetic...... geographic information science (genetic GIScience), that is founded on the exposome, genome+, and behavome. It provides an improved understanding of human health in relation to biology (the genome+), environmental exposures (the exposome), and their social, societal, and behavioral determinants (the behavome......). Genetic GIScience poses three key needs: first, a mathematical foundation for emergent theory; second, process-based models that bridge biological and geographic scales; third, biologically plausible estimates of space-time disease lags. Compartmental models are a possible solution; this article develops...

  20. Fluid Genetic Algorithm (FGA)

    Directory of Open Access Journals (Sweden)

    Ruholla Jafari-Marandi

    2017-04-01

    Full Text Available The Genetic Algorithm (GA) has been one of the most popular methods for many challenging optimization problems when exact approaches are too computationally expensive. A review of the literature shows extensive research attempting to adapt and develop the standard GA. Nevertheless, the essence of GA, which consists of concepts such as chromosomes, individuals, crossover, and mutation, has rarely been the focus of recent researchers. In the method proposed in this paper, the Fluid Genetic Algorithm (FGA), some of these concepts are changed or removed, and new concepts are introduced. The performance of GA and FGA is compared on seven benchmark functions. FGA not only shows a better success rate and better convergence control, but it can also be applied to a wider range of problems, including multi-objective and multi-level problems. The application of FGA to a real engineering problem, the Quadratic Assignment Problem (QAP), is also demonstrated.

  1. Graphical models for genetic analyses

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt; Sheehan, Nuala A.

    2003-01-01

    This paper introduces graphical models as a natural environment in which to formulate and solve problems in genetics and related areas. Particular emphasis is given to the relationships among various local computation algorithms which have been developed within the hitherto mostly separate areas...

  2. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and ...

  3. RNA genetics

    Energy Technology Data Exchange (ETDEWEB)

    Domingo, E. (Instituto de Biologia Molecular, Facultad de Ciencias, Universidad Autonoma de Madrid, Canto Blanco, Madrid (ES)); Holland, J.J. (California Univ., San Diego, La Jolla, CA (USA). Dept. of Biology); Ahlquist, P. (Wisconsin Univ., Madison, WI (USA). Dept. of Plant Pathology)

    1988-01-01

    This book contains the proceedings on RNA genetics: RNA-directed virus replication, Volume 1. Topics covered include: Replication of the poliovirus genome; Influenza viral RNA transcription and replication; and Replication of the Reoviridae: information derived from gene cloning and expression.

  4. Genetic counseling

    Science.gov (United States)

    ... MF, eds. Creasy and Resnik's Maternal-Fetal Medicine: Principles and Practice . 7th ed. Philadelphia, PA: Elsevier Saunders; 2014:chap 30. Review Date 1/25/2016 Updated by: Chad Haldeman-Englert, MD, FACMG, Fullerton Genetics Center, Asheville, NC. Review provided by VeriMed Healthcare ...

  5. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one that causes illnesses in people. It is a kind of computer program

  6. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  7. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  8. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  9. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  10. Melanoma genetics

    DEFF Research Database (Denmark)

    Read, Jazlyn; Wadt, Karin A W; Hayward, Nicholas K

    2016-01-01

    Approximately 10% of melanoma cases report a relative affected with melanoma, and a positive family history is associated with an increased risk of developing melanoma. Although the majority of genetic alterations associated with melanoma development are somatic, the underlying presence...... of heritable melanoma risk genes is an important component of disease occurrence. Susceptibility for some families is due to mutation in one of the known high penetrance melanoma predisposition genes: CDKN2A, CDK4, BAP1, POT1, ACD, TERF2IP and TERT. However, despite such mutations being implicated...... in a combined total of approximately 50% of familial melanoma cases, the underlying genetic basis is unexplained for the remainder of high-density melanoma families. Aside from the possibility of extremely rare mutations in a few additional high penetrance genes yet to be discovered, this suggests a likely...

  11. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles particle-wave duality property when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  12. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  13. Genetic Testing for ALS

    Science.gov (United States)

    ... Involved Donate Familial Amyotrophic Lateral Sclerosis (FALS) and Genetic Testing By Deborah Hartzfeld, MS, CGC, Certified Genetic Counselor ... in your area, please visit www.nsgc.org . Genetic Testing Genetic testing can help determine the cause of ...

  14. Genetic Science Learning Center

    Science.gov (United States)

    ... Mouse Party on Learn.Genetics.utah.edu Students doing the Tree of Genetic Traits activity Learn.Genetics is one of the most widely used science education websites in the world The Community Genetics ...

  15. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  16. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  17. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  18. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  19. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  20. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  1. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  2. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...... cybernetics and Maturana and Varela’s theory of autopoiesis, which are both erroneously taken to support info-computationalism....

  3. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  4. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  5. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Full Text Available Complex systems (CS involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  6. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  7. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  8. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  9. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...

  10. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  11. Genetic Programming with Simple Loops

    Institute of Scientific and Technical Information of China (English)

    QI Yuesheng; WANG Baozhong; KANG Lishan

    1999-01-01

    A kind of loop function, LoopN, in Genetic Programming (GP) is proposed. Different from other forms of loop function, such as While-Do and Repeat-Until, LoopN takes only one argument as its loop body and makes its loop body simply run N times, so infinite loops will never happen. The problem of how to avoid too many layers of loops in Genetic Programming is also solved. The advantage of LoopN in GP is shown by the computational results in solving the mower problem.
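
    A minimal sketch of the idea follows; the program representation and node names below are assumptions for illustration, not the authors' implementation. The point is that a LoopN node carries a fixed repetition count and a single body subtree, so evaluation always terminates.

```python
# Minimal sketch of a GP-style interpreter with a bounded LoopN node.
# Programs are nested tuples; LoopN runs its single body subtree exactly n times,
# so no evolved program can loop forever (unlike While-Do or Repeat-Until).

def evaluate(node, state):
    op = node[0]
    if op == "loopN":                 # ("loopN", n, body)
        _, n, body = node
        for _ in range(n):            # fixed trip count -> guaranteed termination
            evaluate(body, state)
    elif op == "seq":                 # ("seq", child1, child2)
        evaluate(node[1], state)
        evaluate(node[2], state)
    elif op == "add":                 # ("add", k): a toy side-effecting terminal
        state["acc"] += node[1]
    else:
        raise ValueError(f"unknown node {op!r}")
    return state

# Example program: repeat (acc += 2; acc += 3) five times.
program = ("loopN", 5, ("seq", ("add", 2), ("add", 3)))
print(evaluate(program, {"acc": 0}))   # {'acc': 25}
```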

  12. Surface/Surface Intersection Using Simulated Annealing Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The genetic algorithm and the marching method are integrated into a novel algorithm to solve the surface intersection problem. By combining the genetic algorithm with a local search method, the efficiency of evolution is greatly improved. By fully utilizing the global search ability and intrinsic suitability for parallel computation of the genetic algorithm and the rapid local convergence of the marching method, the algorithm can compute the intersection robustly and generate the correct topology of the intersection curves. The details of the new algorithm are discussed here.
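
    A stripped-down sketch of the hybrid idea follows; the surfaces, parameter ranges, and the plain real-coded GA below are illustrative assumptions, not the authors' algorithm. The GA searches the joint parameter space (u, v, s, t) for a point where two parametric surfaces nearly coincide; a marching method would then trace the intersection curve from such a seed point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two illustrative parametric surfaces (not from the paper):
# a paraboloid and the plane z = 0.5; their intersection is the circle u^2 + v^2 = 0.5.
def S1(u, v):
    return np.array([u, v, u * u + v * v])

def S2(s, t):
    return np.array([s, t, 0.5])

def distance(x):                       # x = (u, v, s, t); objective to minimise
    u, v, s, t = x
    return np.sum((S1(u, v) - S2(s, t)) ** 2)

# Plain real-coded GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform(-1.0, 1.0, size=(60, 4))
for generation in range(200):
    fitness = np.array([distance(ind) for ind in pop])
    children = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), 2)
        a = pop[i] if fitness[i] < fitness[j] else pop[j]      # tournament winner 1
        k, l = rng.integers(0, len(pop), 2)
        b = pop[k] if fitness[k] < fitness[l] else pop[l]      # tournament winner 2
        w = rng.uniform(0, 1, 4)
        child = w * a + (1 - w) * b                            # blend crossover
        child += rng.normal(0.0, 0.05, 4)                      # Gaussian mutation
        children.append(np.clip(child, -1.0, 1.0))
    pop = np.array(children)

best = min(pop, key=distance)
print("seed point near intersection:", S1(best[0], best[1]), "residual:", distance(best))
# A marching method would start from this seed and step along the intersection curve.
```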

  13. Computational Intelligence and Its Encoding Mechanism

    Institute of Scientific and Technical Information of China (English)

    LIU Man-dan

    2004-01-01

    The origin and characteristics of computational intelligence, and several typical computational intelligence algorithms such as genetic algorithm and DNA computing are described, and the influence of evolution strategies and convergence properties on the encoding mechanism is discussed. A novel genetic algorithm based on degressive carry number encoding is then proposed. This algorithm uses degressive carry number encoding in the evolutionary process instead of commonly used fixed carry number. Finally a novel encoding mechanism and a new algorithm are proposed, which combine modern computational intelligence with the traditional Chinese methodology.

  14. Computational Intelligence and Its Encoding Mechanism

    Institute of Scientific and Technical Information of China (English)

    LIU Man-dan

    2004-01-01

    The origin and characteristics of computational intelligence, and several typical computational intelligence algorithms such as genetic algorithm and DNA computing are described, and the influence of evolution strategies and convergence properties on the encoding mechanism is discussed. A novel genetic algorithm based on degressive carry number encoding is then proposed. This algorithm uses degressive carry number encoding in the evolutionary process instead of commonly used fixed carry number. Finally a novel encoding mechanism and a new algorithm are proposed, which combine modern computational intelligence with the traditional Chinese methodology.

  15. An overview on Approximate Bayesian computation*

    Directory of Open Access Journals (Sweden)

    Baragatti Meïli

    2014-01-01

    Full Text Available Approximate Bayesian computation techniques, also called likelihood-free methods, are one of the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since the introduction of the approach about ten years ago in population genetics.
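
    A minimal ABC rejection sampler illustrates the likelihood-free idea; the toy setting below (a normal mean with a normal prior and the sample mean as summary statistic) is an assumption for illustration, not an example drawn from the overview itself. Draw parameters from the prior, simulate data, and keep only draws whose simulated summary falls within a tolerance of the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting (assumed for illustration): data are Normal(mu, 1) with mu unknown,
# prior mu ~ Normal(0, 5). We pretend the likelihood is intractable and use
# ABC rejection with the sample mean as summary statistic.
observed = rng.normal(2.0, 1.0, size=100)
obs_summary = observed.mean()

def simulate(mu, n=100):
    return rng.normal(mu, 1.0, size=n)

accepted = []
tolerance = 0.05
for _ in range(50_000):
    mu = rng.normal(0.0, 5.0)                    # draw from the prior
    sim_summary = simulate(mu).mean()            # simulate data and summarise
    if abs(sim_summary - obs_summary) < tolerance:
        accepted.append(mu)                      # keep draws close to the observed data

accepted = np.array(accepted)
print(f"accepted {accepted.size} draws; "
      f"approximate posterior mean {accepted.mean():.3f} (true value 2.0)")
```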

  16. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  17. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  18. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  19. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  20. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  1. [Genetic amniocentesis].

    Science.gov (United States)

    Violante Díaz, M; Carrillo Hinojosa, M; García Necoechea, M P; Escobedo Aguirre, F; Lowenberg Favela, E; Ahued Ahued, J R

    1989-04-01

    179 patients were studied by genetic amniocentesis (GA) in sessions of 3 punctures each. This was done in order to follow a prenatal diagnosis (PD) program and study amniotic fluid at the Hospital Regional 20 de Novembre (ISSSTE) between May 1983 and December 1987. The parameters recorded were: age, indications, number of sessions, number of punctures, echosonographic assessment of gestational age, placental insertion, puncture site, amniotic fluid volume, blood contamination, failures, and management of the patient. A low incidence of abortion is reported. There were no cases of amniotic fluid leakage or transvaginal haemorrhage. Whether multiple needle insertions and lesions of the placenta or cord vessels cause fetal death is still debated; we did not have those complications in our cases. The rate is low when there are no previous spontaneous abortions. 79% of the amniotic fluid samples were sent between the 15th and 17th weeks of pregnancy; 12 were sent for alpha-fetoprotein determination and 1 for biochemical studies, specifically beta-galactosidase levels, carried out at the Biomedical Investigation Institute of the National Autonomous University of Mexico (in parents with generalized GM1 gangliosidosis). Even though the results were good, the technique still has risks and complications. Ultrasonographic monitoring of the procedure, performed by physicians with reliable experience, is needed. Our country needs to create more Prenatal Genetic Diagnosis Centers.

  2. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  3. Evolutionary Computing in Visual Art and Music

    OpenAIRE

    Johnson, Colin G.; Romero Cardalda, Juan J.

    2002-01-01

    This paper is an introduction to the special section of Leonardo on Genetic Algorithms in Visual Art and Music, which arose from a workshop at the 2000 Genetic and Evolutionary Computing Conference. This introduction gives a background review of the area, takes a look at some open questions provoked by the workshop, and summarizes the papers in the section.

  4. Coevolutionary computation.

    Science.gov (United States)

    Paredis, J

    1995-01-01

    This article proposes a general framework for the use of coevolution to boost the performance of genetic search. It combines coevolution with yet another biologically inspired technique, called lifetime fitness evaluation (LTFE). Two unrelated problems--neural net learning and constraint satisfaction--are used to illustrate the approach. Both problems use predator-prey interactions to boost the search. In contrast with traditional "single population" genetic algorithms (GAs), two populations constantly interact and co-evolve. However, the same algorithm can also be used with different types of co-evolutionary interactions. As an example, the symbiotic coevolution of solutions and genetic representations is shown to provide an elegant solution to the problem of finding a suitable genetic representation. The approach presented here greatly profits from the partial and continuous nature of LTFE. Noise tolerance is one advantage. Even more important, LTFE is ideally suited to deal with coupled fitness landscapes typical for coevolution.
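
    The sketch below illustrates the two-population predator-prey idea in a deliberately tiny setting; the all-ones target task, population sizes, and rates are assumptions for illustration, not the article's experiments. Candidate solutions ("hosts") are scored only against a sample of evolving test cases ("parasites"), and parasites are rewarded for finding positions the hosts get wrong, giving a partial, continually changing evaluation in the spirit of LTFE.

```python
import random

random.seed(0)

L = 30                       # bit-string length; the (hidden) goal is the all-ones string
HOSTS, PARASITES = 40, 40

hosts = [[random.randint(0, 1) for _ in range(L)] for _ in range(HOSTS)]
parasites = [random.sample(range(L), 5) for _ in range(PARASITES)]   # each test = 5 positions

def host_score(h, tests):
    # A host "passes" a test if every queried position is 1.
    return sum(all(h[i] for i in t) for t in tests)

def parasite_score(t, sample_hosts):
    # A parasite scores by defeating hosts (finding a 0 at one of its positions).
    return sum(not all(h[i] for i in t) for h in sample_hosts)

def breed(pop, fit, mutate):
    # Tournament selection followed by mutation; produces the next population.
    new = []
    for _ in range(len(pop)):
        i, j = random.randrange(len(pop)), random.randrange(len(pop))
        parent = pop[i] if fit[i] >= fit[j] else pop[j]
        new.append(mutate(list(parent)))
    return new

def mut_host(h):
    k = random.randrange(L)
    h[k] ^= random.random() < 0.5        # sometimes flip one random bit
    return h

def mut_para(t):
    t[random.randrange(len(t))] = random.randrange(L)   # retarget one queried position
    return t

for gen in range(100):
    test_sample = random.sample(parasites, 10)                   # partial evaluation of hosts
    host_fit = [host_score(h, test_sample) for h in hosts]
    host_sample = random.sample(hosts, 10)                       # partial evaluation of parasites
    para_fit = [parasite_score(t, host_sample) for t in parasites]
    hosts = breed(hosts, host_fit, mut_host)
    parasites = breed(parasites, para_fit, mut_para)

best = max(hosts, key=sum)
print("ones in best host after coevolution:", sum(best), "of", L)
```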

  5. Genetic Algorithms: Basic Concept and Applications

    Directory of Open Access Journals (Sweden)

    Ms. Amninder Kaur

    2013-07-01

    Full Text Available Genetic algorithms are a part of evolutionary computing, which is a rapidly growing area of artificial intelligence. Genetic algorithms have been applied to a wide range of practical problems, often with valuable results. Genetic algorithms are often viewed as function optimizers, although the range of problems to which genetic algorithms have been applied is quite broad. This paper covers the basic concepts of genetic algorithms and their applications to a variety of fields. It also tries to give a solution to the problem of economic load dispatch using Genetic Algorithms. An attempt has been made to explain when and why GA should be used as an optimization tool. Finally, the paper points to future directions
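
    As a concrete instance of the kind of optimisation mentioned, the sketch below applies a plain real-coded GA with a penalty term to a small, made-up economic load dispatch problem; the three-unit cost coefficients, output limits, and demand are assumptions for illustration, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up three-generator economic load dispatch problem (illustrative data only):
# cost_i(P) = a + b*P + c*P^2, with output limits and a total demand to be met.
a = np.array([500.0, 400.0, 200.0])
b = np.array([5.3, 5.5, 5.8])
c = np.array([0.004, 0.006, 0.009])
p_min = np.array([100.0, 50.0, 50.0])
p_max = np.array([450.0, 350.0, 225.0])
demand = 800.0

def cost(P):
    fuel = np.sum(a + b * P + c * P**2)
    penalty = 1e4 * abs(np.sum(P) - demand)        # penalise violating the demand balance
    return fuel + penalty

pop = rng.uniform(p_min, p_max, size=(80, 3))
for _ in range(300):
    fit = np.array([cost(ind) for ind in pop])
    children = [pop[np.argmin(fit)].copy()]         # elitism: keep the best individual
    while len(children) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if fit[i] < fit[j] else pop[j]  # tournament selection
        k, l = rng.integers(0, len(pop), 2)
        p2 = pop[k] if fit[k] < fit[l] else pop[l]
        w = rng.uniform(0, 1, 3)
        child = w * p1 + (1 - w) * p2               # arithmetic crossover
        child += rng.normal(0, 5.0, 3)              # Gaussian mutation (MW)
        children.append(np.clip(child, p_min, p_max))
    pop = np.array(children)

best = min(pop, key=cost)
print("dispatch (MW):", np.round(best, 1), "total:", round(best.sum(), 1),
      "cost:", round(cost(best), 1))
```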

  6. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative...... geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships......) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...

  7. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  8. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.
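
    A bare-bones simulation in the spirit of the proposed model is sketched below; all settings (locations, allele count, covariance function) are illustrative assumptions, not the sage-grouse analysis. Allele-category probabilities at each sampling location come from a softmax over latent spatially correlated random effects, so nearby locations tend to share allele frequencies.

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative settings (not from the paper): 40 sampling locations on a unit square,
# one microsatellite locus with 4 allele categories, exponential spatial covariance.
n_loc, n_alleles = 40, 4
coords = rng.uniform(0, 1, size=(n_loc, 2))

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)      # pairwise distances
cov = np.exp(-d / 0.3)                                                    # spatial correlation

# One latent spatial random effect per allele category (a Gaussian-process draw).
latent = rng.multivariate_normal(np.zeros(n_loc), cov, size=n_alleles).T  # shape (n_loc, n_alleles)

# Multinomial data model: a softmax of the latent field gives allele probabilities per location.
probs = np.exp(latent) / np.exp(latent).sum(axis=1, keepdims=True)

# Sample 20 allele copies per location.
counts = np.array([rng.multinomial(20, p) for p in probs])
print("allele counts at the first three locations:\n", counts[:3])
# Fitting would reverse this: estimate the latent field (and its range) from such counts.
```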

  9. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    Full Text Available The paper is focused on numerical studies of the electromagnetic properties of composite materials used for the construction of small airplanes. Discussions concentrate on the genetic homogenization of composite layers and of composite layers with a slot. The homogenization is aimed at reducing the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on a sufficient accuracy of the model. Second, a proper implementation of the genetic optimization in Matlab is discussed. Third, an association of the optimization script with a simplified 2-dimensional homogeneous equivalent model in Comsol Multiphysics is proposed, considering EMC issues. Results of the computations are experimentally verified.

  10. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  11. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...

  12. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general...... topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most...... evolutionary algorithms, such as memetic algorithms, which have emerged as a very promising tool for solving many real-world problems in a multitude of areas of science and technology. Moreover, parallel evolutionary combinatorial optimization has been presented. Search operators, which are crucial in all...

  13. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  14. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, which emerged as a possible solution to the requirements of the internet of things and aims to lower latency and network bandwidth usage by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses the possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  15. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  16. SOLUTION OF NONLINEAR PROBLEMS IN WATER RESOURCES SYSTEMS BY GENETIC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Ahmet BAYLAR

    1998-03-01

    Full Text Available Genetic Algorithm methodology is a genetic process carried out on a computer that mimics the process of evolution in nature. The genetic operations take place within the chromosomes stored in computer memory. By means of various operators, the genetic knowledge in the chromosomes changes continuously, and the success of the population progressively increases as a result of these operations. The primary purpose of this study is the solution of nonlinear programming problems in water resources systems by Genetic Algorithm. For this purpose a Genetic Algorithm-based optimization program was developed. It can be concluded that the results obtained from the genetic-search-based method are precise.

  17. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  18. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  19. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  20. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  1. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  2. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry, which has been improving processor performance, communication bandwidth and storage capacity on the so-called "Moore's law" curve, or at the rate of doubling every 18 to 24 months, during the past decades.

  3. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives, granular computing as structured thinking and structured problem solving. From the philosophical perspective or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy in the application level deals with structured problem solving.

  4. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel is available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two...... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access...

  5. Genetic circuit design automation.

    Science.gov (United States)

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization.
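
    To make the gate-composition idea concrete, the sketch below wires a two-input circuit out of repressor-style NOT/NOR response functions. The Hill-function parameters and the circuit are invented for illustration; Cello itself works from Verilog and experimentally characterised gate libraries, which this toy does not reproduce.

```python
# Toy model of a repressor-based genetic circuit: each gate maps an input signal
# (arbitrary promoter-activity units) to an output via a repressing Hill function.
# Parameters below are invented for illustration; real gate libraries are characterised
# experimentally.

def not_gate(x, ymin=0.05, ymax=4.0, K=0.6, n=2.0):
    return ymin + (ymax - ymin) * K**n / (K**n + x**n)

def nor_gate(x1, x2, **params):
    return not_gate(x1 + x2, **params)       # NOR: repression driven by the summed input

LOW, HIGH = 0.02, 2.0                         # assumed signal levels for logic 0 / 1

# Circuit: out = NOT(NOR(a, b))  ->  behaves as OR on the digital abstraction.
for a_bit in (0, 1):
    for b_bit in (0, 1):
        a = HIGH if a_bit else LOW
        b = HIGH if b_bit else LOW
        internal = nor_gate(a, b)
        out = not_gate(internal)
        print(f"a={a_bit} b={b_bit} -> output signal {out:.2f} "
              f"({'ON' if out > 1.0 else 'OFF'})")
```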

  6. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  7. Computationally intelligent pulsed photoacoustics

    Science.gov (United States)

    Lukić, Mladena; Ćojbašić, Žarko; Rabasović, Mihailo D.; Markushev, Dragan D.

    2014-12-01

    In this paper, the application of computational intelligence in pulsed photoacoustics is discussed. Feedforward multilayer perceptron networks are applied for real-time simultaneous determination of the laser beam spatial profile and the vibrational-to-translational relaxation time of polyatomic molecules in gases. Networks are trained and tested with theoretical data adjusted for a given experimental set-up. Genetic optimization has been used to calculate the same parameters, fitting the photoacoustic signals over different numbers of generations. Observed benefits of applying computational intelligence in pulsed photoacoustics, and its advantages over previously developed methods, are discussed, such as real-time operation, high precision and the possibility of finding solutions over a wide range of parameters similar to those encountered in experimental conditions. In addition, its applicability to practical uses, such as real-time in situ measurement of atmospheric pollutants, along with possible further developments of the obtained results, is discussed.
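
    The genetic-optimization fitting described above can be sketched, in spirit, with an off-the-shelf evolutionary optimiser; the signal model, parameter values and noise level below are placeholders rather than the paper's photoacoustic theory.

    ```python
    # Hedged sketch of recovering two signal parameters (an effective pulse width
    # and a relaxation time) by evolutionary optimisation of a fit to a simulated
    # photoacoustic trace. The model is a placeholder, not the paper's theory.
    import numpy as np
    from scipy.optimize import differential_evolution

    t = np.linspace(0.0, 50e-6, 500)                        # time axis, seconds

    def model(params, t):
        pulse_width, tau = params                           # pulse width and relaxation time (s)
        return np.exp(-((t - 10e-6) / pulse_width) ** 2) * np.exp(-t / tau)

    rng = np.random.default_rng(0)
    observed = model((2.0e-6, 8.0e-6), t) + rng.normal(0.0, 0.01, t.size)

    def loss(params):
        return float(np.mean((model(params, t) - observed) ** 2))

    result = differential_evolution(loss, bounds=[(0.5e-6, 10e-6), (1e-6, 20e-6)], seed=0)
    print("estimated pulse width and relaxation time:", result.x)
    ```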

  8. Genetics and Rheumatic Disease

    Science.gov (United States)

    ... Well with Rheumatic Disease Genetics and Rheumatic Disease Genetics and Rheumatic Disease Fast Facts Studying twins has ... 70%, and for non-identical pairs, even lower. Genetics and ankylosing spondylitis Each rheumatic disease has its ...

  9. Applying the New Genetics

    Science.gov (United States)

    Sorenson, James

    1976-01-01

    New developments in the prediction and treatment of genetic diseases are presented. Genetic counseling and the role of the counselor, and rights of individuals to reproduce versus societal impact of genetic disorders, are discussed. (RW)

  10. Genetics and Rheumatic Disease

    Science.gov (United States)

    ... Well with Rheumatic Disease Genetics and Rheumatic Disease Genetics and Rheumatic Disease Fast Facts Studying twins has ... 70%, and for non-identical pairs, even lower. Genetics and ankylosing spondylitis Each rheumatic disease has its ...

  11. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on a comparison of the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding as well as scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is both a metasubject outcome of general education and one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  12. Machine learning in genetics and genomics

    Science.gov (United States)

    Libbrecht, Maxwell W.; Noble, William Stafford

    2016-01-01

    The field of machine learning promises to enable computers to assist humans in making sense of large, complex data sets. In this review, we outline some of the main applications of machine learning to genetic and genomic data. In the process, we identify some recurrent challenges associated with this type of analysis and provide general guidelines to assist in the practical application of machine learning to real genetic and genomic data. PMID:25948244

  13. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    Full Text Available In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
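
    A minimal toy version of this read-write model is easy to state in code: a list of nucleosome marks plays the role of the tape, and a rule table rewrites adjacent pairs until nothing changes. The marks and the single spreading-with-boundary rule below are invented for illustration and are not taken from the paper.

    ```python
    # Toy illustration of the "chromatin computer" idea: a 1-D string of
    # nucleosomes carries marks, and read-write rules fire on adjacent pairs.
    def apply_rules(tape, rules, max_sweeps=20):
        """Repeatedly rewrite adjacent nucleosome pairs until no rule applies."""
        for _ in range(max_sweeps):
            changed = False
            for i in range(len(tape) - 1):
                pair = (tape[i], tape[i + 1])
                if pair in rules:
                    tape[i], tape[i + 1] = rules[pair]
                    changed = True
            if not changed:
                break
        return tape

    # Rule set: an "me3" mark spreads onto an unmodified right neighbour ("un"),
    # but is blocked by an insulator-like mark ("ins"). Purely illustrative.
    rules = {("me3", "un"): ("me3", "me3")}

    tape = ["me3", "un", "un", "un", "ins", "un", "un"]
    print(apply_rules(tape, rules))   # ['me3', 'me3', 'me3', 'me3', 'ins', 'un', 'un']
    ```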

  14. Genetics Home Reference: vitiligo

    Science.gov (United States)

    ... physical functioning. However, concerns about appearance and ethnic identity are significant issues for many affected ... What information about a genetic condition can statistics provide? Why are some genetic ...

  15. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents the numerical methods of solving sets of several mathematical equations. This volume includes computation sets of linear algebraic equations, high degree equations and transcendental equations, numerical methods of finding eigenvalues, and approximate methods of solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a text-book for students in mechanical mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the

  16. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  17. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  18. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  19. Genetic aspects and genetic epidemiology of parasomnias.

    Science.gov (United States)

    Hublin, Christer; Kaprio, Jaakko

    2003-10-01

    Parasomnias are undesirable phenomena associated with sleep. Many of them run in families, and genetic factors have long been suggested to be involved in their occurrence. This article reviews the present knowledge of the genetics of the major classical behavioral parasomnias as well as present results from genetic epidemiological studies. The level and type of evidence for genetic effects varies greatly from parasomnia to parasomnia. The genetic factors are best established in enuresis, with several linkages to chromosomal loci, but their functions are not so far known. Environmental causes and gene-environment interactions are most probably also of great importance in the origin of complex traits or disorders such as parasomnias.

  20. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  1. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes per month browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 to the present, indicate the emergence of dynamic changes in the ways computers are used and in their circumstances. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred: a significant drop in the number of students who never used computers and the Web; a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students); a decreasing gap in computer skills between students of the first and the third year and between male and female students; and a declining popularity of computer games. It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system: as much as 12% of this group of young people were addicted to computers, and the large amount of leisure time they enjoyed induced excessive use of the Web. Polish housewives are another population group at risk of Web addiction. The duration of long Web chats carried out by ever younger users has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed; in general, the therapy applied to computer addiction syndrome is similar to the techniques used for alcohol or gambling addiction. Individual and group

  2. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us...outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application...which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  3. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  4. Large Scale Explorative Oligonucleotide Probe Selection for Thousands of Genetic Groups on a Computing Grid: Application to Phylogenetic Probe Design Using a Curated Small Subunit Ribosomal RNA Gene Database

    Directory of Open Access Journals (Sweden)

    Faouzi Jaziri

    2014-01-01

    Full Text Available Phylogenetic Oligonucleotide Arrays (POAs) were recently adapted for studying the huge microbial communities in a flexible and easy-to-use way. POA coupled with the use of explorative probes to detect the unknown part is now one of the most powerful approaches for a better understanding of microbial community functioning. However, the selection of probes remains a very difficult task. The rapid growth of environmental databases has led to an exponential increase of data to be managed for an efficient design. Consequently, the use of high performance computing facilities is mandatory. In this paper, we present an efficient parallelization method to select known and explorative oligonucleotide probes at large scale using computing grids. We implemented software that generates and monitors thousands of jobs over the European Computing Grid Infrastructure (EGI). We also developed a new algorithm for the construction of a high-quality curated phylogenetic database to avoid erroneous design due to bad sequence affiliation. We present here the performance and statistics of our method on real biological datasets based on a phylogenetic prokaryotic database at the genus level and a complete design of about 20,000 probes for 2,069 genera of prokaryotes.

  5. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  6. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  7. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  8. The genetics of immunity.

    Science.gov (United States)

    Lazzaro, Brian P; Schneider, David S

    2014-06-17

    In this commentary, Brian P. Lazzaro and David S. Schneider examine the topic of the Genetics of Immunity as explored in this month's issues of GENETICS and G3: Genes|Genomes|Genetics. These inaugural articles are part of a joint Genetics of Immunity collection (ongoing) in the GSA journals. Copyright © 2014 Lazzaro and Schneider.

  9. Genetic Algorithm Based Proportional Integral Controller Design for Induction Motor

    Directory of Open Access Journals (Sweden)

    Mohanasundaram Kuppusamy

    2011-01-01

    Full Text Available Problem statement: This study expounds the application of an evolutionary computation method, namely the Genetic Algorithm (GA), to the estimation of feedback controller parameters for an induction motor. GA offers certain advantages such as simple computational steps, derivative-free optimization, a reduced number of iterations and assured near-global optima. The development of the method is well documented, and computed and measured results are presented. Approach: The PI controller parameters for three-phase induction motor drives were designed using a genetic algorithm. The objective of reducing the motor current at starting, using a PI controller, is formulated as an optimization problem and solved with the genetic algorithm. Results: The results show the PI controller parameter values selected by the genetic algorithm approach, with the objective of reducing the induction motor starting current. Conclusions/Recommendation: The results prove the robustness and easy implementation of genetic algorithm selection of PI parameters for induction motor starting.
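
    The overall scheme can be sketched as follows, assuming a generic first-order plant and a fitness that penalises overshoot as a stand-in for the starting-current objective; the paper's induction-motor model and GA settings are more detailed.

    ```python
    # Rough sketch of GA-based PI tuning. The plant and the fitness function are
    # illustrative stand-ins, not the paper's induction-motor formulation.
    import random

    def simulate_pi(kp, ki, dt=0.01, steps=500):
        """Closed-loop step response of a first-order plant under PI control."""
        y, integral, peak = 0.0, 0.0, 0.0
        for _ in range(steps):
            error = 1.0 - y
            integral += error * dt
            u = kp * error + ki * integral
            y += dt * (-y + u)                # plant: dy/dt = -y + u
            peak = max(peak, y)
        return abs(1.0 - y) + 2.0 * max(0.0, peak - 1.05)   # tracking error + overshoot penalty

    def genetic_search(pop_size=30, generations=40):
        pop = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda g: simulate_pi(*g))
            parents = pop[: pop_size // 2]                   # selection: keep the fitter half
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                kp = 0.5 * (a[0] + b[0]) + random.gauss(0, 0.2)   # crossover + mutation
                ki = 0.5 * (a[1] + b[1]) + random.gauss(0, 0.2)
                children.append((max(kp, 0.0), max(ki, 0.0)))
            pop = parents + children
        return min(pop, key=lambda g: simulate_pi(*g))

    print("tuned (Kp, Ki):", genetic_search())
    ```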

  10. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step...... with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  11. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  12. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for 'Springback Predictability' and with the Federal Aviation Administration (FAA) for the 'Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  13. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  14. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  15. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  16. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  17. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  18. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  19. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education;...

  20. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.
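
    One classic AIS mechanism of the kind surveyed here is negative selection, sketched below for binary strings; the representation, matching rule and threshold are illustrative choices rather than any particular published system.

    ```python
    # Minimal sketch of the negative-selection idea used in artificial immune
    # systems: detectors are generated at random, any detector matching "self"
    # strings is discarded, and the survivors flag anomalous ("non-self") inputs.
    import random

    def matches(detector, sample, threshold=6):
        """r-contiguous-bits style match: enough identical bits in a row."""
        run = best = 0
        for d, s in zip(detector, sample):
            run = run + 1 if d == s else 0
            best = max(best, run)
        return best >= threshold

    def train_detectors(self_set, n_detectors=50, length=12):
        detectors = []
        while len(detectors) < n_detectors:
            candidate = [random.randint(0, 1) for _ in range(length)]
            if not any(matches(candidate, s) for s in self_set):   # negative selection
                detectors.append(candidate)
        return detectors

    self_set = [[0] * 12, [0] * 6 + [1] * 6]          # "normal" patterns
    detectors = train_detectors(self_set)
    anomaly = [1, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
    print("flagged as non-self:", any(matches(d, anomaly) for d in detectors))
    ```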

  1. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  2. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  3. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  4. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...... set of skills rather than one single skill. Skills acquisition at these layers can be tailored to the specific needs of students. The work presented here builds upon experience from courses for such students from the Humanities in which programming is taught as a tool for other purposes. Results...

  5. Theory and practice in quantitative genetics.

    Science.gov (United States)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C; van Baal, G Caroline M; von Hjelmborg, Jacob B; Iachine, Ivan; Boomsma, Dorret I

    2003-10-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each, we show how the theoretical biometrical model can be translated into algebraic equations that may be used to generate scripts for statistical genetic software packages, such as Mx, Lisrel, SOLAR, or MERLIN. For the former program, a web library of freely available scripts (available from http://www.psy.vu.nl/mxbib) has been developed that can be used to conduct all genetic analyses described in this paper.
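
    As a worked example of translating the biometrical model into simple algebra, the snippet below applies Falconer's formulas for the classical twin design, a back-of-the-envelope stand-in for the full structural-equation models fitted in Mx or SOLAR; the twin correlations used are made-up numbers.

    ```python
    # Back-of-the-envelope ACE decomposition from twin correlations (Falconer's
    # formulas), as a simplified stand-in for the structural-equation models above.
    def falconer_ace(r_mz, r_dz):
        """Split trait variance into additive genetic (A), shared-environment (C)
        and unique-environment (E) components from MZ and DZ twin correlations."""
        a2 = 2.0 * (r_mz - r_dz)       # heritability estimate
        c2 = 2.0 * r_dz - r_mz         # shared-environment estimate
        e2 = 1.0 - r_mz                # unique environment (plus measurement error)
        return a2, c2, e2

    a2, c2, e2 = falconer_ace(r_mz=0.78, r_dz=0.45)   # illustrative correlations
    print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")   # A = 0.66, C = 0.12, E = 0.22
    ```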

  6. Computer science and operations research

    CERN Document Server

    Balci, Osman

    1992-01-01

    The interface of Operations Research and Computer Science - although elusive to a precise definition - has been a fertile area of both methodological and applied research. The papers in this book, written by experts in their respective fields, convey the current state-of-the-art in this interface across a broad spectrum of research domains which include optimization techniques, linear programming, interior point algorithms, networks, computer graphics in operations research, parallel algorithms and implementations, planning and scheduling, genetic algorithms, heuristic search techniques and dat

  7. Genetic engineering, medicine and medical genetics.

    Science.gov (United States)

    Motulsky, A G

    1984-01-01

    The impact of DNA technology in the near future will be on the manufacture of biologic agents and reagents that will lead to improved therapy and diagnosis. The use of DNA technology for prenatal and preclinical diagnosis in genetic diseases is likely to affect management of genetic diseases considerably. New and old questions regarding selective abortion and the psychosocial impact of early diagnosis of late appearing diseases and of genetic susceptibilities are being raised. Somatic therapy with isolated genes to treat disease has not been achieved. True germinal genetic engineering is far off for humans but may find applications in animal agriculture.

  8. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). · Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics. · Emphasis on algorithmic advances that will allow re-application in other...
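
    A small, self-contained example of the kind of Riemannian computation the book covers is the affine-invariant geodesic distance between two symmetric positive-definite matrices (as used, for instance, with diffusion tensors); the matrices below are arbitrary.

    ```python
    # Affine-invariant geodesic distance between SPD matrices:
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    import numpy as np
    from scipy.linalg import sqrtm, logm, inv

    def spd_distance(a, b):
        a_inv_sqrt = inv(sqrtm(a))
        middle = a_inv_sqrt @ b @ a_inv_sqrt
        return float(np.linalg.norm(np.real(logm(middle)), "fro"))

    A = np.array([[2.0, 0.3], [0.3, 1.0]])      # arbitrary SPD examples
    B = np.array([[1.5, -0.2], [-0.2, 2.5]])
    print("geodesic distance:", round(spd_distance(A, B), 4))
    ```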

  9. Population genetics without intraspecific data

    DEFF Research Database (Denmark)

    Thorne, Jeffrey L; Choi, Sang Chul; Yu, Jiaye

    2007-01-01

    A central goal of computational biology is the prediction of phenotype from DNA and protein sequence data. Recent models of sequence change use in silico prediction systems to incorporate the effects of phenotype on evolutionary rates. These models have been designed for analyzing sequence data...... populations, and parameters of interspecific models should have population genetic interpretations. We show, with two examples, how population genetic interpretations can be assigned to evolutionary models. The first example considers the impact of RNA secondary structure on sequence change, and the second...... reflects the tendency for protein tertiary structure to influence nonsynonymous substitution rates. We argue that statistical fit to data should not be the sole criterion for assessing models of sequence change. A good interspecific model should also yield a clear and biologically plausible population...

  10. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years and in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
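
    The basic ABC rejection scheme described above fits in a few lines; the sketch below infers a binomial success probability from a simulated data set, with an arbitrary prior, summary statistic and tolerance.

    ```python
    # Minimal ABC rejection sampler for a toy problem: infer the success
    # probability of a binomial model without writing down its likelihood.
    import random

    def simulate(theta, n_trials=100):
        return sum(random.random() < theta for _ in range(n_trials))

    observed = 37                                     # observed number of successes

    def abc_rejection(n_samples=100_000, tolerance=2):
        accepted = []
        for _ in range(n_samples):
            theta = random.random()                   # draw from a Uniform(0, 1) prior
            if abs(simulate(theta) - observed) <= tolerance:
                accepted.append(theta)                # keep parameters that reproduce the data
        return accepted

    posterior = abc_rejection()
    print("posterior mean of theta:", round(sum(posterior) / len(posterior), 3))
    ```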

  11. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  12. Computational Physics.

    Science.gov (United States)

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  13. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  14. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  15. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  16. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  17. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  18. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.

  19. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  20. Parallel Genetic Algorithm Based on the MPI Environment

    OpenAIRE

    2012-01-01

    Current genetic algorithms require both the management of huge amounts of data and heavy computation; fulfilling these requirements calls for simple ways to implement parallel computing. In this paper, a serial genetic algorithm was redesigned as a parallel GA; this technology appears to be particularly well adapted to the task. Here we introduce two related mechanisms: an elite reserve strategy and MPI. The first increases the probability that the population reaches the optimal solution, while the message pa...
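
    A minimal island-model sketch with mpi4py is shown below; the one-max fitness, the ring migration and the "keep the best half" elite step are illustrative stand-ins for the elite reserve strategy and problem treated in the paper.

    ```python
    # Sketch of a coarse-grained (island-model) GA with mpi4py: each MPI rank
    # evolves its own subpopulation and periodically migrates its best individual
    # to the next rank in a ring. Run with e.g.: mpiexec -n 4 python island_ga.py
    import random
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    GENOME_LEN, POP, GENERATIONS, MIGRATE_EVERY = 40, 30, 100, 10
    fitness = sum                                            # one-max: count of 1 bits

    random.seed(rank)
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]

    for gen in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: POP // 2]                              # elite reserve: keep the best half
        children = []
        while len(elite) + len(children) < POP:
            a, b = random.sample(elite, 2)
            cut = random.randrange(GENOME_LEN)
            child = a[:cut] + b[cut:]                        # one-point crossover
            if random.random() < 0.1:                        # mutation
                child[random.randrange(GENOME_LEN)] ^= 1
            children.append(child)
        pop = elite + children
        if size > 1 and gen % MIGRATE_EVERY == 0:            # ring migration of the best individual
            pop[-1] = comm.sendrecv(pop[0], dest=(rank + 1) % size, source=(rank - 1) % size)

    best = comm.gather(max(pop, key=fitness), root=0)
    if rank == 0:
        print("best fitness across islands:", max(map(fitness, best)))
    ```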

  1. Quantum-Inspired Genetic Algorithm or Quantum Genetic Algorithm: Which Is It?

    Science.gov (United States)

    Jones, Erika

    2015-04-01

    Our everyday work focuses on genetic algorithms (GAs) related to quantum computing where we call "related" algorithms those falling into one of two classes: (1) GAs run on classical computers but making use of quantum mechanical (QM) constructs and (2) GAs run on quantum hardware. Though convention has yet to be set with respect to usage of the accepted terms quantum-inspired genetic algorithm (QIGA) and quantum genetic algorithm (QGA), we find the two terms highly suitable respectively as labels for the aforementioned classes. With these specific definitions in mind, the difference between the QIGA and QGA is greater than might first be appreciated, particularly by those coming from a perspective emphasizing GA use as a general computational tool irrespective of QM aspects (1) suggested by QIGAs and (2) inherent in QGAs. We offer a theoretical standpoint highlighting key differences, both obvious and, more significantly, subtle, to be considered in the general design of a QIGA versus that of a QGA.

  2. Application of Genetic Algorithm in the Layout of Fixture Components

    Institute of Scientific and Technical Information of China (English)

    焦黎; 孙厚芳

    2003-01-01

    Automation in the layout of fixture components is important to achieve efficiency and flexibility in computer aided fixture design. Based on a basic genetic algorithm and the particulars of different fixture components, a method of layout space division is presented. Techniques such as suitable crossover rates, mutation rates and selection operators are adopted in the genetic operations. The results show that the genetic algorithm can be applied effectively to the automatic layout of fixture components.

  3. Genetic Algorithm for Chinese Postman Problems

    Institute of Scientific and Technical Information of China (English)

    Jiang Hua; Kang Li-shan

    2003-01-01

    The Chinese Postman Problem is an unsettled graph problem that has seldom been approached with evolutionary computation. Here we use a genetic algorithm to solve the Chinese Postman Problem on undirected graphs and obtain good results. The approach could be extended to the Chinese Postman Problem on directed graphs. We make these efforts as a step towards optimizing the mixed Chinese Postman Problem.

  4. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built...

  5. Optimization of genomic selection training populations with a genetic algorithm

    Science.gov (United States)

    In this article, we derive a computationally efficient statistic to measure the reliability of estimates of genetic breeding values for a fixed set of genotypes based on a given training set of genotypes and phenotypes. We adopt a genetic algorithm scheme to find a training set of certain size from ...

  6. Semiclassical genetic algorithm with quantum crossover and mutation operations

    CERN Document Server

    SaiToh, Akira; Nakahara, Mikio

    2012-01-01

    In order to find a good individual for a given fitness function in the context of evolutionary computing, we introduce a novel semiclassical quantum genetic algorithm. Unlike conventional quantum genetic algorithms, it has both quantum crossover and quantum mutation procedures. A complexity analysis shows a certain improvement over its classical counterpart.

  7. High School Students' Use of Meiosis When Solving Genetics Problems.

    Science.gov (United States)

    Wynne, Cynthia F.; Stewart, Jim; Passmore, Cindy

    2001-01-01

    Paints a different picture of students' reasoning with meiosis as they solved complex, computer-generated genetics problems, some of which required them to revise their understanding of meiosis in response to anomalous data. Students were able to develop a rich understanding of meiosis and can utilize that knowledge to solve genetics problems.…

  8. High School Students' Use of Meiosis When Solving Genetics Problems.

    Science.gov (United States)

    Wynne, Cynthia F.; Stewart, Jim; Passmore, Cindy

    2001-01-01

    Paints a different picture of students' reasoning with meiosis as they solved complex, computer-generated genetics problems, some of which required them to revise their understanding of meiosis in response to anomalous data. Students were able to develop a rich understanding of meiosis and can utilize that knowledge to solve genetics problems.…

  9. Programming cells: towards an automated 'Genetic Compiler'.

    Science.gov (United States)

    Clancy, Kevin; Voigt, Christopher A

    2010-08-01

    One of the visions of synthetic biology is to be able to program cells using a language that is similar to that used to program computers or robotics. For large genetic programs, keeping track of the DNA on the level of nucleotides becomes tedious and error prone, requiring a new generation of computer-aided design (CAD) software. To push the size of projects, it is important to abstract the designer from the process of part selection and optimization. The vision is to specify genetic programs in a higher-level language, which a genetic compiler could automatically convert into a DNA sequence. Steps towards this goal include: defining the semantics of the higher-level language, algorithms to select and assemble parts, and biophysical methods to link DNA sequence to function. These will be coupled to graphic design interfaces and simulation packages to aid in the prediction of program dynamics, optimize genes, and scan projects for errors.

  10. Basic genetics for dermatologists

    Directory of Open Access Journals (Sweden)

    Muthu Sendhil Kumaran

    2013-01-01

    Full Text Available During the past few decades, advances in the field of molecular genetics have enriched our understanding of the pathogenesis of diseases, their identification, and appropriate therapeutic interventions. In the last 20 years, the genetic basis of more than 350 monogenic skin diseases has been elucidated, and the number is still growing. The widespread use of molecular genetics as a diagnostic tool is not yet routine practice, owing to genetic heterogeneity, limited access and low sensitivity. In this review, we present the very basics of genetics so as to enable dermatologists to gain a working understanding of medical genetics.

  11. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  12. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    Science.gov (United States)

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh counterpart, MAPMAKER 3.0 for Macintosh, can. In recent years the Macintosh has become much less common than the PC, and most geneticists use PCs to analyze their genetic linkage data, so software that draws on a PC the same genetic linkage maps that MAPMAKER for Macintosh draws on a Macintosh has been badly needed. Microsoft Excel, one component of the Microsoft Office package, is among the most popular software for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features. Using this programming language, we can take creative control of Excel, including genetic linkage map construction and automatic data processing. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on PC computers from given genetic linkage data. Using this software, you can construct genetic linkage maps in Excel and freely edit and copy them to Word or other applications. The software is simply an Excel-format file. It can be copied freely from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.
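
    For readers without Excel, the same kind of drawing can be sketched with matplotlib; the marker names and centimorgan positions below are made up, and this is not the macro's VBA code.

    ```python
    # A matplotlib sketch of the kind of figure MapDraw automates in Excel:
    # one linkage group drawn as a vertical bar with marker names at their
    # map positions. Marker names and centimorgan positions are invented.
    import matplotlib.pyplot as plt

    markers = [("M1", 0.0), ("M2", 12.4), ("M3", 33.1), ("M4", 47.8), ("M5", 61.0)]

    fig, ax = plt.subplots(figsize=(2.5, 6))
    length = max(pos for _, pos in markers)
    ax.plot([0, 0], [0, length], color="black", linewidth=4)       # the linkage group
    for name, pos in markers:
        ax.plot([-0.05, 0.05], [pos, pos], color="black")          # tick at the locus
        ax.text(0.1, pos, name, va="center")                       # marker name
        ax.text(-0.1, pos, f"{pos:.1f}", va="center", ha="right")  # position in cM
    ax.invert_yaxis()                                              # 0 cM at the top
    ax.axis("off")
    plt.savefig("linkage_group.png", dpi=150)
    ```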

  13. Coarse-Grained Parallel Genetic Algorithm to solve the Shortest Path Routing problem using Genetic operators

    Directory of Open Access Journals (Sweden)

    V.PURUSHOTHAM REDDY

    2011-02-01

    Full Text Available Routing in computer networks is typically based on shortest-path algorithms. An alternative method with its own advantages is a genetic-algorithm-based routing algorithm, which is highly scalable and insensitive to variations in network topology. Here we propose a coarse-grained parallel genetic algorithm to solve the shortest path routing problem, with the primary goal of reducing computation time, together with the use of a migration scheme. The algorithm is developed and implemented on an MPI cluster, and the effects of migration and its performance are studied in this paper.
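
    A serial, single-process sketch of the island (coarse-grained) model with periodic migration, applied to shortest-path search on a tiny hypothetical weighted graph. The paper's implementation distributes the islands over an MPI cluster; here the islands are just lists evolved in a loop.

```python
# Minimal island-model GA for shortest path with periodic migration of the
# best individual between islands. Graph, weights and GA parameters are
# illustrative only.
import random

GRAPH = {  # adjacency: node -> {neighbour: weight}
    "A": {"B": 2, "C": 5}, "B": {"A": 2, "C": 1, "D": 4},
    "C": {"A": 5, "B": 1, "D": 1}, "D": {"B": 4, "C": 1},
}
SRC, DST = "A", "D"

def random_path():
    """Random walk without revisiting nodes; retry until DST is reached."""
    while True:
        path, node = [SRC], SRC
        while node != DST:
            choices = [n for n in GRAPH[node] if n not in path]
            if not choices:
                break
            node = random.choice(choices)
            path.append(node)
        if node == DST:
            return path

def cost(path):
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def crossover(p1, p2):
    common = [n for n in p1[1:-1] if n in p2[1:-1]]
    if not common:
        return list(p1)
    cut = random.choice(common)
    child = p1[:p1.index(cut)] + p2[p2.index(cut):]
    return child if len(set(child)) == len(child) else list(p1)  # keep loop-free

def evolve(island):
    island.sort(key=cost)
    parents = island[: len(island) // 2]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(len(island) - len(parents))]
    children = [random_path() if random.random() < 0.1 else c
                for c in children]                       # mutation: re-route
    return parents + children

islands = [[random_path() for _ in range(20)] for _ in range(4)]
for gen in range(30):
    islands = [evolve(isl) for isl in islands]
    if gen % 10 == 9:                                    # migration step
        best = min((min(isl, key=cost) for isl in islands), key=cost)
        for isl in islands:
            isl[-1] = list(best)
print(min((min(isl, key=cost) for isl in islands), key=cost))
```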

  14. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel, and homogeneous charge compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large lifted 3D turbulent hydrogen jet flame computed with direct numerical simulation and of 3D large eddy simulations of practical gas burner combustion devices.

  15. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understan...

  16. Computational Physics

    Science.gov (United States)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  17. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

    A collaboration between Caltech's postdoctoral associate N. Albin and OB has shown that, for a variety of reasons, the first-order ... KZK approximation. Related references: "... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics.

  18. Computer files.

    Science.gov (United States)

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers, regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume; otherwise, you may have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware that the directories suggested as defaults when installing new software are often not optimal. For instance, it is better to put different graphics packages under a common subdirectory than to install them at the same level as all other packages, including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is bad practice to keep many different and logically unsorted files in the root directory of any of your volumes; only system and important service files should be kept there. Although any file may be written all over the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way. (ABSTRACT TRUNCATED AT 250 WORDS)

  19. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1,150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it was used round the clock except for the time we were in class. So even at midnight, when I woke up from a dream, I could still see

  20. Genetics Home Reference

    Science.gov (United States)

    Past Issues / Spring 2007: The Genetics Home Reference (GHR) Web site — ghr.nlm.nih. ...

  1. Genetics of Hearing Loss

    Science.gov (United States)

    ... of hearing loss in babies is due to genetic causes. There are also a number of things ...

  2. Frontotemporal Dementia: Genetics

    Science.gov (United States)

    Genetics of FTD: After receiving a diagnosis of FTD ... that recent advances in science have brought the genetics of FTD into much better focus. In 2012, ...

  3. Genetic Disease Foundation

    Science.gov (United States)

    ... mission to help prevent, manage and treat inherited genetic diseases. ... contributions to the diagnosis, prevention and treatment of genetic diseases. Learn how advances at Mount Sinai have impacted ...

  4. Genetic Brain Disorders

    Science.gov (United States)

    A genetic brain disorder is caused by a variation or a mutation in a gene. A variation is a different form ... mutation is a change in a gene. Genetic brain disorders affect the development and function of the ...

  5. Genetics Home Reference

    Science.gov (United States)

    Browse topics include Chromosomes & mtDNA (autosomes, sex chromosomes, and mitochondrial DNA) and Help Me Understand Genetics, which covers the basics of human genetics. ...

  6. Genetically engineered foods

    Science.gov (United States)

    Bioengineered foods; GMOs; Genetically modified foods ... helps speed up the process of creating new foods with desired traits. The possible benefits of genetic engineering include: More nutritious food Tastier food Disease- and ...

  7. Genetics of Parkinson's disease

    National Research Council Canada - National Science Library

    Klein, Christine; Westenberger, Ana

    2012-01-01

    Fifteen years of genetic research in Parkinson's disease (PD) have led to the identification of several monogenic forms of the disorder and of numerous genetic risk factors increasing the risk to develop PD...

  8. Prenatal screening and genetics

    DEFF Research Database (Denmark)

    Alderson, P; Aro, A R; Dragonas, T

    2001-01-01

    Although the term 'genetic screening' has been used for decades, this paper discusses how, in its most precise meaning, genetic screening has not yet been widely introduced. 'Prenatal screening' is often confused with 'genetic screening'. As we show, these terms have different meanings, and we...... examine definitions of the relevant concepts in order to illustrate this point. The concepts are i) prenatal, ii) genetic screening, iii) screening, scanning and testing, iv) maternal and foetal tests, v) test techniques and vi) genetic conditions. So far, prenatal screening has little connection...... with precisely defined genetics. There are benefits but also disadvantages in overstating current links between them in the term genetic screening. Policy making and professional and public understandings about screening could be clarified if the distinct meanings of prenatal screening and genetic screening were...

  9. Genetics Home Reference: hyperprolinemia

    Science.gov (United States)

    ... can also occur with other conditions, such as malnutrition or liver disease. In particular, individuals with conditions ... Related health topics: Amino Acid Metabolism Disorders, Genetic Brain Disorders, Newborn Screening. ...

  10. Genetics Home Reference: hypermethioninemia

    Science.gov (United States)

    ... C. Mutations in human glycine N-methyltransferase give insights into its role in methionine metabolism. Hum Genet. ...

  11. Genetic spectrum assignment model with constraints in cognitive radio networks

    Directory of Open Access Journals (Sweden)

    Fang Ye

    2011-06-01

    Full Text Available The interference constraints of the genetic spectrum assignment model in cognitive radio networks are analyzed in this paper, and an improved genetic spectrum assignment model is proposed. The population of the genetic algorithm is divided into two sets: feasible spectrum assignment strategies and randomly updated spectrum assignment strategies. A penalty function is added to the utility function to obtain spectrum assignment strategies that satisfy the interference constraints and have better fitness. The proposed method is applicable to both the genetic spectrum assignment model and the quantum genetic spectrum assignment model. It can ensure the randomness of some chromosomes in the population to a certain extent and reduce the computational complexity caused by the constraints-free procedure after the population update. Simulation results show that the proposed method achieves better performance than the conventional genetic spectrum assignment model and the quantum genetic spectrum assignment model.
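
    A minimal sketch of the penalty-function idea: fitness is the assignment utility minus a penalty proportional to the number of violated pairwise interference constraints. The matrix sizes, interference pattern and penalty weight are illustrative, not taken from the paper.

```python
# Penalty-augmented fitness for a spectrum-assignment chromosome.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_channels = 5, 4
# interference[i, j] = 1 means users i and j may not share any channel
interference = np.triu(rng.integers(0, 2, (n_users, n_users)), k=1)
availability = rng.integers(0, 2, (n_users, n_channels))   # who may use what

def fitness(assignment, penalty_weight=10.0):
    """assignment: 0/1 matrix of shape (n_users, n_channels)."""
    utility = np.sum(assignment * availability)             # simple reward
    shared = assignment @ assignment.T                       # co-channel counts
    violations = np.sum(interference * (shared > 0))
    return utility - penalty_weight * violations

chromosome = rng.integers(0, 2, (n_users, n_channels))
print(fitness(chromosome))
```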

  12. The degeneracy of the genetic code and Hadamard matrices

    CERN Document Server

    Petoukhov, Sergey V

    2008-01-01

    The matrix form of presentation of the genetic code is described as a cognitive form for analyzing structures of the genetic code; a similar matrix form is utilized in the theory of signal processing. The Kronecker family of genetic matrices is investigated, which is based on the genetic matrix [C A; U G], where C, A, U, G are the letters of the genetic alphabet. The third Kronecker power of this matrix is the (8*8)-matrix containing all 64 triplets. Peculiarities of the degeneracy of the vertebrate mitochondrial genetic code are reflected in the symmetrical black-and-white mosaic of this genetic (8*8)-matrix. Unexpectedly, this mosaic matrix is connected algorithmically with Hadamard matrices, which are famous in the theory of signal processing, quantum mechanics and quantum computing.
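
    The (8*8) genetic matrix mentioned in the record can be reproduced directly as the third Kronecker power of the alphabet matrix [C A; U G], with multiplication read as string concatenation; a small numpy sketch:

```python
# Build the 8x8 codon matrix as the third Kronecker power of [C A; U G].
import numpy as np

base = np.array([["C", "A"], ["U", "G"]], dtype=object)

def kron_strings(a, b):
    """Kronecker product where 'multiplication' is string concatenation."""
    m, n = a.shape
    p, q = b.shape
    out = np.empty((m * p, n * q), dtype=object)
    for i in range(m):
        for j in range(n):
            for k in range(p):
                for l in range(q):
                    out[i * p + k, j * q + l] = a[i, j] + b[k, l]
    return out

codon_matrix = kron_strings(kron_strings(base, base), base)
assert codon_matrix.shape == (8, 8) and len(set(codon_matrix.ravel())) == 64
print(codon_matrix)
```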

  13. Computer Spectrometers

    Science.gov (United States)

    Dattani, Nikesh S.

    2017-06-01

    Ideally, the cataloguing of spectroscopic linelists would not demand laborious and expensive experiments. Whatever an experiment might achieve, the same information would be attainable by running a calculation on a computer. Kolos and Wolniewicz were the first to demonstrate that calculations on a computer can outperform even the most sophisticated molecular spectroscopic experiments of the time, when their 1964 calculations of the dissociation energies of H_2 and D_{2} were found to be more than 1 cm^{-1} larger than the best experiments by Gerhard Herzberg, suggesting the experiment violated a strict variational principle. As explained in his Nobel Lecture, it took 5 more years for Herzberg to perform an experiment which caught up to the accuracy of the 1964 calculations. Today, numerical solutions to the Schrödinger equation, supplemented with relativistic and higher-order quantum electrodynamics (QED) corrections can provide ro-vibrational spectra for molecules that we strongly believe to be correct, even in the absence of experimental data. Why do we believe these calculated spectra are correct if we do not have experiments against which to test them? All evidence seen so far suggests that corrections due to gravity or other forces are not needed for a computer simulated QED spectrum of ro-vibrational energy transitions to be correct at the precision of typical spectrometers. Therefore a computer-generated spectrum can be considered to be as good as one coming from a more conventional spectrometer, and this has been shown to be true not just for the H_2 energies back in 1964, but now also for several other molecules. So are we at the stage where we can launch an array of calculations, each with just the atomic number changed in the input file, to reproduce the NIST energy level databases? Not quite. But I will show that for the 6e^- molecule Li_2, we have reproduced the vibrational spacings to within 0.001 cm^{-1} of the experimental spectrum, and I will

  14. Genetics in psychiatry.

    Science.gov (United States)

    Umesh, Shreekantiah; Nizamie, Shamshul Haque

    2014-04-01

    Today, psychiatrists are focusing on the genetic aspects of various psychiatric disorders, not only with a view to a future classification of psychiatric disorders but also in the expectation that genetics will aid the development of new medications to treat these disabling illnesses. This review therefore covers the basics of genetics in psychiatry and focuses on the emerging picture of psychiatric genetics and its future implications.

  15. Behavioral genetics and taste

    Directory of Open Access Journals (Sweden)

    Bachmanov Alexander A

    2007-09-01

    Full Text Available Abstract This review focuses on behavioral genetic studies of sweet, umami, bitter and salt taste responses in mammals. Studies involving mouse inbred strain comparisons and genetic analyses, and their impact on elucidation of taste receptors and transduction mechanisms are discussed. Finally, the effect of genetic variation in taste responsiveness on complex traits such as drug intake is considered. Recent advances in development of genomic resources make behavioral genetics a powerful approach for understanding mechanisms of taste.

  16. Restart-Based Genetic Algorithm for the Quadratic Assignment Problem

    Science.gov (United States)

    Misevicius, Alfonsas

    The power of genetic algorithms (GAs) has been demonstrated for various domains of computer science, including combinatorial optimization. In this paper, we propose a new conceptual modification of the genetic algorithm entitled the "restart-based genetic algorithm" (RGA). An effective implementation of RGA for a well-known combinatorial optimization problem, the quadratic assignment problem (QAP), is discussed. The results obtained from the computational experiments on the QAP instances from the publicly available library QAPLIB show excellent performance of RGA. This is especially true for the real-life-like QAPs.

  17. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  18. Report: Human cancer genetics

    Institute of Scientific and Technical Information of China (English)

    LI Marilyn; ALBERTSON Donna

    2006-01-01

    The short report will be focused on the genetic basis and possible mechanisms of tumorigenesis, common types of cancer, the importance of genetic diagnosis of cancer, and the methodology of cancer genetic diagnosis. They will also review presymptomatic testing of hereditary cancers, and the application of expression profiling to identify patients likely to benefit from particular therapeutic approaches.

  19. Prenatal screening and genetics

    NARCIS (Netherlands)

    Alderson, P.; Aro, A.R.; Dragonas, T.; Ettorre, E.; Hemminki, E.; Jalinoja, P.; Santalahti, P.; Tijmstra, T.

    Although the term 'genetic screening' has been used for decades, this paper discusses how, in its most precise meaning, genetic screening has not yet been widely introduced. 'Prenatal screening' is often confused with 'genetic screening'. As we show, these terms have different meanings, and we

  20. Human cancer genetics*

    OpenAIRE

    2006-01-01

    The short report will be focused on the genetic basis and possible mechanisms of tumorigenesis, common types of cancer, the importance of genetic diagnosis of cancer, and the methodology of cancer genetic diagnosis. They will also review presymptomatic testing of hereditary cancers, and the application of expression profiling to identify patients likely to benefit from particular therapeutic approaches.

  1. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  2. Prenatal screening and genetics

    NARCIS (Netherlands)

    Alderson, P.; Aro, A.R.; Dragonas, T.; Ettorre, E.; Hemminki, E.; Jalinoja, P.; Santalahti, P.; Tijmstra, T.

    2001-01-01

    Although the term 'genetic screening' has been used for decades, this paper discusses how, in its most precise meaning, genetic screening has not yet been widely introduced. 'Prenatal screening' is often confused with 'genetic screening'. As we show, these terms have different meanings, and we exami

  3. Prenatal screening and genetics

    DEFF Research Database (Denmark)

    Alderson, P; Aro, A R; Dragonas, T

    2001-01-01

    Although the term 'genetic screening' has been used for decades, this paper discusses how, in its most precise meaning, genetic screening has not yet been widely introduced. 'Prenatal screening' is often confused with 'genetic screening'. As we show, these terms have different meanings, and we ex...

  4. GENETICS AND GENOMICS OF PLANT GENETIC RESOURCES

    Directory of Open Access Journals (Sweden)

    Börner A.

    2012-08-01

    Full Text Available Plant genetic resources play a major role in global food security. The most significant and widespread means of conserving plant genetic resources is ex situ conservation. Most conserved accessions are kept in specialized facilities known as genebanks, maintained by public or private institutions; world-wide, 7.4 million accessions are stored in about 1,500 ex situ genebanks. In addition, series of genetic stocks including chromosome substitution lines, alloplasmic lines, single chromosome recombinant lines, introgression lines, etc. have been created. By analysing these genetic stocks, many qualitatively and quantitatively inherited traits have been associated with certain chromosomes, chromosome arms or introgressed segments. Today, genetic stocks are supplemented by a huge number of genotyped mapping populations. Besides progenies of bi-parental crosses (doubled haploid lines, recombinant inbred lines, etc.), panels for association mapping have been created recently. In our presentation we give examples of the successful utilisation of genebank accessions and genetic stocks for genetic and genomic studies. Using both segregation and association mapping approaches, data on the mapping of loci and marker-trait associations for a range of different traits are presented.

  5. Feline genetics: clinical applications and genetic testing.

    Science.gov (United States)

    Lyons, Leslie A

    2010-11-01

    DNA testing for domestic cat diseases and appearance traits is a rapidly growing asset for veterinary medicine. Approximately 33 genes contain 50 mutations that cause feline health problems or alterations in the cat's appearance. A variety of commercial laboratories can now perform cat genetic diagnostics, allowing both the veterinary clinician and the private owner to obtain DNA test results. DNA is easily obtained from a cat via a buccal swab with a standard cotton bud or cytological brush, allowing DNA samples to be easily sent to any laboratory in the world. The DNA test results identify carriers of the traits, predict the incidence of traits from breeding programs, and influence medical prognoses and treatments. An overall goal of identifying these genetic mutations is the correction of the defect via gene therapies and designer drug therapies. Thus, genetic testing is an effective preventative medicine and a potential ultimate cure. However, genetic diagnostic tests may still be novel for many veterinary practitioners and their application in the clinical setting needs to have the same scrutiny as any other diagnostic procedure. This article will review the genetic tests for the domestic cat, potential sources of error for genetic testing, and the pros and cons of DNA results in veterinary medicine. Highlighted are genetic tests specific to the individual cat, which are a part of the cat's internal genome.

  6. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  7. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  8. Massively Parallel Genetics.

    Science.gov (United States)

    Shendure, Jay; Fields, Stanley

    2016-06-01

    Human genetics has historically depended on the identification of individuals whose natural genetic variation underlies an observable trait or disease risk. Here we argue that new technologies now augment this historical approach by allowing the use of massively parallel assays in model systems to measure the functional effects of genetic variation in many human genes. These studies will help establish the disease risk of both observed and potential genetic variants and to overcome the problem of "variants of uncertain significance." Copyright © 2016 by the Genetics Society of America.

  9. Primer on genetic counseling.

    Science.gov (United States)

    Hahn, Susan Estabrooks

    2011-04-01

    Once limited to rare mendelian disorders, genetic counseling is playing an ever-increasing role in the multidisciplinary approach to predicting, diagnosing, and managing neurologic disease. However, genetic counseling services may not be optimized because of lack of availability and lack of knowledge regarding when it is appropriate to refer, what occurs in genetic counseling, and how genetic counseling can affect care. These issues are addressed in this article, along with corresponding clinical scenarios. Websites to find genetic counseling services and resources are also provided.

  10. The genetic and economic effect of preliminary culling in the seedling orchard

    Science.gov (United States)

    Don E. Riemenschneider

    1977-01-01

    The genetic and economic effects of two stages of truncation selection in a white spruce seedling orchard were investigated by computer simulation. Genetic effects were computed by assuming a bivariate distribution of juvenile and mature traits and volume was used as the selection criterion. Seed production was assumed to rise in a linear fashion to maturity and then...
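
    A rough sketch of the simulation idea: juvenile and mature trait values are drawn from a bivariate normal distribution, the juvenile trait is truncated at a chosen proportion, and the correlated response in the mature trait estimates the genetic gain. The correlation and selection proportion below are arbitrary illustrative values, not those of the study.

```python
# Truncation selection on a juvenile trait, gain measured on the mature trait.
import numpy as np

rng = np.random.default_rng(1)
n, r_jm, keep = 100_000, 0.6, 0.2        # population, correlation, proportion kept

cov = np.array([[1.0, r_jm], [r_jm, 1.0]])
juvenile, mature = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

threshold = np.quantile(juvenile, 1.0 - keep)     # truncation point
selected = juvenile >= threshold
gain_mature = mature[selected].mean() - mature.mean()
print(f"expected mature-trait gain from juvenile culling: {gain_mature:.3f} SD")
```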

  11. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A

    2014-01-01

    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.

  12. How Is Genetic Testing Done?

    Science.gov (United States)

    How is genetic testing done? Once a person decides to proceed with ... is called informed consent. For more information about genetic testing procedures: The National Society of Genetic Counselors offers ...

  13. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  14. Study Development of the Cardiac Computer Simulations

    Institute of Scientific and Technical Information of China (English)

    VOLKER Hellemanns; ZHANG Hong; SEKOU Singare; ZHANG Zhen-xi; KONG Xiang-yun

    2004-01-01

    The technique of computer simulation is a very efficient method for investigating the mechanisms of many diseases. This paper reviews how simulations of the human heart started as simple mathematical models in the past and have developed to the point where genetic information is needed for work such as finding new medicaments against heart diseases. The influence of future developments in computer performance, as well as of data presentation, is also described.

  15. BPA genetic monitoring - BPA Genetic Monitoring Project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Initiated in 1989, this study monitors genetic changes associated with hatchery propagation in multiple Snake River sub-basins for Chinook salmon and steelhead. We...

  16. Molecular genetics made simple

    Directory of Open Access Journals (Sweden)

    Heba Sh. Kassem

    2012-07-01

    Full Text Available Genetics have undoubtedly become an integral part of biomedical science and clinical practice, with important implications in deciphering disease pathogenesis and progression, identifying diagnostic and prognostic markers, as well as designing better targeted treatments. The exponential growth of our understanding of different genetic concepts is paralleled by a growing list of genetic terminology that can easily intimidate the unfamiliar reader. Rendering genetics incomprehensible to the clinician however, defeats the very essence of genetic research: its utilization for combating disease and improving quality of life. Herein we attempt to correct this notion by presenting the basic genetic concepts along with their usefulness in the cardiology clinic. Bringing genetics closer to the clinician will enable its harmonious incorporation into clinical care, thus not only restoring our perception of its simple and elegant nature, but importantly ensuring the maximal benefit for our patients.

  17. Genetic interest assessment

    Science.gov (United States)

    Doughney, Erin

    Genetics is becoming increasingly integrated into people's lives, and different measures have been taken to try to improve genetics education. This thesis examined the interest in genetic concepts of undergraduate students at the University of North Texas who were not majoring in the life sciences, by means of a Likert-style survey. ANOVA analysis showed that interest levels varied across different genetic concepts. In addition, age and lecture were analyzed as possible contributing factors to students' interest; neither showed statistical significance. The Genetic Interest Assessment (GIA) serves to help bridge the gap between the genetics curriculum and students' interests.

  18. Rewriting the Genetic Code.

    Science.gov (United States)

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  19. Optimal Trend Tests for Genetic Association Studies of Heterogeneous Diseases.

    Science.gov (United States)

    Lee, Wen-Chung

    2016-06-09

    The Cochran-Armitage trend test is a standard procedure in genetic association studies. It is a directed test with high power to detect genetic effects that follow the gene-dosage model. In this paper, the author proposes optimal trend tests for genetic association studies of heterogeneous diseases. Monte-Carlo simulations show that the power gain of the optimal trend tests over the conventional Cochran-Armitage trend test is striking when the genetic effects are heterogeneous. The easy-to-use R 3.1.2 software (R Foundation for Statistical Computing, Vienna, Austria) code is provided. The optimal trend tests are recommended for routine use.
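
    The paper supplies R code; as an independent illustration, here is a small Python/numpy sketch of the standard Cochran-Armitage trend test with gene-dosage weights 0, 1, 2 (the genotype counts are invented):

```python
# Cochran-Armitage trend test for a 2 x k case/control table.
import numpy as np
from scipy.stats import chi2

def cochran_armitage(cases, controls, weights=(0, 1, 2)):
    cases, controls, w = map(np.asarray, (cases, controls, weights))
    c = cases + controls                     # per-genotype column totals
    r1, r2 = cases.sum(), controls.sum()
    n = r1 + r2
    t = np.sum(w * (cases * r2 - controls * r1))
    var = (r1 * r2 / n) * (np.sum(w**2 * c * (n - c))
                           - 2 * np.sum(np.triu(np.outer(w, w), 1)
                                        * np.triu(np.outer(c, c), 1)))
    z2 = t**2 / var
    return z2, chi2.sf(z2, df=1)

# hypothetical genotype counts (AA, Aa, aa) for cases and controls
stat, p = cochran_armitage([50, 90, 60], [80, 85, 35])
print(f"chi-square = {stat:.2f}, p = {p:.4g}")
```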

  20. A Parallel Genetic Simulated Annealing Hybrid Algorithm for Task Scheduling

    Institute of Scientific and Technical Information of China (English)

    SHU Wanneng; ZHENG Shijue

    2006-01-01

    This paper combines the advantages of genetic algorithms and simulated annealing to put forward a parallel genetic simulated annealing hybrid algorithm (PGSAHA), applied to solve the task scheduling problem in grid computing. It first generates a new group of individuals through genetic operations such as reproduction, crossover and mutation, and then simulated-anneals all the generated individuals independently. When the temperature in the cooling process no longer falls, the result is the global optimal solution. From the analysis and experimental results, it is concluded that this algorithm is superior to both the genetic algorithm and simulated annealing.
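
    A schematic of the hybrid described above: each generation, offspring are produced by genetic operators and every offspring is then refined by an independent simulated-annealing pass before survivor selection. Problem-specific pieces (energy, neighbour move, crossover, mutation) are passed in as functions; the demo at the bottom minimizes a simple sum of squares and is purely illustrative.

```python
# GA + SA hybrid skeleton: anneal every offspring before it joins the population.
import math
import random

def anneal(x, energy, neighbour, t0=1.0, cooling=0.9, steps=50):
    t = t0
    for _ in range(steps):
        y = neighbour(x)
        d = energy(y) - energy(x)
        if d < 0 or random.random() < math.exp(-d / t):   # Metropolis acceptance
            x = y
        t *= cooling
    return x

def hybrid_step(population, crossover, mutate, energy, neighbour):
    population = sorted(population, key=energy)           # minimisation
    parents = population[: len(population) // 2]
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(len(population) - len(parents))]
    offspring = [anneal(c, energy, neighbour) for c in offspring]   # SA pass
    return parents + offspring

if __name__ == "__main__":
    energy = lambda v: sum(x * x for x in v)
    neighbour = lambda v: [x + random.gauss(0, 0.1) for x in v]
    crossover = lambda a, b: [(x + y) / 2 for x, y in zip(a, b)]
    mutate = lambda v: [x + random.gauss(0, 0.05) for x in v]
    pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
    for _ in range(20):
        pop = hybrid_step(pop, crossover, mutate, energy, neighbour)
    print("best energy:", min(map(energy, pop)))
```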

  1. Biotechnology Computing: Information Science for the Era of Molecular Medicine.

    Science.gov (United States)

    Masys, Daniel R.

    1989-01-01

    The evolution from classical genetics to biotechnology, an area of research involving key macromolecules in living cells, is chronicled and the current state of biotechnology is described, noting related advances in computing and clinical medicine. (MSE)

  2. ROBUST-HYBRID GENETIC ALGORITHM FOR A FLOW-SHOP SCHEDULING PROBLEM (A Case Study at PT FSCM Manufacturing Indonesia

    Directory of Open Access Journals (Sweden)

    Johan Soewanda

    2007-01-01

    Full Text Available This paper discusses the application of a Robust Hybrid Genetic Algorithm to solve a flow-shop scheduling problem. The proposed algorithm attempts to reach the minimum makespan. The case of PT. FSCM Manufacturing Indonesia Plant 4 was used as a test case to evaluate the performance of the proposed algorithm, which was compared to Ant Colony, Genetic-Tabu, Hybrid Genetic Algorithm, and the company's own algorithm. We found that the Robust Hybrid Genetic Algorithm produces statistically better results than the company's algorithm, and results equivalent to Ant Colony, Genetic-Tabu, and Hybrid Genetic. In addition, the Robust Hybrid Genetic Algorithm required less computational time than the Hybrid Genetic Algorithm.

  3. Naturally selecting solutions: the use of genetic algorithms in bioinformatics.

    Science.gov (United States)

    Manning, Timmy; Sleator, Roy D; Walsh, Paul

    2013-01-01

    For decades, computer scientists have looked to nature for biologically inspired solutions to computational problems, ranging from robotic control to scheduling optimization. Paradoxically, as we move deeper into the post-genomics era, the reverse is occurring, as biologists and bioinformaticians look to computational techniques to solve a variety of biological problems. One of the most common biologically inspired techniques is the genetic algorithm (GA), which takes the Darwinian concept of natural selection as the driving force behind systems for solving real world problems, including those in the bioinformatics domain. Herein, we provide an overview of genetic algorithms and survey some of the most recent applications of this approach to bioinformatics based problems.

  4. Find the weakest link. A comparison between demographic, genetic and demo-genetic metapopulation extinction times

    Directory of Open Access Journals (Sweden)

    Robert Alexandre

    2011-09-01

    Full Text Available Abstract Background: While the ultimate causes of most species extinctions are environmental, environmental constraints have various secondary consequences on evolutionary and ecological processes. The roles of demographic and genetic mechanisms and their interactions in limiting the viabilities of species or populations have stirred much debate and remain difficult to evaluate in the absence of a demography-genetics conceptual and technical framework. Here, I computed projected times to metapopulation extinction using (1) a model focusing on the effects of species properties, habitat quality, quantity and temporal variability on the time to demographic extinction; (2) a genetic model focusing on the dynamics of the drift and inbreeding loads under the same species and habitat constraints; (3) a demo-genetic model accounting for demographic-genetic processes and feedbacks. Results: The results indicate that a given population may have a high demographic but low genetic viability or vice versa, and whether genetic or demographic aspects will be the most limiting to overall viability depends on the constraints faced by the species (e.g., reduction of habitat quantity or quality). As a consequence, depending on metapopulation or species characteristics, incorporating genetic considerations into demographically-based viability assessments may either moderately or severely reduce the persistence time. On the other hand, purely genetically-based estimates of species viability may either underestimate (by neglecting demo-genetic interactions) or overestimate (by neglecting the demographic resilience) the true viability. Conclusion: Unbiased assessments of the viabilities of species may only be obtained by identifying and considering the most limiting processes (i.e., demography or genetics), or, preferentially, by integrating them.

  5. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    The Soft Computing techniques, which are based on the information processing of biological systems are now massively used in the area of pattern recognition, making prediction & planning, as well as acting on the environment. Ideally speaking, soft computing is not a subject of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that confirms to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solutions cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors to demonstrate the use of soft computing techniques in various applications of engineering.  

  6. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  7. Solving Maximal Clique Problem through Genetic Algorithm

    Science.gov (United States)

    Rajawat, Shalini; Hemrajani, Naveen; Menghani, Ekta

    2010-11-01

    Genetic algorithms are among the most interesting heuristic search techniques. They depend basically on three operations: selection, crossover and mutation. The outcome of the three operations is a new population for the next generation, and these operations are repeated until the termination condition is reached. All the operations in the algorithm are accessible with today's molecular biotechnology. The simulations show that with this new computing algorithm, it is possible to get a solution from a very small initial data pool, avoiding enumerating all candidate solutions. For randomly generated problems, the genetic algorithm can give a correct solution within a few cycles with high probability.
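
    One common way to cast maximal clique as a GA, sketched below: chromosomes are vertex subsets, infeasible subsets are repaired into cliques by dropping vertices, and fitness is the clique size. The small graph and GA parameters are illustrative; the record's molecular-biotechnology angle is not modelled.

```python
# GA for maximal clique with a repair operator.
import random

EDGES = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)}
N_VERTICES = 5
adj = {v: {u for (a, b) in EDGES for u in (a, b)
           if v in (a, b) and u != v} for v in range(N_VERTICES)}

def repair(bits):
    """Drop vertices until the selected set is a clique."""
    chosen = {v for v in range(N_VERTICES) if bits[v]}
    while True:
        bad = [v for v in chosen if not (chosen - {v}) <= adj[v]]
        if not bad:
            return chosen
        chosen.remove(random.choice(bad))

def fitness(bits):
    return len(repair(bits))

pop = [[random.randint(0, 1) for _ in range(N_VERTICES)] for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_VERTICES)
        child = a[:cut] + b[cut:]                 # one-point crossover
        if random.random() < 0.2:                 # bit-flip mutation
            i = random.randrange(N_VERTICES)
            child[i] ^= 1
        children.append(child)
    pop = parents + children
print("best clique found:", repair(max(pop, key=fitness)))
```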

  8. Adaptive Genetic Algorithm Model for Intrusion Detection

    Directory of Open Access Journals (Sweden)

    K. S. Anil Kumar

    2012-09-01

    Full Text Available Intrusion detection systems are intelligent systems designed to identify and prevent the misuse of computer networks and systems. Various approaches to intrusion detection are currently being used, but they are relatively ineffective; thus, emerging network security systems need to be part of the life system, and this is possible only by embedding knowledge into the network. The Adaptive Genetic Algorithm Model - IDS comprises K-Means clustering, Genetic Algorithm and Neural Network techniques. The technique is tested using a multitude of background knowledge sets in the DARPA network traffic datasets.

  9. Collective evolution and the genetic code.

    Science.gov (United States)

    Vetsigian, Kalin; Woese, Carl; Goldenfeld, Nigel

    2006-07-11

    A dynamical theory for the evolution of the genetic code is presented, which accounts for its universality and optimality. The central concept is that a variety of collective, but non-Darwinian, mechanisms likely to be present in early communal life generically lead to refinement and selection of innovation-sharing protocols, such as the genetic code. Our proposal is illustrated by using a simplified computer model and placed within the context of a sequence of transitions that early life may have made, before the emergence of vertical descent.

  10. Web application for genetic modification flux with database to estimate metabolic fluxes of genetic mutants.

    Science.gov (United States)

    Mohd Ali, Noorlin; Tsuboi, Ryo; Matsumoto, Yuta; Koishi, Daisuke; Inoue, Kentaro; Maeda, Kazuhiro; Kurata, Hiroyuki

    2016-07-01

    Computational analysis of metabolic fluxes is essential in understanding the structure and function of a metabolic network and in rationally designing genetically modified mutants for an engineering purpose. We had presented the genetic modification flux (GMF) that predicts the flux distribution of a broad range of genetically modified mutants. To enhance the feasibility and usability of GMF, we have developed a web application with a metabolic network database to predict a flux distribution of genetically modified mutants. One hundred and twelve data sets of Escherichia coli, Corynebacterium glutamicum, Saccharomyces cerevisiae, and Chinese hamster ovary were registered as standard models.

  11. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of the basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering basic information, skills, and interests of the target group, and it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unexpected convergence of visual and technical abilities with linguistic abilities.

  12. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
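
    A minimal token-bucket policer, to make the mechanism behind the model concrete: tokens refill at a constant rate up to the bucket depth, and a packet conforms only if enough tokens are available on arrival. Rate, depth and the packet trace are invented numbers, and the sketch ignores the multiplexor and feedback-control aspects of the paper.

```python
# Token-bucket policing of a packet trace given as (arrival_time, size) pairs.
class TokenBucket:
    def __init__(self, rate_tokens_per_s, depth_tokens):
        self.rate, self.depth = rate_tokens_per_s, depth_tokens
        self.tokens, self.last_t = depth_tokens, 0.0

    def conforms(self, t, packet_tokens):
        # refill for the elapsed time, capped at the bucket depth
        self.tokens = min(self.depth, self.tokens + (t - self.last_t) * self.rate)
        self.last_t = t
        if packet_tokens <= self.tokens:
            self.tokens -= packet_tokens
            return True
        return False          # non-conforming: drop or mark the packet

tb = TokenBucket(rate_tokens_per_s=1000, depth_tokens=1500)
trace = [(0.000, 500), (0.001, 1200), (0.004, 1200), (0.010, 400)]
print([tb.conforms(t, size) for t, size in trace])
```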

  13. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  14. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  15. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Full Text Available Brain computer interface technology represents a rapidly growing field of research with application systems. Its contributions in medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprint in numerous fields such as education, self-regulation, production, marketing and security, as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss major usability and technical challenges that face the utilization of brain signals in the various components of a BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  16. Computational micromechanics

    Science.gov (United States)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  17. Modelling and optimization of computer network traffic controllers

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2005-01-01

    operation of the controller and evaluate the benefits of using a genetic algorithm approach to speed up the optimization process. Our results show that the use of the genetic algorithm proves particularly useful in reducing the computation time required to optimize the operation of a system consisting of multiple token-bucket-regulated sources.

  18. Synthesizing genetic sequential logic circuit with clock pulse generator.

    Science.gov (United States)

    Chuang, Chia-Hua; Lin, Chun-Liang

    2014-05-28

    Rhythmic clocks occur widely in biological systems, where they control several aspects of cell physiology; different cell types are supplied with various rhythmic frequencies. How to synthesize a specific clock signal is a preliminary but necessary step toward the further development of a biological computer in the future. This paper presents a genetic sequential logic circuit with a clock pulse generator based on a synthesized genetic oscillator, which generates a consecutive clock signal whose frequency is an inverse integer multiple of that of the genetic oscillator. An analogous electronic waveform-shaping circuit is constructed by a series of genetic buffers to shape the logic high/low levels of an oscillation input in a basic sinusoidal cycle and generate a pulse-width-modulated (PWM) output with various duty cycles. By controlling the threshold level of the genetic buffer, a genetic clock pulse signal with a frequency consistent with the genetic oscillator is synthesized. A synchronous genetic counter circuit based on the topology of a digital sequential logic circuit is triggered by the clock pulse to synthesize a clock signal whose frequency is an inverse multiple of that of the genetic oscillator. The function acts like a frequency divider in electronic circuits, which plays a key role in sequential logic circuits with a specific operational frequency. A cascaded genetic logic circuit generating clock pulse signals is proposed. Based on an analogous implementation of digital sequential logic circuits, genetic sequential logic circuits can be constructed by the proposed approach to generate various clock signals from an oscillation signal.
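
    A purely numerical sketch of the signal chain: a sinusoidal "oscillator" is thresholded into a square clock (the buffer/waveform-shaping step), and cascaded toggle stages divide its frequency, mimicking the counter-based frequency divider. All quantities are abstract numbers, not molecular concentrations or the paper's actual circuits.

```python
# Threshold a sine wave into a clock, then divide its frequency by 2**n_stages.
import numpy as np

t = np.linspace(0, 10, 2001)                     # seconds, arbitrary scale
oscillator = np.sin(2 * np.pi * 1.0 * t)         # 1 Hz "genetic oscillator"

clock = (oscillator > 0.0).astype(int)           # threshold -> square clock

def divide_by_two(square):
    """Toggle an output bit on every rising edge (a 1-bit counter stage)."""
    out, state, prev = [], 0, square[0]
    for s in square:
        if prev == 0 and s == 1:                 # rising edge
            state ^= 1
        out.append(state)
        prev = s
    return np.array(out)

divided = clock
for _ in range(2):                               # two stages: frequency / 4
    divided = divide_by_two(divided)

print("clock edges:", int(np.sum(np.diff(clock) == 1)),
      "divided edges:", int(np.sum(np.diff(divided) == 1)))
```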

  19. Molecular Population Genetics

    Science.gov (United States)

    Casillas, Sònia; Barbadilla, Antonio

    2017-01-01

    Molecular population genetics aims to explain genetic variation and molecular evolution from population genetics principles. The field was born 50 years ago with the first measures of genetic variation in allozyme loci, continued with the nucleotide sequencing era, and is currently in the era of population genomics. During this period, molecular population genetics has been revolutionized by progress in data acquisition and theoretical developments. The conceptual elegance of the neutral theory of molecular evolution or the footprint carved by natural selection on the patterns of genetic variation are two examples of the vast number of inspiring findings of population genetics research. Since the inception of the field, Drosophila has been the prominent model species: molecular variation in populations was first described in Drosophila and most of the population genetics hypotheses were tested in Drosophila species. In this review, we describe the main concepts, methods, and landmarks of molecular population genetics, using the Drosophila model as a reference. We describe the different genetic data sets made available by advances in molecular technologies, and the theoretical developments fostered by these data. Finally, we review the results and new insights provided by the population genomics approach, and conclude by enumerating challenges and new lines of inquiry posed by increasingly large population scale sequence data. PMID:28270526

  20. Molecular Population Genetics.

    Science.gov (United States)

    Casillas, Sònia; Barbadilla, Antonio

    2017-03-01

    Molecular population genetics aims to explain genetic variation and molecular evolution from population genetics principles. The field was born 50 years ago with the first measures of genetic variation in allozyme loci, continued with the nucleotide sequencing era, and is currently in the era of population genomics. During this period, molecular population genetics has been revolutionized by progress in data acquisition and theoretical developments. The conceptual elegance of the neutral theory of molecular evolution or the footprint carved by natural selection on the patterns of genetic variation are two examples of the vast number of inspiring findings of population genetics research. Since the inception of the field, Drosophila has been the prominent model species: molecular variation in populations was first described in Drosophila and most of the population genetics hypotheses were tested in Drosophila species. In this review, we describe the main concepts, methods, and landmarks of molecular population genetics, using the Drosophila model as a reference. We describe the different genetic data sets made available by advances in molecular technologies, and the theoretical developments fostered by these data. Finally, we review the results and new insights provided by the population genomics approach, and conclude by enumerating challenges and new lines of inquiry posed by increasingly large population scale sequence data. Copyright © 2017 Casillas and Barbadilla.

  1. Function Optimization Based on Quantum Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Ying Sun

    2014-01-01

    Full Text Available Optimization methods are important in engineering design and application. The quantum genetic algorithm has good population diversity, rapid convergence and good global search capability; it combines quantum computation with the genetic algorithm. A novel quantum genetic algorithm is proposed, called the Variable-boundary-coded Quantum Genetic Algorithm (vbQGA), in which qubit chromosomes are collapsed into variable-boundary-coded chromosomes instead of binary-coded chromosomes, so that much shorter chromosome strings can be obtained. The method of encoding and decoding chromosomes is first described, and a new adaptive selection scheme for the angle parameters of the rotation gate is then put forward based on the core ideas and principles of quantum computation. Eight typical functions are selected as optimization benchmarks to evaluate the effectiveness and performance of vbQGA against the standard Genetic Algorithm (sGA) and the Genetic Quantum Algorithm (GQA). The simulation results show that vbQGA is significantly superior to sGA in all aspects and outperforms GQA in robustness and solution speed, especially for multidimensional and complicated functions.
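
    The record does not spell out the vbQGA update rule, so the following is only a generic sketch of the qubit rotation-gate mechanism that quantum-inspired genetic algorithms such as GQA use: each gene is stored as an amplitude pair, collapsed ("observed") into a classical bit, and rotated toward the corresponding bit of the best solution found so far. The toy OneMax-style target and the fixed angle step are assumptions, not values from the paper.

```python
import math, random

def rotate(alpha, beta, delta_theta):
    """Apply a quantum rotation gate to one qubit's amplitudes (alpha, beta)."""
    a = alpha * math.cos(delta_theta) - beta * math.sin(delta_theta)
    b = alpha * math.sin(delta_theta) + beta * math.cos(delta_theta)
    return a, b

def observe(chromosome):
    """Collapse a qubit chromosome into a classical bit string (P(bit=1) = beta^2)."""
    return [1 if random.random() < beta ** 2 else 0 for _, beta in chromosome]

def update(chromosome, bits, best_bits, theta=0.05 * math.pi):
    """Rotate each qubit toward the corresponding bit of the best solution so far."""
    new = []
    for (a, b), x, bstar in zip(chromosome, bits, best_bits):
        direction = 0 if x == bstar else (1 if bstar == 1 else -1)
        new.append(rotate(a, b, direction * theta))
    return new

random.seed(0)
n = 8
chrom = [(1 / math.sqrt(2), 1 / math.sqrt(2)) for _ in range(n)]  # uniform superposition
best = [1] * n                       # toy target bit pattern (hypothetical best solution)
for _ in range(50):
    bits = observe(chrom)
    chrom = update(chrom, bits, best)
print(observe(chrom))                # amplitudes now strongly favour the target bits
```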

  2. Genetic network models: a comparative study

    Science.gov (United States)

    van Someren, Eugene P.; Wessels, Lodewyk F. A.; Reinders, Marcel J. T.

    2001-06-01

    Currently, the need arises for tools capable of unraveling the functionality of genes based on the analysis of microarray measurements. Modeling genetic interactions by means of genetic network models provides a methodology to infer functional relationships between genes. Although a wide variety of different models have been introduced so far, it remains, in general, unclear what the strengths and weaknesses of each of these approaches are and where these models overlap and differ. This paper compares different genetic modeling approaches that attempt to extract the gene regulation matrix from expression data. A taxonomy of continuous genetic network models is proposed and the following important characteristics are suggested and employed to compare the models: inferential power; predictive power; robustness; consistency; stability and computational cost. Where possible, synthetic time series data are employed to investigate some of these properties. The comparison shows that although genetic network modeling might provide valuable information regarding genetic interactions, current models show disappointing results on simple artificial problems. For now, the simplest models are favored because they generalize better, but more complex models will probably prevail once their bias is more thoroughly understood and their variance is better controlled.
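
    Purely as an illustration of the simplest class of continuous genetic network model the comparison covers (and not one of the surveyed models' exact formulations), the sketch below generates synthetic time-series expression data from a sparse linear gene regulation matrix and then recovers that matrix by least squares. numpy and all sizes, noise levels and sparsity values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_steps = 5, 40

# Ground-truth sparse regulation matrix W: x(t+1) = x(t) + dt * x(t) @ W.T + noise
W_true = rng.normal(0, 0.3, (n_genes, n_genes)) * (rng.random((n_genes, n_genes)) < 0.3)

dt = 0.1
X = np.zeros((n_steps, n_genes))
X[0] = rng.random(n_genes)
for t in range(n_steps - 1):
    X[t + 1] = X[t] + dt * X[t] @ W_true.T + rng.normal(0, 0.01, n_genes)

# Infer W by least squares: (X[t+1] - X[t]) / dt ~ X[t] @ W.T
dX = (X[1:] - X[:-1]) / dt
W_est = np.linalg.lstsq(X[:-1], dX, rcond=None)[0].T

print("max absolute error in recovered weights:", float(np.abs(W_est - W_true).max()))
```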

  3. Genetics Reasoning with Multiple External Representations

    Science.gov (United States)

    Tsui, Chi-Yan; Treagust, David F.

    2003-02-01

    This paper explores a case study of a class of Year 10 students (n=24) whose learning of genetics involved activities of BioLogica, a computer program that features multiple external representations (MERs). MERs can be verbal/textual, visual-graphical, or in other formats. Researchers claim that the functions of MERs in supporting student learning are to complement information or processes, to constrain the interpretation of abstract concepts, and to construct new viable conceptions. Over decades, research has shown that genetics remains linguistically and conceptually difficult for high school students. This case study using data from multiple sources enabled students' development of genetics reasoning to be interpreted from an epistemological perspective. Pretest-posttest comparison after six weeks showed that most of the students (n=20) had improved their genetics reasoning but only for easier reasoning types. Findings indicated that the MERs in BioLogica contributed to students' development of genetics reasoning by engendering their motivation and interest but only when students were mindful in their learning. Based on triangulation of data from multiple sources, MERs in BioLogica appeared to support learning largely by constraining students' interpretation of phenomena of genetics.

  4. Classical and quantum computing with C++ and Java simulations

    CERN Document Server

    Hardy, Y

    2001-01-01

    Classical and Quantum computing provides a self-contained, systematic and comprehensive introduction to all the subjects and techniques important in scientific computing. The style and presentation are readily accessible to undergraduates and graduates. A large number of examples, accompanied by complete C++ and Java code wherever possible, cover every topic. Features and benefits: - Comprehensive coverage of the theory with many examples - Topics in classical computing include boolean algebra, gates, circuits, latches, error detection and correction, neural networks, Turing machines, cryptography, genetic algorithms - For the first time, genetic expression programming is presented in a textbook - Topics in quantum computing include mathematical foundations, quantum algorithms, quantum information theory, hardware used in quantum computing This book serves as a textbook for courses in scientific computing and is also very suitable for self-study. Students, professionals and practitioners in computer...

  5. Evolutionary Computational Methods for Identifying Emergent Behavior in Autonomous Systems

    Science.gov (United States)

    Terrile, Richard J.; Guillaume, Alexandre

    2011-01-01

    A technique based on Evolutionary Computational Methods (ECMs) was developed that allows for the automated optimization of complex computationally modeled systems, such as autonomous systems. The primary technology, which enables the ECM to find optimal solutions in complex search spaces, derives from evolutionary algorithms such as the genetic algorithm and differential evolution. These methods are based on biological processes, particularly genetics, and define an iterative process that evolves parameter sets into an optimum. Evolutionary computation is a method that operates on a population of existing computational-based engineering models (or simulators) and competes them using biologically inspired genetic operators on large parallel cluster computers. The result is the ability to automatically find design optimizations and trades, and thereby greatly amplify the role of the system engineer.
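
    As an illustration of the evolutionary algorithms the record names, here is a hedged sketch of differential evolution minimizing a toy objective that stands in for a computationally modeled system. The population size, F, CR and the quadratic objective are illustrative assumptions, not values from the record.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.7, CR=0.9, generations=100):
    """Minimal differential evolution: mutate with scaled difference vectors, crossover, select."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            trial = [pop[i][d] if random.random() > CR
                     else pop[a][d] + F * (pop[b][d] - pop[c][d])
                     for d in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            s = objective(trial)
            if s < scores[i]:                  # greedy selection keeps the better vector
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy "simulator": squared error of model parameters against a known optimum.
target = [1.0, -2.0, 0.5]
best, score = differential_evolution(lambda p: sum((x - t) ** 2 for x, t in zip(p, target)),
                                     bounds=[(-5, 5)] * 3)
print(best, score)
```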

  6. Computational reuse optimisation for stadium design

    NARCIS (Netherlands)

    Van der Steen, J.; Coenders, J.L.; Pasterkamp, S.; Rolvink, A.; Van Steekelenburg, J.

    2015-01-01

    This paper presents a proof of concept study into a computational strategy for reusing structural stadium elements. The strategy goal is overcoming the reuse design strain through implementation of a genetic algorithm. This algorithm is calibrated to search for a structural frame configuration, whil

  7. Genetic Susceptibility to Atherosclerosis

    Directory of Open Access Journals (Sweden)

    Sanja Kovacic

    2012-01-01

    Full Text Available Atherosclerosis is a complex multifocal arterial disease involving interactions of multiple genetic and environmental factors. Advances in molecular genetic techniques have revealed that genetic background significantly influences susceptibility to atherosclerotic vascular diseases. Besides further investigations of monogenic diseases, candidate genes, genetic polymorphisms, and susceptibility loci associated with atherosclerotic diseases have been identified in recent years, and their number is rapidly increasing. This paper discusses the main fields of genetic investigation associated with human atherosclerotic vascular diseases. It concludes with a discussion of the directions and implications of future genetic research in atherosclerosis, with an emphasis on the prospective prediction, from an early age, of individuals predisposed to develop premature atherosclerosis, and on facilitating the discovery of novel drug targets.

  8. Genetic Pathways to Insomnia

    Directory of Open Access Journals (Sweden)

    Mackenzie J. Lind

    2016-12-01

    Full Text Available This review summarizes current research on the genetics of insomnia, as genetic contributions are thought to be important for insomnia etiology. We begin by providing an overview of genetic methods (both quantitative and measured gene), followed by a discussion of the insomnia genetics literature with regard to each of the following common methodologies: twin and family studies, candidate gene studies, and genome-wide association studies (GWAS). Next, we summarize the most recent gene identification efforts (primarily GWAS results) and propose several potential mechanisms through which identified genes may contribute to the disorder. Finally, we discuss new genetic approaches and how these may prove useful for insomnia, proposing an agenda for future insomnia genetics research.

  9. Dynamic Change of Genetic Diversity in Conserved Populations with Different Initial Genetic Architectures

    Institute of Scientific and Technical Information of China (English)

    LU Yun-feng; LI Hong-wei; WU Ke-liang; WU Chang-xin

    2013-01-01

    Maintenance and management of the genetic diversity of farm animal genetic resources (AnGR) is important for its biological, socioeconomic and cultural significance. The core concern of conservation for farm AnGR is the long-term retention of genetic diversity in conserved populations. However, numerous factors may affect the evolution of genetic diversity in a conserved population, and among these the genetic architecture of the conserved population is given little consideration in current conservation strategies. In this study, we investigated by computer simulation the dynamic changes of genetic diversity in conserved populations under two scenarios of initial genetic architecture. Thirty polymorphic microsatellite loci were chosen to represent the genetic architecture of the populations, with observed heterozygosity (Ho) and expected heterozygosity (He), observed and mean effective number of alleles (Ao and Ae), number of polymorphic loci (NP) and percentage of polymorphic loci (PP), number of rare alleles (RA), and number of non-rich polymorphic loci (NRP) used as estimates of genetic diversity. The two scenarios of genetic architecture were a conserved population with the same allele frequency at each locus (AS) and one with the actual allele frequencies (AA). The results showed that the magnitude of the loss of genetic diversity is associated with the genetic architecture of the initial conserved population: the decline of genetic diversity was narrower in scenario AS than in scenario AA; the declines of Ho and Ao in AA were about 4 and 2 times those in AS, respectively; and the first monomorphic locus appeared, and the measure NP began to change, 20 and 23 generations earlier in scenario AA than in scenario AS, respectively. Additionally, we found that NRP, a novel measure proposed by our research group, was a proper estimate for monitoring the evolution of genetic diversity
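
    A minimal sketch, not the authors' simulator, of the kind of computer simulation described: multi-allelic (microsatellite-like) loci drift in a small closed population, and expected heterozygosity and allele counts are compared between an equal-frequency (AS-like) and a skewed-frequency (AA-like) starting architecture. The population size, generation count and illustrative allele frequencies are assumptions.

```python
import random

def simulate(initial_freqs, n_individuals=50, generations=100, seed=1):
    """Track expected heterozygosity (He) and segregating allele counts under drift."""
    random.seed(seed)
    loci = []
    for freqs in initial_freqs:                        # one gene pool per locus
        alleles = list(range(len(freqs)))
        pool = [random.choices(alleles, freqs)[0] for _ in range(2 * n_individuals)]
        loci.append(pool)
    for _ in range(generations):                       # random mating: resample 2N gametes
        loci = [[random.choice(pool) for _ in range(2 * n_individuals)] for pool in loci]
    he, n_alleles = [], []
    for pool in loci:
        counts = {a: pool.count(a) / len(pool) for a in set(pool)}
        he.append(1 - sum(p * p for p in counts.values()))
        n_alleles.append(len(counts))
    return sum(he) / len(he), sum(n_alleles) / len(n_alleles)

equal = [[0.25, 0.25, 0.25, 0.25]] * 30                # AS-like: same frequency for each allele
uneven = [[0.70, 0.15, 0.10, 0.05]] * 30               # AA-like: skewed "actual" frequencies
print("AS-like scenario (He, alleles):", simulate(equal))
print("AA-like scenario (He, alleles):", simulate(uneven))
```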

  10. PCR in forensic genetics

    DEFF Research Database (Denmark)

    Morling, Niels

    2009-01-01

    Since the introduction in the mid-1980s of analyses of minisatellites for DNA analyses, a revolution has taken place in forensic genetics. The subsequent invention of the PCR made it possible to develop forensic genetics tools that allow both very informative routine investigations and still more...... and more advanced, special investigations in cases concerning crime, paternity, relationship, disaster victim identification etc. The present review gives an update on the use of DNA investigations in forensic genetics....

  11. Genetics of stroke

    OpenAIRE

    Guo, Jin-Min; Liu, Ai-Jun; Su, Ding-Feng

    2010-01-01

    Stroke is the second most common cause of death and the most common cause of disability in developed countries. Stroke is a multi-factorial disease caused by a combination of environmental and genetic factors. Numerous epidemiologic studies have documented a significant genetic component in the occurrence of strokes. Genes encoding products involved in lipid metabolism, thrombosis, and inflammation are believed to be potential genetic factors for stroke. Although a large group of candidate ge...

  12. Genetics of mental retardation

    OpenAIRE

    Ahuja A; Thapar Anita; Owen M

    2005-01-01

    Mental retardation can follow any of the biological, environmental and psychological events that are capable of producing deficits in cognitive functions. Recent advances in molecular genetic techniques have enabled us to understand more about the molecular basis of several genetic syndromes associated with mental retardation. In contrast, where there is no discrete cause, the interplay of genetic and environmental influences remains poorly understood. This article presents a critical review ...

  13. Genetics of complex diseases

    DEFF Research Database (Denmark)

    Mellerup, Erling; Møller, Gert Lykke; Koefoed, Pernille

    2012-01-01

    A complex disease with an inheritable component is polygenic, meaning that several different changes in DNA are the genetic basis for the disease. Such a disease may also be genetically heterogeneous, meaning that independent changes in DNA, i.e. various genotypes, can be the genetic basis...... for the disease. Each of these genotypes may be characterized by specific combinations of key genetic changes. It is suggested that even if all key changes are found in genes related to the biology of a certain disease, the number of combinations may be so large that the number of different genotypes may be close...

  14. Genetics of nonsyndromic obesity.

    Science.gov (United States)

    Lee, Yung Seng

    2013-12-01

    Common obesity is widely regarded as a complex, multifactorial trait influenced by the 'obesogenic' environment, sedentary behavior, and genetic susceptibility contributed by common and rare genetic variants. This review describes the recent advances in understanding the role of genetics in obesity. New susceptibility loci and genetic variants are being uncovered, but the collective effect is relatively small and could not explain most of the BMI heritability. Yet-to-be identified common and rare variants, epistasis, and heritable epigenetic changes may account for part of the 'missing heritability'. Evidence is emerging about the role of epigenetics in determining obesity susceptibility, mediating developmental plasticity, which confers obesity risk from early life experiences. Genetic prediction scores derived from selected genetic variants, and also differential DNA methylation levels and methylation scores, have been shown to correlate with measures of obesity and response to weight loss intervention. Genetic variants, which confer susceptibility to obesity-related morbidities like nonalcoholic fatty liver disease, were also discovered recently. We can expect discovery of more rare genetic variants with the advent of whole exome and genome sequencing, and also greater understanding of epigenetic mechanisms by which environment influences genetic expression and which mediate the gene-environment interaction.

  15. Genetics Home Reference: abetalipoproteinemia

    Science.gov (United States)

    ... Betalipoprotein Deficiency Disease Congenital betalipoprotein deficiency syndrome Microsomal Triglyceride Transfer Protein Deficiency Disease Related Information How are genetic conditions and genes ...

  16. Experimental DNA computing

    NARCIS (Netherlands)

    Henkel, Christiaan

    2005-01-01

    Because of their information storing and processing capabilities, nucleic acids are interesting building blocks for molecular scale computers. Potential applications of such DNA computers range from massively parallel computation to computational gene therapy. In this thesis, several implementations

  17. Genetic Programming and Genetic Algorithms for Propositions

    Directory of Open Access Journals (Sweden)

    Nabil M. HEWAHI

    2012-01-01

    Full Text Available In this paper we propose a mechanism to discover the compound proposition solutions for a given truth table without knowing the compound propositions that lead to the truth table results. The approach is based on two proposed algorithms. The first is called the Producing Formula (PF) algorithm, which is based on the genetic programming idea and finds compound proposition solutions for the given truth table. The second is called the Solutions Optimization (SO) algorithm, which is based on the genetic algorithm idea and finds a list of the optimum compound propositions that can solve the truth table; the obtained list depends on the solutions obtained from the PF algorithm. Various types of genetic operators have been introduced to obtain the solutions, either within the PF algorithm or within the SO algorithm.
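
    A hedged sketch in the spirit of the PF algorithm (a genetic-programming-style search for a compound proposition that reproduces a given truth table). It uses only truncation selection and subtree mutation, whereas the paper introduces several further genetic operators, and the target truth table here is an illustrative assumption.

```python
import itertools, random

VARS = ["p", "q", "r"]
OPS = {"and": lambda a, b: a and b, "or": lambda a, b: a or b}

def random_expr(depth=2):
    """Grow a random proposition over VARS using and/or/not (GP-style tree growth)."""
    if depth == 0 or random.random() < 0.3:
        v = random.choice(VARS)
        return v if random.random() < 0.7 else ("not", v)
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, env):
    if isinstance(expr, str):
        return env[expr]
    if expr[0] == "not":
        return not evaluate(expr[1], env)
    return OPS[expr[0]](evaluate(expr[1], env), evaluate(expr[2], env))

def mutate(expr, rate=0.2):
    """Subtree mutation: occasionally replace a node with a freshly grown subexpression."""
    if random.random() < rate:
        return random_expr()
    if isinstance(expr, str) or expr[0] == "not":
        return expr
    return (expr[0], mutate(expr[1], rate), mutate(expr[2], rate))

def fitness(expr, table):
    """Number of truth-table rows the candidate proposition reproduces."""
    return sum(evaluate(expr, dict(zip(VARS, row))) == out for row, out in table)

# Target truth table (the generating proposition is treated as unknown by the search).
table = [(row, row[0] and (row[1] or row[2]))
         for row in itertools.product([False, True], repeat=3)]

random.seed(4)
pop = [random_expr() for _ in range(200)]
for _ in range(40):
    pop.sort(key=lambda e: fitness(e, table), reverse=True)
    parents = pop[:50]                                   # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(150)]
best = max(pop, key=lambda e: fitness(e, table))
print(best, "matches", fitness(best, table), "of", len(table), "rows")
```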

  18. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  19. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    Full Text Available This paper gives detailed information about quantum computers, the differences between quantum computers and traditional computers, and the basis of quantum computers, which are slightly similar to but still different from traditional computers. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computers are very useful for computation in science and research: large amounts of data and information can be computed, processed, stored, retrieved, transmitted and displayed in less time and with an accuracy that traditional computers do not provide.

  20. Milestones in beef cattle genetic evaluation.

    Science.gov (United States)

    Golden, B L; Garrick, D J; Benyshek, L L

    2009-04-01

    National beef cattle genetic evaluation programs have evolved in the United States over the last 35 yr to create important tools that are part of sustainable breeding programs. The history of national beef cattle genetic evaluation programs has lessons to offer the next generation of researchers as new approaches in molecular genetics and decision support are developed. Through a series of complex and intricate pressures from technology and organizational challenges, national cattle evaluation programs continue to grow in importance and impact. Development of enabling technologies and the interface of the disciplines of computer science, numerical methods, statistics, and quantitative genetics have created an example of how academics, government, and industry can work together to create more effective solutions to technical problems. The advent of mixed model procedures was complemented by a series of breakthrough discoveries that made what was previously considered intractable a reality. The creation of modern genetic evaluation procedures has followed a path characterized by a steady and constant approach to identification and solution for each technical problem encountered. At its core, the driving force for the evolution has been the need to constantly improve the accuracy of the predictions of genetic merit for breeding stock, especially young animals. Sensible approaches, such as the principle of economically relevant traits, were developed that created the rules to be followed as the programs grew. However, the current systems are far from complete or perfect. Modern genetic evaluation programs have a long way to go, and a great deal of improvement in the accuracy of prediction is still possible. But the greatest challenge remains: the need to understand that genetic predictions are only parameters for decision support procedures and not an end in themselves.

  1. Judaism, genetic screening and genetic therapy.

    Science.gov (United States)

    Rosner, F

    1998-01-01

    Genetic screening, gene therapy and other applications of genetic engineering are permissible in Judaism when used for the treatment, cure, or prevention of disease. Such genetic manipulation is not considered to be a violation of God's natural law, but a legitimate implementation of the biblical mandate to heal. If Tay-Sachs disease, diabetes, hemophilia, cystic fibrosis, Huntington's disease or other genetic diseases can be cured or prevented by "gene surgery," then it is certainly permitted in Jewish law. Genetic premarital screening is encouraged in Judaism for the purpose of discouraging at-risk marriages for a fatal illness such as Tay-Sachs disease. Neonatal screening for treatable conditions such as phenylketonuria is certainly desirable and perhaps required in Jewish law. Preimplantation screening and the implantation of only "healthy" zygotes into the mother's womb to prevent the birth of an affected child are probably sanctioned in Jewish law. Whether or not these assisted reproduction techniques may be used to choose the sex of one's offspring, to prevent the birth of a child with a sex-linked disease such as hemophilia, has not yet been ruled on by modern rabbinic decisions. Prenatal screening with the specific intent of aborting an affected fetus is not allowed according to most rabbinic authorities, although a minority view permits it "for great need." Not to have children if both parents are carriers of genetic diseases such as Tay-Sachs is not a Jewish option. Preimplantation screening is preferable. All screening test results must remain confidential. Judaism does not permit the alteration or manipulation of physical traits and characteristics such as height, eye and hair color, facial features and the like, when such change provides no useful benefit to mankind. On the other hand, it is permissible to clone organisms and microorganisms to facilitate the production of insulin, growth hormone, and other agents intended to benefit mankind and to

  2. A fielded wiki for personality genetics

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup

    2010-01-01

    I describe a fielded wiki, where a Web form interface allows the entry, analysis and visualization of results from scientific papers in the personality genetics domain. Papers in this domain typically report the mean and standard deviation of multiple personality trait scores from statistics on h......-analysis can compute individual and combined effect sizes. The meta-analytic results are displayed in on-the-fly computed hyperlinked graphs and tables. Revision control on the basic data tracks changes and data may be exported to comma-separated files or in a MediaWiki template format....

  3. Next generation quantitative genetics in plants.

    Science.gov (United States)

    Jiménez-Gómez, José M

    2011-01-01

    Most characteristics in living organisms show continuous variation, which suggests that they are controlled by multiple genes. Quantitative trait loci (QTL) analysis can identify the genes underlying continuous traits by establishing associations between genetic markers and observed phenotypic variation in a segregating population. The new high-throughput sequencing (HTS) technologies greatly facilitate QTL analysis by providing genetic markers at genome-wide resolution in any species without previous knowledge of its genome. In addition HTS serves to quantify molecular phenotypes, which aids to identify the loci responsible for QTLs and to understand the mechanisms underlying diversity. The constant improvements in price, experimental protocols, computational pipelines, and statistical frameworks are making feasible the use of HTS for any research group interested in quantitative genetics. In this review I discuss the application of HTS for molecular marker discovery, population genotyping, and expression profiling in QTL analysis.
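
    As a minimal illustration of the association step in QTL analysis (not a full pipeline with population-structure correction or mixed models), the sketch below scores each HTS-derived marker by its squared correlation with a simulated continuous trait. The data, effect size and causal marker index are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_plants, n_markers = 200, 500

genotypes = rng.integers(0, 3, size=(n_plants, n_markers))    # 0/1/2 allele dosages
causal = 42                                                    # hypothetical causal marker index
phenotype = 2.0 * genotypes[:, causal] + rng.normal(0, 1.0, n_plants)

def marker_scores(G, y):
    """Squared correlation (r^2) between each marker and the trait, a simple QTL scan statistic."""
    Gc = G - G.mean(axis=0)
    yc = y - y.mean()
    r = (Gc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Gc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()) + 1e-12)
    return r ** 2

scores = marker_scores(genotypes, phenotype)
print("top marker:", int(np.argmax(scores)), "r^2 =", round(float(scores.max()), 3))
```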

  4. Genetic Algorithms for multiple objective vehicle routing

    CERN Document Server

    Geiger, Martin Josef

    2008-01-01

    The talk describes a general approach of a genetic algorithm for multiple objective optimization problems. A particular dominance relation between the individuals of the population is used to define a fitness operator, enabling the genetic algorithm to address even problems with efficient, but convex-dominated alternatives. The algorithm is implemented in a multilingual computer program, solving vehicle routing problems with time windows under multiple objectives. The graphical user interface of the program shows the progress of the genetic algorithm and the main parameters of the approach can be easily modified. In addition to that, the program provides powerful decision support to the decision maker. The software has proved its excellence at the finals of the European Academic Software Award (EASA), held at Keble College, University of Oxford, Great Britain.
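
    A small sketch of the kind of dominance relation such a multi-objective genetic algorithm can use to define fitness; the talk's exact operator is not given in the record, so the minimization objectives (total distance, number of vehicles) and the dominance-count ranking used here are illustrative assumptions.

```python
from typing import Sequence, List

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """a dominates b if it is no worse in every objective and strictly better in at least one
    (all objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def dominance_rank(objectives: List[Sequence[float]]) -> List[int]:
    """Fitness surrogate: count how many other solutions dominate each one (0 = non-dominated)."""
    return [sum(dominates(other, obj) for other in objectives) for obj in objectives]

# Toy routing solutions: (total distance, number of vehicles)
solutions = [(120.0, 4), (100.0, 5), (130.0, 4), (100.0, 4)]
print(dominance_rank(solutions))   # (100.0, 4) dominates the others -> ranks [1, 1, 2, 0]
```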

  5. Approximate Bayesian computation with functional statistics.

    Science.gov (United States)

    Soubeyrand, Samuel; Carpentier, Florence; Guiton, François; Klein, Etienne K

    2013-03-26

    Functional statistics are commonly used to characterize spatial patterns in general and spatial genetic structures in population genetics in particular. Such functional statistics also enable the estimation of parameters of spatially explicit (and genetic) models. Recently, Approximate Bayesian Computation (ABC) has been proposed to estimate model parameters from functional statistics. However, applying ABC with functional statistics may be cumbersome because of the high dimension of the set of statistics and the dependences among them. To tackle this difficulty, we propose an ABC procedure which relies on an optimized weighted distance between observed and simulated functional statistics. We applied this procedure to a simple step model, a spatial point process characterized by its pair correlation function, and a pollen dispersal model characterized by genetic differentiation as a function of distance. These applications showed how the optimized weighted distance improved estimation accuracy. In the discussion, we consider the application of the proposed ABC procedure to functional statistics characterizing non-spatial processes.
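
    A hedged sketch of ABC rejection with a weighted distance between observed and simulated functional statistics; the paper's optimized weighting is replaced here by a simple inverse-variance weight estimated from pilot simulations, and the decay-curve "statistic", prior range and tolerance quantile are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_statistic(theta, n_points=20):
    """Toy 'functional statistic': a decay curve evaluated on a grid, plus noise."""
    grid = np.linspace(0, 1, n_points)
    return np.exp(-theta * grid) + rng.normal(0, 0.02, n_points)

observed = simulate_statistic(theta=3.0)

# Pilot simulations give per-coordinate scales used to weight the distance.
pilot = np.array([simulate_statistic(rng.uniform(0.5, 6.0)) for _ in range(200)])
weights = 1.0 / (pilot.var(axis=0) + 1e-9)

def weighted_distance(s_obs, s_sim):
    return float(np.sqrt(np.sum(weights * (s_obs - s_sim) ** 2)))

# ABC rejection: keep the prior draws whose simulated statistic lies closest to the data.
priors = rng.uniform(0.5, 6.0, 5000)
dists = np.array([weighted_distance(observed, simulate_statistic(t)) for t in priors])
accepted = priors[dists <= np.quantile(dists, 0.01)]
print("posterior mean of theta ~", round(float(accepted.mean()), 2))
```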

  6. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  7. Towards synthetic cells:are cells computers making computers?

    Institute of Scientific and Technical Information of China (English)

    Antoine Danchin

    2009-01-01

    Understanding life supposes that one will, one day, reconstruct cells. A deep analysis of what life is shows that a cell is similar to a computer making computers. This calls for several original levels of organisation. First, the cell needs to be seen as a machine separated from the genetic program which it runs. Over generations the machine reproduces, while the program replicates. Reproduction is a process which is able to accumulate valuable information over generations. Extracting valuable information from an ocean of noise requires an energy-dependent process which uses energy to prevent degradation of functional entities. Analysis of bacterial genomes shows that the core set of genes which persist in most genomes code for the functions needed to perform this process of ratchet-like information accumulation. It also suggests that a mineral, polyphosphate, could be a ubiquitous (and stable) energy source essential for the process.

  8. Computing with functionals—computability theory or computer science?

    OpenAIRE

    Normann, Dag

    2006-01-01

    We review some of the history of the computability theory of functionals of higher types, and we will demonstrate how contributions from logic and theoretical computer science have shaped this still active subject.

  9. Chapter 10: Mining genome-wide genetic markers.

    Directory of Open Access Journals (Sweden)

    Xiang Zhang

    Full Text Available Genome-wide association study (GWAS) aims to discover genetic factors underlying phenotypic traits. The large number of genetic factors poses both computational and statistical challenges. Various computational approaches have been developed for large-scale GWAS. In this chapter, we will discuss several widely used computational approaches in GWAS. The following topics will be covered: (1) An introduction to the background of GWAS. (2) The existing computational approaches that are widely used in GWAS. This will cover single-locus, epistasis detection, and machine learning methods that have been recently developed in the biology, statistics, and computer science communities. This part will be the main focus of this chapter. (3) The limitations of current approaches and future directions.
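
    As an illustration of the single-locus approach mentioned in the chapter, the sketch below runs a per-SNP allelic chi-square test on simulated case-control genotypes (scipy is assumed); epistasis detection and machine-learning methods go beyond this. The cohort sizes, allele frequencies and the hypothetical risk SNP are assumptions.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(7)
n_cases = n_controls = 500
n_snps = 100
risk_snp = 10                                             # hypothetical associated SNP

def draw_genotypes(n, maf):
    return rng.binomial(2, maf, size=n)                   # additive 0/1/2 coding

mafs = rng.uniform(0.1, 0.5, n_snps)
controls = np.column_stack([draw_genotypes(n_controls, m) for m in mafs])
cases = np.column_stack([draw_genotypes(n_cases, m) for m in mafs])
cases[:, risk_snp] = draw_genotypes(n_cases, min(mafs[risk_snp] + 0.15, 0.95))  # enrich risk allele

def allelic_test(case_g, control_g):
    """2x2 allele-count chi-square test for one SNP; returns the p-value."""
    table = [[case_g.sum(), 2 * len(case_g) - case_g.sum()],
             [control_g.sum(), 2 * len(control_g) - control_g.sum()]]
    return chi2_contingency(table)[1]

pvals = [allelic_test(cases[:, j], controls[:, j]) for j in range(n_snps)]
print("most significant SNP:", int(np.argmin(pvals)), "p =", f"{min(pvals):.2e}")
```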

  10. Genetics in the courts

    Energy Technology Data Exchange (ETDEWEB)

    Coyle, Heather; Drell, Dan

    2000-12-01

    Various: (1) TriState 2000 Genetics in the Courts; (2) Growing impact of the new genetics on the courts; (3) Human testing; (4) Legal analysis - in re G.C.; (5) Legal analysis - GM 'peanots'; and (6) Legal analysis for State vs Miller

  11. Genetics and Developmental Psychology

    Science.gov (United States)

    Plomin, Robert

    2004-01-01

    One of the major changes in developmental psychology during the past 50 years has been the acceptance of the important role of nature (genetics) as well as nurture (environment). Past research consisting of twin and adoption studies has shown that genetic influence is substantial for most domains of developmental psychology. Present research…

  12. Quo Vadis, Medical Genetics?

    Science.gov (United States)

    Czeizel, Andrew E.

    The beginnings of human genetics and of its medical branch, medical genetics, were promising in the early decades of this century. Many genetic diseases and defects of Mendelian origin were identified, which helped families with a significant genetic burden to limit their number of children. Unfortunately, this good start was overshadowed by two tragic events. On the one hand, in the 1930s and early 1940s German fascism brought about the dominance of an unscientific eugenics used to mask vile political crimes; people with genetic diseases and defects were forcibly sterilised and several of them were killed. On the other hand, in the 1950s Lysenkoism inhibited the development of genetics in the Soviet Union and its satellite countries: Lysenko's doctrine declared genetics a product of imperialism and a guilty science, and consequently leading geneticists were ousted from their posts and some of them were executed or imprisoned. In recent decades genetics has produced fantastic new results and achieved a leading position within the natural sciences. To my mind, however, the expected wider use of a new eugenics points to a new tragedy, and this Cassandra's prediction is the topic of this presentation.

  13. Genetics and Developmental Psychology

    Science.gov (United States)

    Plomin, Robert

    2004-01-01

    One of the major changes in developmental psychology during the past 50 years has been the acceptance of the important role of nature (genetics) as well as nurture (environment). Past research consisting of twin and adoption studies has shown that genetic influence is substantial for most domains of developmental psychology. Present research…

  14. Ethical issues in genetics.

    Science.gov (United States)

    Shannon, T A

    1999-03-01

    The first section of the Notes on Moral Theology reviews ethical issues in genetics through the lenses of privacy-confidentiality; risk-benefit analysis in relation to prenatal diagnosis and gene therapy; and freedom-determinism/human dignity in the context of cloning. The author provides an overview of developments in genetics and highlights thematic issues common to these developments.

  15. Cryptic Genetic Variation in Evolutionary Developmental Genetics

    Directory of Open Access Journals (Sweden)

    Annalise B. Paaby

    2016-06-01

    Full Text Available Evolutionary developmental genetics has traditionally been conducted by two groups: Molecular evolutionists who emphasize divergence between species or higher taxa, and quantitative geneticists who study variation within species. Neither approach really comes to grips with the complexities of evolutionary transitions, particularly in light of the realization from genome-wide association studies that most complex traits fit an infinitesimal architecture, being influenced by thousands of loci. This paper discusses robustness, plasticity and lability, phenomena that we argue potentiate major evolutionary changes and provide a bridge between the conceptual treatments of macro- and micro-evolution. We offer cryptic genetic variation and conditional neutrality as mechanisms by which standing genetic variation can lead to developmental system drift and, sheltered within canalized processes, may facilitate developmental transitions and the evolution of novelty. Synthesis of the two dominant perspectives will require recognition that adaptation, divergence, drift and stability all depend on similar underlying quantitative genetic processes—processes that cannot be fully observed in continuously varying visible traits.

  16. Cryptic Genetic Variation in Evolutionary Developmental Genetics.

    Science.gov (United States)

    Paaby, Annalise B; Gibson, Greg

    2016-06-13

    Evolutionary developmental genetics has traditionally been conducted by two groups: Molecular evolutionists who emphasize divergence between species or higher taxa, and quantitative geneticists who study variation within species. Neither approach really comes to grips with the complexities of evolutionary transitions, particularly in light of the realization from genome-wide association studies that most complex traits fit an infinitesimal architecture, being influenced by thousands of loci. This paper discusses robustness, plasticity and lability, phenomena that we argue potentiate major evolutionary changes and provide a bridge between the conceptual treatments of macro- and micro-evolution. We offer cryptic genetic variation and conditional neutrality as mechanisms by which standing genetic variation can lead to developmental system drift and, sheltered within canalized processes, may facilitate developmental transitions and the evolution of novelty. Synthesis of the two dominant perspectives will require recognition that adaptation, divergence, drift and stability all depend on similar underlying quantitative genetic processes-processes that cannot be fully observed in continuously varying visible traits.

  17. Using a Computer Animation to Teach High School Molecular Biology

    Science.gov (United States)

    Rotbain, Yosi; Marbach-Ad, Gili; Stavy, Ruth

    2008-01-01

    We present an active way to use a computer animation in secondary molecular genetics class. For this purpose we developed an activity booklet that helps students to work interactively with a computer animation which deals with abstract concepts and processes in molecular biology. The achievements of the experimental group were compared with those…

  18. GENETIC-BASED NUTRITION RECOMMENDATION MODEL

    Directory of Open Access Journals (Sweden)

    S. A.A. Fayoumi

    2014-01-01

    Full Text Available Evolutionary computing is the collective name for a range of problem-solving techniques based on principles of biological evolution, such as natural selection and genetic inheritance. These techniques are being widely applied to a variety of problems in many vital fields. Evolutionary Algorithms (EAs), which apply the principles of evolutionary computation, such as the genetic algorithm, particle swarm optimization, ant colony optimization and the bees algorithm, play an important role in decision-making processes. EAs serve many fields that affect our lives directly, such as medicine, engineering, transportation and communications. One of these vital fields is nutrition, which can be viewed from several points of view: medical, physical, social, environmental and psychological. This study presents a proposed model that shows how evolutionary computing in general, and the genetic algorithm in particular, as a powerful evolutionary algorithm, can be used to recommend an appropriate nutrition style, on the medical and physical sides only, to each person according to his or her personal and medical measurements.

  19. THE MEANING OF GENETICS

    Directory of Open Access Journals (Sweden)

    Svenja Adolphs

    2003-05-01

    Full Text Available Research into the public understanding of genetics has greatly expanded lately. At the same time matters relating to biotechnology have seized the public's attention. Corpus linguistics has long asked questions about how meaning is created and changed in the public sphere through language use. However, linking corpus linguistics to the study of the public understanding of science is something too few have done. To correct this trend, we apply methods from corpus linguistics and cognitive linguistics to study how people talk about genetics. We do so by analysing the meaning of words like gene, genes, genetic, genetics, and genetically as found in various spoken and written corpora. Specifically, we examine how they take on certain (e.g. figurative) connotations and modulate in context.

  20. ADHD and genetic syndromes.

    Science.gov (United States)

    Lo-Castro, Adriana; D'Agati, Elisa; Curatolo, Paolo

    2011-06-01

    A high rate of Attention Deficit/Hyperactivity Disorder (ADHD)-like characteristics has been reported in a wide variety of disorders including syndromes with known genetic causes. In this article, we review the genetic and the neurobiological links between ADHD symptoms and some genetic syndromes such as: Fragile X Syndrome, Neurofibromatosis 1, DiGeorge Syndrome, Tuberous Sclerosis Complex, Turner Syndrome, Williams Syndrome and Klinefelter Syndrome. Although each syndrome may arise from different genetic abnormalities with multiple molecular functions, the effects of these abnormalities may give rise to common effects downstream in the biological pathways or neural circuits, resulting in the presentation of ADHD symptoms. Early diagnosis of ADHD allows for earlier treatment, and has the potential for a better outcome in children with genetic syndromes.

  1. Genetics of hepatocellular carcinoma

    Institute of Scientific and Technical Information of China (English)

    Andreas Teufel; Frank Staib; Stephan Kanzler; Arndt Weinmann; Henning Schulze-Bergkamen; Peter R Galle

    2007-01-01

    The completely assembled human genome has made it possible for modern medicine to step into an era rich in genetic information and high-throughput genomic analysis. These novel and readily available genetic resources and analytical tools may be the key to unravel the molecular basis of hepatocellular carcinoma (HCC). Moreover, since an efficient treatment for this disease is lacking, further understanding of the genetic background of HCC will be crucial in order to develop new therapies aimed at selected targets. We report on the current status and recent developments in HCC genetics. Special emphasis is given to the genetics and regulation of major signalling pathways involved in HCC such as p53, Wnt signalling, TGFβ, Ras, and Rb pathways. Furthermore, we describe the influence of chromosomal aberrations as well as of DNA methylation. Finally, we report on the rapidly developing field of genomic expression profiling in HCC, mainly by microarray analysis.

  2. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result of discu......

  3. Frequently Asked Questions about Genetic Testing

    Science.gov (United States)

    ... Care Specific Genetic Disorders Frequently Asked Questions About Genetic Testing What is genetic testing? What can I learn ... find more information about genetic testing? What is genetic testing? Genetic testing uses laboratory methods to look at ...

  4. Genetics Home Reference: genetic epilepsy with febrile seizures plus

    Science.gov (United States)

    ... Description: Genetic epilepsy with febrile seizures plus (GEFS+) is a ...

  5. Filter selection using genetic algorithms

    Science.gov (United States)

    Patel, Devesh

    1996-03-01

    Convolution operators act as matched filters for certain types of variations found in images and have been extensively used in the analysis of images. However, filtering through a bank of N filters generates N filtered images, consequently increasing the amount of data considerably. Moreover, not all these filters have the same discriminatory capabilities for the individual images, thus making the task of any classifier difficult. In this paper, we use genetic algorithms to select a subset of relevant filters. Genetic algorithms represent a class of adaptive search techniques where the processes are similar to natural selection of biological evolution. The steady state model (GENITOR) has been used in this paper. The reduction of filters improves the performance of the classifier (which in this paper is the multi-layer perceptron neural network) and furthermore reduces the computational requirement. In this study we use the Laws filters which were proposed for the analysis of texture images. Our aim is to recognize the different textures on the images using the reduced filter set.
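
    A hedged sketch of a GENITOR-style steady-state genetic algorithm selecting a filter subset with a binary chromosome; the toy fitness (a reward for a hypothetical set of "useful" filters minus a bank-size penalty) stands in for the paper's classifier-based evaluation, and all constants here are assumptions.

```python
import random

N_FILTERS = 25
USEFUL = {1, 4, 7, 13, 20}           # hypothetical "discriminative" filters for the toy fitness

def fitness(mask):
    """Reward selecting useful filters, penalize every filter kept (filter-bank cost)."""
    return 2.0 * sum(mask[i] for i in USEFUL) - 0.3 * sum(mask)

def crossover(a, b):
    cut = random.randrange(1, N_FILTERS)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in mask]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(N_FILTERS)] for _ in range(30)]
for _ in range(2000):                                 # steady-state (GENITOR-like) loop
    pop.sort(key=fitness, reverse=True)
    parent_a, parent_b = random.sample(pop[:10], 2)   # rank-biased parent selection
    child = mutate(crossover(parent_a, parent_b))
    if fitness(child) > fitness(pop[-1]):             # child replaces the worst individual
        pop[-1] = child
best = max(pop, key=fitness)
print("selected filters:", [i for i, bit in enumerate(best) if bit])
```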

  6. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  7. Genetic algorithms and parallel computing for a vehicle routing problem with time windows and split deliveries

    Directory of Open Access Journals (Sweden)

    Guilherme Guidolin de Campos

    2006-05-01

    parallel genetic algorithms supported by a cluster of computers. The results indicate that the basic constructive heuristic provides satisfactory results for the problem, but that it can be improved through the use of more sophisticated techniques. The use of the parallel genetic algorithm with multiple populations and an initial solution, which presented the best results, reduced the total operational costs by about 10% compared with the constructive heuristic, and by 13% when compared with the company's original solutions.

  8. Computational analysis of genetic loci required for amphid structure and functions and their possibly corresponding microRNAs in C. elegans

    Institute of Scientific and Technical Information of China (English)

    胡亚欧; 孙阳; 叶波平; 王大勇

    2007-01-01

    Objective To examine the important roles of microRNAs (miRNAs) in regulating amphid structure and function, we performed a computational analysis of the genetic loci required for sensory perception and their possibly corresponding miRNAs in C. elegans. Methods A total of 55 genetic loci required for amphid structure and function were selected. Sequence alignment was combined with E-value evaluation to investigate and identify the possible corresponding miRNAs. Results Among the 55 genetic loci selected, 30 genes have possible corresponding regulatory miRNA(s), and the identified genes participate in the regulation of almost all aspects of amphid structure and function. In addition, our data suggest that both amphid structure and amphid functions might be regulated by a series of networked signaling pathways. Moreover, the distribution of miRNAs along the 3' untranslated region (UTR) of these 30 genes exhibits different patterns. Conclusion We present possible miRNA-mediated signaling pathways involved in the regulation of chemosensation and thermosensation through control of the corresponding sensory neuron and interneuron functions. Our work will be useful for a better understanding of the miRNA-mediated control of chemotaxis and thermotaxis in C. elegans.

  9. Genetic Algorithm-Based Relevance Feedback for Image Retrieval Using Local Similarity Patterns.

    Science.gov (United States)

    Stejic, Zoran; Takama, Yasufumi; Hirota, Kaoru

    2003-01-01

    Proposes local similarity pattern (LSP) as a new method for computing digital image similarity. Topics include optimizing similarity computation based on genetic algorithm; relevance feedback; and an evaluation of LSP on five databases that showed an increase in retrieval precision over other methods for computing image similarity. (Author/LRW)

  10. gPGA: GPU Accelerated Population Genetics Analyses.

    Directory of Open Access Journals (Sweden)

    Chunbao Zhou

    Full Text Available The isolation with migration (IM) model is important for studies in population genetics and phylogeography. The IM program applies the IM model to genetic data drawn from a pair of closely related populations or species based on Markov chain Monte Carlo (MCMC) simulations of gene genealogies. But the computational burden of the IM program has placed limits on its application. With strong computational power, the Graphics Processing Unit (GPU) has been widely used in many fields. In this article, we present an effective implementation of the IM program on one GPU based on the Compute Unified Device Architecture (CUDA), which we call gPGA. Compared with the IM program, gPGA can achieve up to 52.30X speedup on one GPU. The evaluation results demonstrate that it allows datasets to be analyzed effectively and rapidly for research on divergence population genetics. The software is freely available with source code at https://github.com/chunbaozhou/gPGA.

  11. Resizing Technique-Based Hybrid Genetic Algorithm for Optimal Drift Design of Multistory Steel Frame Buildings

    Directory of Open Access Journals (Sweden)

    Hyo Seon Park

    2014-01-01

    Full Text Available Since genetic algorithm-based optimization methods are computationally expensive for practical use in the field of structural optimization, a resizing technique-based hybrid genetic algorithm for the drift design of multistory steel frame buildings is proposed to increase the convergence speed of genetic algorithms. To reduce the number of structural analyses required for the convergence, a genetic algorithm is combined with a resizing technique that is an efficient optimal technique to control the drift of buildings without the repetitive structural analysis. The resizing technique-based hybrid genetic algorithm proposed in this paper is applied to the minimum weight design of three steel frame buildings. To evaluate the performance of the algorithm, optimum weights, computational times, and generation numbers from the proposed algorithm are compared with those from a genetic algorithm. Based on the comparisons, it is concluded that the hybrid genetic algorithm shows clear improvements in convergence properties.

  12. All about Genetics (For Parents)

    Science.gov (United States)

    ... or sequence) of these four bases determines each genetic code. The segments of DNA that contain the instructions ... laboratory dyes. continue Genetic Problems Errors in the genetic code or "gene recipe" can happen in a variety ...

  13. Genetics Home Reference: bipolar disorder

    Science.gov (United States)

    ... bipolar ... my family? What is the prognosis of a genetic condition? Genetic and Rare Diseases Information Center Frequency ...

  14. Genetics Home Reference: vibratory urticaria

    Science.gov (United States)

    ... in allergy symptoms such as hives (urticaria), swelling (angioedema), redness (erythema), and itching (pruritus) in the affected ... Genetic Testing (2 links) Genetic Testing Registry: Vibratory angioedema Genetic Testing Registry: Vibratory urticaria General Information from ...

  15. Computation Through Neuronal Oscillations

    Science.gov (United States)

    Hepp, K.

    Some of us believe that natural sciences are governed by simple and predictive general principles. This hope has not yet been fulfilled in physics for unifying gravitation and quantum mechanics. Epigenetics has shaken the monopoly of the genetic code to determine inheritance (Alberts et al., Molecular Biology of the Cell. Garland, New York, 2008). It is therefore not surprising that quantum mechanics does not explain consciousness or more generally the coherence of the brain in perception, action and cognition. In another context, others (Tegmark, Phys Rev E 61:4194-4206, 2000) and we (Koch and Hepp, Nature 440:611-612, 2006; Koch and Hepp, Visions of Discovery: New Light on Physics, Cosmology, and Consciousness. Cambridge University Press, Cambridge, 2011) have strongly argued against the absurdity of such a claim, because consciousness is a higher brain function and not a molecular binding mechanism. Decoherence in the warm and wet brain is by many orders of magnitude too strong. Moreover, there are no efficient algorithms for neural quantum computations. However, the controversy over classical and quantum consciousness will probably never be resolved (see e.g. Hepp, J Math Phys 53:095222, 2012; Hameroff and Penrose, Phys Life Rev 11:39-78, 2013).

  16. Genetics Home Reference: polycystic kidney disease

    Science.gov (United States)

    ... links) Genetic Testing Registry: Autosomal recessive polycystic kidney disease Genetic Testing Registry: Polycystic kidney disease 2 Genetic Testing Registry: Polycystic kidney disease 3 Genetic Testing ...

  17. Genetic-Algorithm Tool For Search And Optimization

    Science.gov (United States)

    Wang, Lui; Bayer, Steven

    1995-01-01

    SPLICER computer program used to solve search and optimization problems. Genetic algorithms adaptive search procedures (i.e., problem-solving methods) based loosely on processes of natural selection and Darwinian "survival of fittest." Algorithms apply genetically inspired operators to populations of potential solutions in iterative fashion, creating new populations while searching for optimal or nearly optimal solution to problem at hand. Written in Think C.

  18. Genome-Wide Prediction of C. elegans Genetic Interactions

    OpenAIRE

    Zhong, Weiwei; Sternberg, Paul W.

    2006-01-01

    To obtain a global view of functional interactions among genes in a metazoan genome, we computationally integrated interactome data, gene expression data, phenotype data, and functional annotation data from three model organisms—Saccharomyces cerevisiae, Caenorhabditis elegans, and Drosophila melanogaster—and predicted genome-wide genetic interactions in C. elegans. The resulting genetic interaction network (consisting of 18,183 interactions) provides a framework for system-level understandin...

  19. Solving Hitchcock's transportation problem by a genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    CHEN Hai-feng; CHO Joong Rae; LEE Jeong-Tae

    2004-01-01

    Genetic algorithms (GAs) employ the evolutionary process of Darwin's natural selection theory to find solutions to optimization problems. In this paper, an implementation of a genetic algorithm is put forward to solve a classical transportation problem, namely Hitchcock's Transportation Problem (HTP), and the GA is improved to search for all optimal solutions and identify them automatically. The algorithm is coded in C++ and validated by numerical examples. The computational results show that the algorithm is efficient for solving Hitchcock's transportation problem.

  20. Pass-ball training based on genetic reinforcement learning

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Introduces a hybrid genetic algorithm and reinforcement learning computation model for independent agent learning in a continuous, distributed, open environment. The model takes full advantage of the reactivity and robustness of reinforcement learning algorithms and of the property that genetic algorithms are suitable for problems with high dimension, large populations and complex environments, and it concludes that, with proper training, the results verify that this method is effective in complex multi-agent environments.

  1. Optimal Design of Materials for DJMP Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    FENG Zhong-ren; WANG Xiong-jiang

    2004-01-01

    The genetic algorithm was used in the optimal design of the deep jet method pile. The cost of the deep jet method pile per unit area of foundation was taken as the objective function. All the restraints were listed following the corresponding specification, and suggestions for modification were proposed. A real-coded genetic algorithm was given to deal with the problems of excessive computational cost and premature convergence. A software system for the optimal design of the deep jet method pile was developed.

  2. Modeling evolution of insect resistance to genetically modified crops

    OpenAIRE

    2015-01-01

    Genetically modified crops producing insecticidal proteins from Bacillus thuringiensis (Bt) for insect control have been planted on more than 200 million ha worldwide since 1996 [1]. Evolution of resistance by insect pests threatens the continued success of Bt crops [2, 3]. To delay pest resistance, refuges of non-Bt crops are planted near Bt crops to allow survival of susceptible pests [4, 5]. We used computer simulations of a population genetic model to determine if predictions from the the...
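
    A minimal single-locus population-genetic recursion of the sort such simulations build on: genotype fitnesses differ on Bt plants and in the refuge, and the resistance (R) allele frequency is iterated under random mating. The fitness values, refuge fraction and initial frequency are illustrative assumptions, not parameters from the cited studies.

```python
def resistance_trajectory(p0=0.001, refuge=0.2, generations=60,
                          fitness_bt=(0.02, 0.05, 1.0), fitness_refuge=(1.0, 1.0, 0.95)):
    """Single-locus recursion: genotype fitnesses (SS, RS, RR) on Bt plants vs. in the refuge.
    Returns the resistance (R) allele frequency through time under random mating."""
    p = p0
    freqs = [p]
    for _ in range(generations):
        q = 1 - p
        geno = [q * q, 2 * p * q, p * p]                  # Hardy-Weinberg SS, RS, RR
        # Fitness of each genotype averaged over Bt fields and refuge acreage
        w = [(1 - refuge) * wb + refuge * wr
             for wb, wr in zip(fitness_bt, fitness_refuge)]
        w_bar = sum(g * wi for g, wi in zip(geno, w))
        # New R-allele frequency after selection (R allele carried by half of RS plus all RR)
        p = (geno[1] * w[1] * 0.5 + geno[2] * w[2]) / w_bar
        freqs.append(p)
    return freqs

with_refuge = resistance_trajectory(refuge=0.2)
no_refuge = resistance_trajectory(refuge=0.0)
print("R frequency after 60 generations: refuge=0.2 ->", round(with_refuge[-1], 3),
      "| refuge=0 ->", round(no_refuge[-1], 3))
```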

  3. A GPU-Computing Approach to Solar Stokes Profile Inversion

    CERN Document Server

    Harker, Brian J

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specifically for use with a GPU, we produce full-disc maps of the photospheric vector magnetic field from polarized spectral line observations recorded by the Synoptic Optical Long-term Investigations of the Sun (SOLIS) Vector Spectromagnetograph (VSM) instrument. We show the advantages of pairing a population-parallel genetic algorithm with data-parallel GPU-computing techniques, and present an overview of the Stokes inversion problem, including a description of our adaptation to the GPU-computing paradigm. Full-disc vector ma...
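
    GENESIS pairs a population-parallel genetic algorithm with data-parallel GPU computing: every candidate solution's fitness can be evaluated simultaneously. The NumPy sketch below illustrates that batching pattern on the CPU as a stand-in for a GPU kernel; the toy Gaussian "forward model" is only a placeholder for the Milne-Eddington synthesis, and all array shapes and parameters are illustrative assumptions.

      import numpy as np

      def batched_fitness(population, observed, forward_model):
          """Evaluate the whole GA population in one vectorized call (the data-parallel
          pattern a GPU implementation exploits).  population: (n_individuals, n_params)."""
          synthetic = forward_model(population)                  # (n_individuals, n_wavelengths)
          return -np.sum((synthetic - observed) ** 2, axis=1)    # higher fitness = smaller misfit

      # Toy stand-in for a Stokes-profile forward model: a Gaussian line parameterized
      # by depth, center and width (purely illustrative, not Milne-Eddington).
      wavelengths = np.linspace(-1.0, 1.0, 64)
      def toy_forward(params):
          depth, center, width = params[:, :1], params[:, 1:2], params[:, 2:3]
          return 1.0 - depth * np.exp(-((wavelengths - center) / width) ** 2)

      truth = toy_forward(np.array([[0.6, 0.1, 0.3]]))[0]
      population = np.random.rand(256, 3) * np.array([1.0, 0.5, 0.5]) + np.array([0.0, -0.25, 0.05])
      print(int(np.argmax(batched_fitness(population, truth, toy_forward))))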

  4. Synthetic analog and digital circuits for cellular computation and memory

    OpenAIRE

    Purcell, Oliver; Lu, Timothy K.

    2014-01-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss r...

  5. SOME PARADIGMS OF ARTIFICIAL INTELLIGENCE IN FINANCIAL COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2015-12-01

    Full Text Available The article discusses some paradigms of artificial intelligence in the context of their applications in computer financial systems. The proposed approach has a significant potential to increase the competitiveness of enterprises, including financial institutions. However, it requires the effective use of supercomputers, grids and cloud computing. A reference is made to the computing environment for Bitcoin. In addition, we characterized genetic programming and artificial neural networks to prepare investment strategies on the stock exchange market.

  6. On Derivations Of Genetic Algebras

    Science.gov (United States)

    Mukhamedov, Farrukh; Qaralleh, Izzat

    2014-11-01

    A genetic algebra is a (possibly non-associative) algebra used to model inheritance in genetics. In genetic applications this algebra often has a basis corresponding to genetically distinct gametes, and the structure constants of the algebra encode the probabilities of producing offspring of various types. In this paper, we find the connection between genetic algebras and evolution algebras. Moreover, we prove the existence of nontrivial derivations of genetic algebras in dimension two.

  7. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  8. Genetically Engineered Cyanobacteria

    Science.gov (United States)

    Zhou, Ruanbao (Inventor); Gibbons, William (Inventor)

    2015-01-01

    The disclosed embodiments provide cyanobacteria spp. that have been genetically engineered to have increased production of carbon-based products of interest. These genetically engineered hosts efficiently convert carbon dioxide and light into carbon-based products of interest such as long-chain hydrocarbons. Several constructs containing polynucleotides encoding enzymes active in the metabolic pathways of cyanobacteria are disclosed. In many instances, the cyanobacteria strains have been further genetically modified to optimize production of the carbon-based products of interest. The optimization includes both up-regulation and down-regulation of particular genes.

  9. [Genetic risk and discrimination].

    Science.gov (United States)

    Vidal Gallardo, Mercedes

    2010-01-01

    The continuous advances of our society over the last decades have made personal genetic data accessible. Although this development has important benefits, it also creates a great paradox, since genetic information can become an element of social stigma and its inappropriate use can damage fundamental rights. It is clear that there are cases in which genetic risk, that is, a person's predisposition to certain illnesses, can be a discriminatory element, especially in the contractual field.

  10. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation provides comprehensive coverage of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, the development of processors and storage systems, and the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  11. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  13. Genetics Home Reference: Liddle syndrome

    Science.gov (United States)

    ... unknown. The condition has been found in populations worldwide.

  14. Computational thinking and thinking about computing.

    Science.gov (United States)

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  15. Predictability of Genetic Interactions from Functional Gene Modules

    Directory of Open Access Journals (Sweden)

    Jonathan H. Young

    2017-02-01

    Full Text Available Characterizing genetic interactions is crucial to understanding cellular and organismal response to gene-level perturbations. Such knowledge can inform the selection of candidate disease therapy targets, yet experimentally determining whether genes interact is technically nontrivial and time-consuming. High-fidelity prediction of different classes of genetic interactions in multiple organisms would substantially alleviate this experimental burden. Under the hypothesis that functionally related genes tend to share common genetic interaction partners, we evaluate a computational approach to predict genetic interactions in Homo sapiens, Drosophila melanogaster, and Saccharomyces cerevisiae. By leveraging knowledge of functional relationships between genes, we cross-validate predictions on known genetic interactions and observe high predictive power of multiple classes of genetic interactions in all three organisms. Additionally, our method suggests high-confidence candidate interaction pairs that can be directly experimentally tested. A web application is provided for users to query genes for predicted novel genetic interaction partners. Finally, by subsampling the known yeast genetic interaction network, we found that novel genetic interactions are predictable even when knowledge of currently known interactions is minimal.
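
    The working hypothesis above is that functionally related genes tend to share genetic interaction partners. The sketch below is a deliberately simplified, hypothetical illustration of that guilt-by-association idea, not the authors' method or data: a candidate pair is scored by the fraction of the query gene's functional-module co-members that already have a known interaction with the candidate partner.

      # Hypothetical toy data: functional modules and known genetic interactions.
      modules = {"ribosome": {"geneA", "geneB", "geneC"}, "proteasome": {"geneD", "geneE"}}
      known_interactions = {frozenset(p) for p in [("geneA", "geneX"), ("geneB", "geneX"), ("geneD", "geneY")]}

      def interaction_score(gene, partner):
          """Fraction of the gene's module co-members that already interact with the partner.
          A high score suggests (gene, partner) is a candidate novel genetic interaction."""
          co_members = set()
          for members in modules.values():
              if gene in members:
                  co_members |= members - {gene}
          if not co_members:
              return 0.0
          hits = sum(frozenset((m, partner)) in known_interactions for m in co_members)
          return hits / len(co_members)

      print(interaction_score("geneC", "geneX"))   # 1.0: both module co-members interact with geneX
      print(interaction_score("geneE", "geneX"))   # 0.0: no supporting evidence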

  16. Emergence of Algorithmic Languages in Genetic Systems

    CERN Document Server

    Angeles, O; Waelbroeck, H

    1997-01-01

    In genetic systems there is a non-trivial interface between the sequence of symbols which constitutes the chromosome, or ``genotype'', and the products which this sequence encodes --- the ``phenotype''. This interface can be thought of as a ``computer''. In this case the chromosome is viewed as an algorithm and the phenotype as the result of the computation. In general only a small fraction of all possible sequences of symbols makes any sense for a given computer. The difficulty of finding meaningful algorithms by random mutation is known as the brittleness problem. In this paper we show that mutation and crossover favour the emergence of an algorithmic language which facilitates the production of meaningful sequences following random mutations of the genotype. We base our conclusions on an analysis of the population dynamics of a variant of Kitano's neurogenetic model wherein the chromosome encodes the rules for cellular division and the phenotype is a 16-cell organism interpreted as a connectivity matrix fo...

  17. Transethnic Genetic-Correlation Estimates from Summary Statistics.

    Science.gov (United States)

    Brown, Brielin C; Ye, Chun Jimmie; Price, Alkes L; Zaitlen, Noah

    2016-07-07

    The increasing number of genetic association studies conducted in multiple populations provides an unprecedented opportunity to study how the genetic architecture of complex phenotypes varies between populations, a problem important for both medical and population genetics. Here, we have developed a method for estimating the transethnic genetic correlation: the correlation of causal-variant effect sizes at SNPs common to both populations. This method takes advantage of the entire spectrum of SNP associations and uses only summary-level data from genome-wide association studies. This avoids the computational costs and privacy concerns associated with genotype-level information while remaining scalable to hundreds of thousands of individuals and millions of SNPs. We applied our method to data on gene expression, rheumatoid arthritis, and type 2 diabetes and overwhelmingly found that the genetic correlation was significantly less than 1. Our method is implemented in a Python package called Popcorn.
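
    Popcorn estimates the transethnic genetic correlation from the full spectrum of summary statistics while accounting for linkage disequilibrium and sampling noise. The sketch below is deliberately naive and is not Popcorn's estimator: it simply computes the Pearson correlation of simulated per-SNP effect-size estimates shared between two populations, which illustrates the quantity of interest but is biased by estimation noise.

      import numpy as np

      def naive_effect_size_correlation(beta_pop1, beta_pop2):
          """Pearson correlation of per-SNP effect-size estimates shared between two populations.
          Deliberately naive: unlike Popcorn it ignores LD, sampling noise, and allele-frequency
          differences, so it is biased toward zero; it only illustrates the target quantity."""
          return np.corrcoef(beta_pop1, beta_pop2)[0, 1]

      rng = np.random.default_rng(0)
      true_effects = rng.normal(size=5000)
      # Simulated summary statistics: shared causal effects plus population-specific noise.
      beta_pop1 = true_effects + rng.normal(scale=0.5, size=5000)
      beta_pop2 = true_effects + rng.normal(scale=0.5, size=5000)
      print(round(naive_effect_size_correlation(beta_pop1, beta_pop2), 2))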

  18. Potential of Microsatellites Markers for the Genetic Analysis of Bryophytes

    Directory of Open Access Journals (Sweden)

    Saumy PANDEY

    2016-03-01

    Full Text Available Microsatellites have increasingly been used to study genetic diversity, phylogeny, population genetics, population ecology, and genetic mapping of bryophytes. Because they are co-dominant and highly reproducible, microsatellites have become markers of choice for many genetic analyses of bryophytes. Their major limitation, however, is the de novo isolation of microsatellites from the species of interest, which traditionally required constructing genomic libraries. These traditional methods of microsatellite development were tedious and time-consuming, but the growing number of sequenced bryophyte genomes in public databases, together with advances in PCR technologies and computer software, has greatly facilitated the development of microsatellites for bryophyte studies. This review examines the features of microsatellites, strategies for their development, and their utilization in many aspects of genetic and ecological studies of bryophytes.
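
    The software-assisted microsatellite development mentioned above ultimately relies on scanning assembled sequences for short tandem repeats. The following small Python sketch finds perfect microsatellites with a regular expression; the motif lengths, minimum repeat count, and example sequence are illustrative assumptions rather than the thresholds used by any particular SSR-mining tool.

      import re

      def find_microsatellites(sequence, min_unit=2, max_unit=6, min_repeats=4):
          """Scan a DNA sequence for perfect short tandem repeats (microsatellites/SSRs):
          motifs of length min_unit..max_unit repeated at least min_repeats times."""
          hits = []
          for unit in range(min_unit, max_unit + 1):
              pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1))
              for m in pattern.finditer(sequence):
                  hits.append((m.start(), m.group(1), len(m.group(0)) // unit))
          return hits   # (position, repeat motif, number of repeats)

      seq = "TTGACACACACACACGGTAGTAGTAGTAGTTACGT"
      print(find_microsatellites(seq))   # finds the (AC)6 and (TAG)4 repeats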

  19. [Investigation on the integrative course of genetics and genomics].

    Science.gov (United States)

    Liu, Zhi-Xiang; Xu, Gang-Biao; Zeng, Chao-Zhen; Wang, Ai-Yun; Wu, Ruo-Yan

    2011-07-01

    Genomics is an important subdiscipline of genetics that forms a complete research system based on novel theories and techniques. Incorporating genomics into the undergraduate curriculum responds to the needs of the development of genetics. The teaching of genomics has significant advantages in developing scientific thinking, bioethics awareness, and professional interest among undergraduate students. The integration of genomics into genetics is in accordance with the principles of subject development and education. Related textbooks for undergraduate education are currently available in China, and it is feasible to set up an integrative genetics and genomics course by modifying the teaching contents of the genetics course, selecting appropriate teaching approaches, and making optimal use of computer-assisted instruction.

  20. Edge detection of range images using genetic neural networks

    Institute of Scientific and Technical Information of China (English)

    FAN Jian-ying; DU Ying; ZHOU Yang; WANG Yang

    2009-01-01

    Due to complexity and asymmetrical illumination, object images are difficult to segment effectively with routine methods. In this paper, an edge detection method for range images based on image features and a genetic-algorithm neural network is proposed. Fully considering the essential difference between an edge point and a noise point, the algorithm extracts characteristic parameters from range maps as inputs to the network. A genetic neural network was designed and implemented: the network is trained by a genetic algorithm, combining the global optimization ability of the genetic algorithm with the parallel computation of the neural network, so the method has good global search properties. The experimental results show that this method obtains much faster and more accurate detection results than the classical differential algorithm and has better anti-noise performance.
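
    The key idea in the record is training a neural network's weights with a genetic algorithm rather than gradient descent. The sketch below applies that idea to a tiny one-layer classifier on synthetic two-feature data standing in for "edge point vs. noise point" features; the features, network size, and GA settings are illustrative assumptions, not the paper's design.

      import numpy as np

      rng = np.random.default_rng(1)

      def predict(weights, features):
          """Tiny one-layer network: sigmoid(features @ w + b); weights = [w..., b]."""
          z = features @ weights[:-1] + weights[-1]
          return 1.0 / (1.0 + np.exp(-z))

      def fitness(weights, features, labels):
          return -np.mean((predict(weights, features) - labels) ** 2)   # negative MSE

      # Synthetic "edge point vs. noise point" features (e.g. local depth gradient, neighborhood
      # variance); purely illustrative, not the paper's range-image feature set.
      features = rng.normal(size=(200, 2))
      labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(float)

      # Genetic algorithm over weight vectors: truncation selection + Gaussian mutation.
      pop = rng.normal(size=(40, 3))
      for _ in range(100):
          scores = np.array([fitness(w, features, labels) for w in pop])
          parents = pop[np.argsort(scores)[-10:]]                        # keep the 10 best
          children = parents[rng.integers(0, 10, size=30)] + rng.normal(scale=0.2, size=(30, 3))
          pop = np.vstack([parents, children])

      best = pop[np.argmax([fitness(w, features, labels) for w in pop])]
      accuracy = np.mean((predict(best, features) > 0.5) == labels)
      print(round(float(accuracy), 2))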