WorldWideScience

Sample records for system-level genetic codes

  1. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

    Full Text Available The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we offer a potential explanation of why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  2. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we offer a potential explanation of why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  3. Genetic coding and gene expression - new Quadruplet genetic coding model

    Science.gov (United States)

    Shankar Singh, Rama

    2012-07-01

    The successful completion of the Human Genome Project has opened the door not only to developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may help make the 21st century a century of the biological sciences as well. According to the central dogma of biology, genetic codons in conjunction with tRNA play a key role in translating RNA bases into a sequence of amino acids, leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons, involving three bases of RNA transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new Quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation processes used remain the same, but the Quadruplet codons help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new Quadruplet genetic coding model and its potential applications, including relevance to the origin of life, will be presented.
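    As a rough illustration of the combinatorics behind the abstract above: a triplet code over four bases yields 4^3 = 64 codons, while a quadruplet scheme would yield 4^4 = 256. The sketch below only enumerates codon spaces; it does not reproduce the model's actual coding assignments.

```python
from itertools import product

BASES = "ACGU"  # the four RNA bases

def codon_space(length):
    """Enumerate all possible codons of a given length over the RNA alphabet."""
    return ["".join(p) for p in product(BASES, repeat=length)]

triplets = codon_space(3)
quadruplets = codon_space(4)

print(len(triplets))      # 64 triplet codons in the standard code
print(len(quadruplets))   # 256 codons under a quadruplet scheme
```

The 64-codon space is what the canonical code maps onto 20 amino acids plus stop; a quadruplet scheme enlarges that space fourfold.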

  4. Computation of the Genetic Code

    Science.gov (United States)

    Kozlov, Nicolay N.; Kozlova, Olga N.

    2018-03-01

    One of the open problems in the development of a mathematical theory of the genetic code (summarized in [1], with details in [2]) is the problem of computing the genetic code. No comparable problem is known elsewhere, and it could only be posed in the 21st century. This work is devoted to one approach to solving it. For the first time we provide a detailed description of the method for computing the genetic code, the idea of which was first published earlier [3]; the choice of one of the most important sets used in the computation was based on [4]. That set of amino acids corresponds to a complete set of representations of the collection of overlapping triplet genes belonging to the same DNA strand. A separate issue was the choice of the initial point that triggers the iterative search through all codes represented by the initial data. Mathematical analysis showed that the set in question contains certain ambiguities, which were found thanks to our proposed compressed representation of the set. As a result, the developed computation method was limited to two main stages of research, where at the first stage only part of the area was used in the calculations. The proposed approach significantly reduces the amount of computation at each step of this complex discrete structure.

  5. Evolutionary implications of genetic code deviations

    International Nuclear Information System (INIS)

    Chela Flores, J.

    1986-07-01

    By extending the standard genetic code into a temperature dependent regime, we propose a train of molecular events leading to alternative coding. The first few examples of these deviations have already been reported in some ciliated protozoans and Gram positive bacteria. A possible range of further alternative coding, still within the context of universality, is pointed out. (author)

  6. System Level Evaluation of Innovative Coded MIMO-OFDM Systems for Broadcasting Digital TV

    Directory of Open Access Journals (Sweden)

    Y. Nasser

    2008-01-01

    Full Text Available Single-frequency networks (SFNs) for broadcasting digital TV are a topic of theoretical and practical interest for future broadcasting systems. Although progress has been made in characterizing SFNs, considerable gaps remain in their deployment with MIMO techniques. The contribution of this paper is multifold. First, we investigate the possibility of applying a space-time (ST) encoder between the antennas of two sites in an SFN. Then, we introduce a 3D space-time-space block code for future terrestrial digital TV in SFN architectures. The proposed 3D code is based on a double-layer structure designed for intercell and intracell space-time-coded transmissions. Finally, we propose to adapt a technique called effective exponential signal-to-noise ratio (SNR) mapping (EESM) to predict the bit error rate (BER) at the output of the channel decoder in MIMO systems. The EESM technique as well as the simulation results are used to double-check the efficiency of our 3D code. This efficiency is obtained for equal and unequal received powers, whatever the location of the receiver, by adequately combining ST codes. The 3D code is therefore a very promising candidate for SFN architectures with MIMO transmission.
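    The EESM technique mentioned above compresses the per-subcarrier SNRs into a single effective SNR, SNR_eff = -beta * ln((1/N) * sum(exp(-snr_i/beta))), whose AWGN performance approximates that of the multicarrier link. A minimal sketch; the beta and SNR values below are illustrative, since beta is calibrated per modulation and coding scheme and is not taken from the paper.

```python
import math

def eesm(snrs_linear, beta):
    """Effective Exponential SNR Mapping: compress per-subcarrier linear SNRs
    into one effective SNR. beta is a calibration parameter that depends on
    the modulation and coding scheme."""
    n = len(snrs_linear)
    avg = sum(math.exp(-g / beta) for g in snrs_linear) / n
    return -beta * math.log(avg)

# Illustrative values only (linear SNRs across four subcarriers):
subcarrier_snrs = [2.0, 8.0, 1.0, 4.0]
print(eesm(subcarrier_snrs, beta=1.5))
```

By convexity of the exponential, the effective SNR never exceeds the arithmetic mean of the per-subcarrier SNRs, and it equals the common value when all SNRs are identical.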

  7. The Genetic Code: Yesterday, Today and Tomorrow

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education. The Genetic Code: Yesterday, Today and Tomorrow. Jiqiang Ling, Dieter Söll. General Article, Volume 17, Issue 12, December 2012, pp 1136-1142.

  8. Manchester Coding Option for SpaceWire: Providing Choices for System Level Design

    Science.gov (United States)

    Rakow, Glenn; Kisin, Alex

    2014-01-01

    This paper proposes an optional coding scheme for SpaceWire in lieu of the current Data Strobe scheme, for three reasons: first, to provide a straightforward method for electrical isolation of the interface; second, to reduce the mass and bend radius of the SpaceWire cable; and third, to provide a common physical layer over which multiple spacecraft onboard data link protocols could operate across a wide range of data rates. The intent is to accomplish these goals without significant change to existing SpaceWire design investments. The option to use Manchester coding in place of the current Data Strobe coding makes the signal transitions DC balanced, unlike SpaceWire's Data Strobe coding, and therefore allows the electrical interface to be isolated without concern. Additionally, because Manchester coding carries the clock and data on the same signal, the number of wires in the existing SpaceWire cable could optionally be reduced by 50%. This reduction could be an important consideration for many users of SpaceWire, as indicated by the effort already underway in the SpaceWire working group to reduce cable mass and bend radius by eliminating shields; halving the signal count would provide even greater gains. It is proposed to restrict the optional Manchester coding to a fixed data rate of 10 megabits per second (Mbps) in order to keep the necessary changes simple and still able to run in current radiation-tolerant Field Programmable Gate Arrays (FPGAs). Even with this constraint, 10 Mbps meets many applications where SpaceWire is used, including command and control applications and many instrument applications with moderate data rates. For most NASA flight implementations, SpaceWire designs are in rad-tolerant FPGAs, and preserving the heritage design investment is important for cost and risk considerations.
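    The DC balance and embedded clock that the abstract relies on follow directly from Manchester coding's rule of encoding every bit as a transition. A minimal sketch using the IEEE 802.3 convention (0 encoded as high-to-low, 1 as low-to-high); the convention actually chosen for the SpaceWire option may differ.

```python
def manchester_encode(bits):
    """Manchester-encode a bit sequence (IEEE 802.3 convention).
    Each bit becomes two half-bit levels with a mid-bit transition,
    so the line signal is DC balanced and the receiver can recover
    the clock from the data signal itself."""
    HALF = {0: (1, 0), 1: (0, 1)}  # (first half-bit, second half-bit)
    out = []
    for b in bits:
        out.extend(HALF[b])
    return out

signal = manchester_encode([1, 0, 1, 1])
print(signal)                    # [0, 1, 1, 0, 0, 1, 0, 1]
print(sum(signal), len(signal))  # equal numbers of highs and lows -> DC balance
```

Because every bit contributes exactly one high and one low half-bit, any Manchester-coded stream has zero DC component, which is what permits transformer or capacitive isolation of the interface.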

  9. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
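    One of the properties mentioned above, comma-freeness, can be checked directly from its definition: no overlapped reading frame of two concatenated codons of the set may itself be a codon of the set. A small sketch on toy codon sets (this is an independent illustration, not GCAT's API):

```python
from itertools import product

def is_comma_free(codons):
    """A codon set S is comma-free if, for any two codons abc and def in S
    (including a codon paired with itself), the shifted reading frames
    bcd and cde never land in S -- so an out-of-frame reading can never
    be mistaken for a valid codon."""
    s = set(codons)
    for (a, b, c), (d, e, f) in product(s, repeat=2):
        if (b, c, d) in s or (c, d, e) in s:
            return False
    return True

# Tiny illustrative sets (tuples of bases):
print(is_comma_free([tuple("ACG"), tuple("ACU")]))  # True
print(is_comma_free([tuple("AAC"), tuple("ACA")]))  # False: shifted frame ACA is in S
```

Circularity is a weaker condition (frame retrievability on circular words) and needs a more involved check; comma-freeness is the simplest member of that family.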

  10. Representation mutations from standard genetic codes

    Science.gov (United States)

    Aisah, I.; Suyudi, M.; Carnia, E.; Suhendi; Supriatna, A. K.

    2018-03-01

    Graphs are widely used in everyday life, especially to describe a model problem concretely and clearly. Graphs are also used to facilitate solving various kinds of problems that are difficult to solve by direct calculation. In biology, a graph can describe the process of protein synthesis from DNA. Proteins play an important role for DNA (deoxyribonucleic acid) and RNA (ribonucleic acid), and are composed of amino acids. In this study, amino acids are related to genetics, especially the genetic code. The genetic code is also known as the triplet or codon code, a three-letter arrangement of DNA nitrogenous bases. The bases are adenine (A), thymine (T), guanine (G) and cytosine (C), while in RNA thymine (T) is replaced with uracil (U). The set of all nitrogenous bases in RNA is denoted by N = {C, U, A, G}. Codons operate during protein synthesis inside the cell, and also encode the stop signal that terminates the protein synthesis process. This paper examines the process of protein synthesis through mathematical study and presents it in three-dimensional space as a graph. The study begins by analysing the set of all codons, denoted NNN, in order to obtain geometric representations. At this stage the set of all nitrogenous bases N is matched with Z2 × Z2: C = (0, 0), U = (0, 1), A = (1, 0), G = (1, 1). From this matching, algebraic structures are obtained, such as groups, the Klein four-group, quotient groups, etc. With the help of the GeoGebra software, the set of all codons NNN can be presented in three-dimensional space as a multicube and also represented as a graph, so that the relationships between the codons can easily be seen.
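    The matching of bases with Z2 × Z2 described above can be sketched directly. The integer coordinates for the multicube are one possible reading (treating each pair (a, b) as the integer 2a + b) and are an assumption made here for illustration, not necessarily the paper's exact construction.

```python
# Matching from the abstract: each RNA base is identified with an element
# of Z2 x Z2: C=(0,0), U=(0,1), A=(1,0), G=(1,1).
BASE_TO_Z2Z2 = {"C": (0, 0), "U": (0, 1), "A": (1, 0), "G": (1, 1)}

def base_add(x, y):
    """Componentwise addition mod 2 -- the Klein four-group operation."""
    return ((x[0] + y[0]) % 2, (x[1] + y[1]) % 2)

# Every element is its own inverse, as in the Klein four-group:
for v in BASE_TO_Z2Z2.values():
    assert base_add(v, v) == (0, 0)

def codon_to_point(codon):
    """Map a codon to a point of the 4x4x4 multicube by reading each
    base's pair (a, b) as the integer 2*a + b (an illustrative choice)."""
    return tuple(2 * a + b for (a, b) in (BASE_TO_Z2Z2[ch] for ch in codon))

print(codon_to_point("AUG"))  # (2, 1, 3)
```

Under this reading the 64 codons fill the lattice {0, 1, 2, 3}^3, the "multicube NNN" in which neighbouring points differ in a single base component.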

  11. HOW TO REPRESENT THE GENETIC CODE?

    Directory of Open Access Journals (Sweden)

    N.S. Santos-Magalhães

    2004-05-01

    Full Text Available The advent of molecular genetics comprises a true revolution of far-reaching consequences for humankind, which has evolved into a specialized branch of modern-day biochemistry. The analysis of specific genomic information is gaining wide-ranging interest because of its significance to the early diagnosis of disease and the discovery of modern drugs. In order to take advantage of a wide assortment of signal processing (SP) algorithms, the primary step of modern genomic SP involves converting symbolic DNA sequences into complex-valued signals. How should the genetic code be represented? Despite being extensively known, the DNA mapping into proteins is one of the relevant discoveries of genetics. The genetic code (GC) is revisited in this work, addressing other descriptions of it that can be worthwhile for genomic SP. Three original representations are discussed. The inner-to-outer map builds on the unbalanced role of the nucleotides of a codon. A two-dimensional Gray genetic representation is offered as a structured map that can help in interpreting DNA spectrograms or scalograms, which are among the powerful visual tools for genome analysis and which depend on the choice of the genetic mapping. Finally, the world-chart for the GC is investigated. Evoking the cyclic structure of the genetic mapping, it can be folded by joining the left-right borders and the top-bottom frontiers. As a result, the GC can be drawn on the surface of a sphere resembling a world map. Eight parallels of latitude are required (four in each hemisphere), as well as four meridians of longitude associated with four corresponding anti-meridians. The tropic circles lie at 11.25°, 33.75°, 56.25°, and 78.75° (North and South). Starting from an arbitrary Greenwich meridian, the meridians of longitude can be plotted at 22.5°, 67.5°, 112.5°, and 157.5° (East and West). Each triplet is assigned to a single point on the surface, which we named the Nirenberg-Kohamas Earth.

  12. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code.
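    The mean square measure described above averages the squared change of an amino-acid property over all single-point mutations of all codons. A minimal sketch with a toy two-amino-acid code and made-up property values (neither the real code table nor a published hydrophobicity scale):

```python
from itertools import product

BASES = "UCAG"

def mean_square(code, prop):
    """Mean squared change in an amino-acid property over all single-point
    mutations of all codons; lower values mean a more error-robust code.
    `code` maps codon -> amino acid ('*' = stop), `prop` maps amino acid
    to a numeric property value. Stop codons and mutations leading outside
    the table are skipped."""
    diffs = []
    for codon, aa in code.items():
        if aa == "*":
            continue
        for pos, base in product(range(3), BASES):
            if base == codon[pos]:
                continue
            mutant = codon[:pos] + base + codon[pos + 1:]
            maa = code.get(mutant)
            if maa is None or maa == "*":
                continue
            diffs.append((prop[aa] - prop[maa]) ** 2)
    return sum(diffs) / len(diffs)

# Toy illustration only:
toy_code = {"UUU": "F", "UUC": "F", "UUA": "L", "UUG": "L"}
toy_prop = {"F": 5.0, "L": 4.9}
print(mean_square(toy_code, toy_prop))
```

Against the full 64-codon table, candidate codes are compared by evaluating this measure for each of them; the canonical code scores unusually well relative to random permutations of the amino-acid assignments.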

  13. Flexibility of the genetic code with respect to DNA structure

    DEFF Research Database (Denmark)

    Baisnée, P. F.; Baldi, Pierre; Brunak, Søren

    2001-01-01

    Motivation: The primary function of DNA is to carry genetic information through the genetic code. DNA, however, contains a variety of other signals related, for instance, to reading frame, codon bias, pairwise codon bias, splice sites and transcription regulation, nucleosome positioning and DNA structure. Here we study the relationship between the genetic code and DNA structure and address two questions. First, to what degree do the degeneracy of the genetic code and the acceptable amino acid substitution patterns allow for the superimposition of DNA structural signals on protein coding sequences? Second, is the origin or evolution of the genetic code likely to have been constrained by DNA structure? Results: We develop an index for code flexibility with respect to DNA structure. Using five different di- or tri-nucleotide models of sequence-dependent DNA structure, we show …

  14. A multiobjective approach to the genetic code adaptability problem.

    Science.gov (United States)

    de Oliveira, Lariza Laura; de Oliveira, Paulo S L; Tinós, Renato

    2015-02-19

    The organization of the canonical code has intrigued researchers since it was first described. If we consider all codes mapping the 64 codons into 20 amino acids and one stop codon, there are more than 1.51×10^84 possible genetic codes. The main question related to the organization of the genetic code is why exactly the canonical code was selected among this huge number of possible genetic codes. Many researchers argue that the organization of the canonical code is a product of natural selection and that the code's robustness against mutations supports this hypothesis. In order to investigate the natural selection hypothesis, some researchers employ optimization algorithms to identify regions of the genetic code space where the best codes, according to a given evaluation function, can be found (the engineering approach). The optimization process uses only one objective to evaluate the codes, generally based on the robustness for an amino acid property. Only one objective is also employed in the statistical approach for the comparison of the canonical code with random codes. We propose a multiobjective approach where two or more objectives are considered simultaneously to evaluate the genetic codes. In order to test our hypothesis that the multiobjective approach is useful for the analysis of the genetic code adaptability, we implemented a multiobjective optimization algorithm where two objectives are simultaneously optimized. Using as objectives the robustness against mutation with respect to the amino acid property polar requirement (objective 1) and robustness with respect to hydropathy index or molecular volume (objective 2), we found solutions closer to the canonical genetic code in terms of robustness, when compared with the results using only one objective reported by other authors. Using more objectives, more optimal solutions are obtained and, as a consequence, more information can be used to investigate the adaptability of the genetic code.
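    In a multiobjective setting like the one proposed above, candidate codes are compared by Pareto dominance rather than a single score. A minimal sketch with hypothetical objective values, treating both objectives as robustness costs to be minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated points: the candidate codes that no other
    candidate beats on both objectives at once."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (objective1, objective2) costs for four candidate codes:
codes = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (3.0, 3.0)]
print(pareto_front(codes))  # [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
```

The multiobjective algorithm returns such a front of trade-off solutions instead of a single optimum, which is why it yields more information about where the canonical code sits in code space.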

  15. National Society of Genetic Counselors Code of Ethics.

    Science.gov (United States)

    2018-02-01

    This document is the revised Code of Ethics of the National Society of Genetic Counselors (NSGC) that was adopted in April 2017 after majority vote of the full membership of the NSGC. The explication of the revisions is published in this volume of the Journal of Genetic Counseling. This is the fourth revision to the Code of Ethics since its original adoption in 1992.

  16. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  17. The evolution of the mitochondrial genetic code in arthropods revisited.

    Science.gov (United States)

    Abascal, Federico; Posada, David; Zardoya, Rafael

    2012-04-01

    A variant of the invertebrate mitochondrial genetic code was previously identified in arthropods (Abascal et al. 2006a, PLoS Biol 4:e127) in which, instead of translating the AGG codon as serine, as in other invertebrates, some arthropods translate AGG as lysine. Here, we revisit the evolution of the genetic code in arthropods taking into account that (1) the number of arthropod mitochondrial genomes sequenced has tripled since the original findings were published; (2) the phylogeny of arthropods has been recently resolved with confidence for many groups; and (3) sophisticated probabilistic methods can be applied to analyze the evolution of the genetic code in arthropod mitochondria. According to our analyses, evolutionary shifts in the genetic code have been more common than previously inferred, with many taxonomic groups displaying two alternative codes. Ancestral character-state reconstruction using probabilistic methods confirmed that the arthropod ancestor most likely translated AGG as lysine. Point mutations at tRNA-Lys and tRNA-Ser correlated with the meaning of the AGG codon. In addition, we identified three variables (GC content, number of AGG codons, and taxonomic information) that best explain the use of each of the two alternative genetic codes.

  18. Mathematical fundamentals for the noise immunity of the genetic code.

    Science.gov (United States)

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are some of the fundamental and most useful tools in modern mathematical natural science, playing a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular basis - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows genetic information to be passed on from parents to their descendants. Hence, since the discovery of the genetic code, scientists have tried to explain the noise immunity of genetic information. In this chapter we discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error detection and error correction. We focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different numbers of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed.

  19. Unnatural reactive amino acid genetic code additions

    Energy Technology Data Exchange (ETDEWEB)

    Deiters, Alexander; Cropp, T. Ashton; Chin, Jason W.; Anderson, Christopher J.; Schultz, Peter G.

    2017-10-25

    This invention provides compositions and methods for producing translational components that expand the number of genetically encoded amino acids in eukaryotic cells. The components include orthogonal tRNAs, orthogonal aminoacyl-tRNA synthetases, orthogonal pairs of tRNAs/synthetases and unnatural amino acids. Proteins and methods of producing proteins with unnatural amino acids in eukaryotic cells are also provided.

  20. Quantum algorithms and the genetic code

    Indian Academy of Sciences (India)

    the process of replication. One generation of organisms produces the next generation, which is essentially a copy of itself. The self-similarity is maintained by the hereditary information—the genetic code—that is passed on from one generation to the next. The long chains of DNA molecules residing in the nuclei of the cells ...

  1. Systems Level Dissection of Anaerobic Methane Cycling: Quantitative Measurements of Single Cell Ecophysiology, Genetic Mechanisms, and Microbial Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Orphan, Victoria [California Inst. of Technology (CalTech), Pasadena, CA (United States); Tyson, Gene [University of Queensland, Brisbane Australia; Meile, Christof [University of Georgia, Athens, Georgia; McGlynn, Shawn [California Inst. of Technology (CalTech), Pasadena, CA (United States); Yu, Hang [California Inst. of Technology (CalTech), Pasadena, CA (United States); Chadwick, Grayson [California Inst. of Technology (CalTech), Pasadena, CA (United States); Marlow, Jeffrey [California Inst. of Technology (CalTech), Pasadena, CA (United States); Trembath-Reichert, Elizabeth [California Inst. of Technology (CalTech), Pasadena, CA (United States); Dekas, Anne [California Inst. of Technology (CalTech), Pasadena, CA (United States); Hettich, Robert [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pan, Chongle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellisman, Mark [University of California San Diego; Hatzenpichler, Roland [California Inst. of Technology (CalTech), Pasadena, CA (United States); Skennerton, Connor [California Inst. of Technology (CalTech), Pasadena, CA (United States); Scheller, Silvan [California Inst. of Technology (CalTech), Pasadena, CA (United States)

    2017-12-25

    The global biological CH4 cycle is largely controlled through coordinated and often intimate microbial interactions between archaea and bacteria, the majority of which are still unknown or have been only cursorily identified. Members of the methanotrophic archaea, aka 'ANME', are believed to play a major role in the cycling of methane in anoxic environments coupled to sulfate, nitrate, and possibly iron and manganese oxides, frequently forming diverse physical and metabolic partnerships with a range of bacteria. The thermodynamic challenges overcome by the ANME and their bacterial partners and the corresponding slow rates of growth are common characteristics in anaerobic ecosystems, and, in stark contrast to most cultured microorganisms, this type of energy- and resource-limited microbial lifestyle is likely the norm in the environment. While we have gained an in-depth systems-level understanding of fast-growing, energy-replete microorganisms, comparatively little is known about the dynamics of cell respiration, growth, protein turnover, gene expression, and energy storage in the slow-growing microbial majority. These fundamental properties, combined with the observed metabolic and symbiotic versatility of methanotrophic ANME, make these cooperative microbial systems a relevant (albeit challenging) system to study, and one for which to develop and optimize culture-independent methodologies that enable a systems-level understanding of microbial interactions and metabolic networks. We used an integrative systems biology approach to study anaerobic sediment microcosms and methane-oxidizing bioreactors and expanded our understanding of the methanotrophic ANME archaea, their interactions with physically associated bacteria, their ecophysiological characteristics, and the underlying genetic basis for cooperative microbial methane oxidation linked with different terminal electron acceptors. Our approach is inherently multi-disciplinary and multi-scaled, combining transcriptional and

  2. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data, and ultimately for deciphering genetic regulatory coding sequences, is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept for the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high-risk, high-payoff realm into the highly probable, high-payoff domain. Additionally, this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes, and use these methods to determine error control (EC) code parameters for gene regulatory sequences. (2) Develop an evolutionary computing framework for near-optimal solutions to the algebraic code reconstruction problem, to be tested on engineered and biological sequences.

  3. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    The Genetic Algorithm (GA) forms a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (AI). Some variants of GA are binary GA, real-coded GA, messy GA, micro GA, sawtooth GA, and differential evolution. This research article presents a real-coded GA for predicting enrollments at the University of Alabama, whose enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with other eminent authors' works and found satisfactory, showing that real-coded GAs are fast and accurate.
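    A real-coded GA like the one described keeps chromosomes as real-valued vectors and applies crossover and mutation directly to them, rather than to bit strings. The sketch below is a generic illustration with blend crossover and Gaussian mutation; the paper's actual operators, fitness function, and enrollment data are not reproduced, and the toy objective stands in for the forecasting error of the fuzzy model.

```python
import random

def real_coded_ga(fitness, dim, pop_size=30, gens=100, lo=-5.0, hi=5.0, seed=42):
    """Minimal real-coded GA: chromosomes are real vectors (here standing in
    for, e.g., fuzzy interval boundaries); lower fitness is better."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            w = rng.random()                       # arithmetic (blend) crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            i = rng.randrange(dim)                 # Gaussian mutation on one gene
            child[i] += rng.gauss(0.0, 0.1)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Sanity check on a toy objective (sum of squares, minimum at the origin):
best = real_coded_ga(lambda v: sum(x * x for x in v), dim=3)
print(sum(x * x for x in best))  # close to 0
```

Because blend crossover keeps children inside the convex hull of their parents, the mutation step is what preserves exploration; both operators act on real numbers with no encoding/decoding step, which is the practical appeal of real-coded GAs.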

  4. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  5. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under the framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate these usages with the code's intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes correlate closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, and that the six-fold degenerate codons and their amino acids play important balancing roles in error minimization. Therefore, the content-centric code is of great use in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  6. Origins of gene, genetic code, protein and life

    Indian Academy of Sciences (India)

    Unknown

    have concluded that newly-born genes are products of nonstop frames (NSF) ... research to determine tertiary structures of proteins such ... the present earth, is favourable for new genes to arise, if ..... NGG) in the universal genetic code table, cannot satisfy ..... which has been proposed to explain the development of life on.

  7. The Search for Symmetries in the Genetic Code:

    Science.gov (United States)

    Antoneli, Fernando; Forger, Michael; Hornos, José Eduardo M.

    We give a full classification of the possible schemes for obtaining the distribution of multiplets observed in the standard genetic code by symmetry breaking in the context of finite groups, based on an extended notion of partial symmetry breaking that incorporates the intuitive idea of "freezing" first proposed by Francis Crick, which is given a precise mathematical meaning.

  8. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
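    The leading eigenpair of a quasispecies matrix, which CMCpy provides solvers for, can be sketched with plain power iteration. The two-genotype mutation matrix and fitness values below are illustrative, not taken from the package:

```python
def leading_eigenpair(W, iters=1000, tol=1e-12):
    """Power iteration for the dominant eigenpair of a nonnegative matrix W."""
    n = len(W)
    v = [1.0 / n] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
        new_lam = sum(w)                  # L1 normalisation (entries stay >= 0)
        v = [x / new_lam for x in w]
        if abs(new_lam - lam) < tol:
            break
        lam = new_lam
    return new_lam, v

# Two-genotype quasispecies: W = Q * diag(f), with mutation rate mu and
# illustrative fitnesses f.
mu, f = 0.05, (2.0, 1.0)
Q = [[1 - mu, mu], [mu, 1 - mu]]
W = [[Q[i][j] * f[j] for j in range(2)] for i in range(2)]
growth_rate, equilibrium = leading_eigenpair(W)
# growth_rate is the mean population fitness at mutation-selection balance;
# equilibrium is the stationary genotype distribution (sums to 1).
```

    CMCpy's perturbation-theory and homotopy methods address the harder cases this simple iteration does not, such as non-unique maximally fit genotypes.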

  9. Programming peptidomimetic syntheses by translating genetic codes designed de novo.

    Science.gov (United States)

    Forster, Anthony C; Tan, Zhongping; Nalam, Madhavi N L; Lin, Hening; Qu, Hui; Cornish, Virginia W; Blacklow, Stephen C

    2003-05-27

    Although the universal genetic code exhibits only minor variations in nature, Francis Crick proposed in 1955 that "the adaptor hypothesis allows one to construct, in theory, codes of bewildering variety." The existing code has been expanded to enable incorporation of a variety of unnatural amino acids at one or two nonadjacent sites within a protein by using nonsense or frameshift suppressor aminoacyl-tRNAs (aa-tRNAs) as adaptors. However, the suppressor strategy is inherently limited by compatibility with only a small subset of codons, by the ways such codons can be combined, and by variation in the efficiency of incorporation. Here, by preventing competing reactions with aa-tRNA synthetases, aa-tRNAs, and release factors during translation and by using nonsuppressor aa-tRNA substrates, we realize a potentially generalizable approach for template-encoded polymer synthesis that unmasks the substantially broader versatility of the core translation apparatus as a catalyst. We show that several adjacent, arbitrarily chosen sense codons can be completely reassigned to various unnatural amino acids according to de novo genetic codes by translating mRNAs into specific peptide analog polymers (peptidomimetics). Unnatural aa-tRNA substrates do not uniformly function as well as natural substrates, revealing important recognition elements for the translation apparatus. Genetic programming of peptidomimetic synthesis should facilitate mechanistic studies of translation and may ultimately enable the directed evolution of small molecules with desirable catalytic or pharmacological properties.

  10. On coding genotypes for genetic markers with multiple alleles in genetic association study of quantitative traits

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2011-09-01

    Full Text Available Abstract Background In genetic association studies of quantitative traits using F∞ models, how to code the marker genotypes and interpret the model parameters appropriately is important for constructing hypothesis tests and making statistical inferences. Currently, the coding of marker genotypes in building F∞ models has mainly focused on the biallelic case. A thorough treatment of the coding of marker genotypes and the interpretation of model parameters for F∞ models is needed, especially for genetic markers with multiple alleles. Results In this study, we formulate F∞ genetic models under various regression model frameworks and introduce three genotype coding schemes for genetic markers with multiple alleles. Starting from an allele-based modeling strategy, we first describe a regression framework to model the expected genotypic values at given markers. Then, as an extension from the biallelic case, we introduce three coding schemes for constructing fully parameterized one-locus F∞ models and discuss the relationships between the model parameters and the expected genotypic values. Next, under a simplified modeling framework for the expected genotypic values, we consider several reduced one-locus F∞ models derived from the three coding schemes, focusing on the estimability and interpretation of their model parameters. Finally, we explore some extensions of the one-locus F∞ models to two loci. Several fully parameterized as well as reduced two-locus F∞ models are addressed. Conclusions The genotype coding schemes provide different ways to construct F∞ models for association testing of multi-allele genetic markers with quantitative traits. Which coding scheme should be applied depends on how conveniently it can provide statistical inferences on the parameters of research interest. Based on these F∞ models, standard regression model fitting tools can be used to estimate and test for various genetic effects through statistical contrasts with the
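    One common way to code a multi-allelic genotype for regression, allele counting with a reference allele dropped, can be sketched as follows. This is an illustrative choice and not necessarily one of the paper's three schemes:

```python
from itertools import combinations_with_replacement

def allele_count_coding(genotype, alleles):
    """Code a genotype at a multi-allelic marker as allele counts, dropping
    the last allele as the reference so the design matrix stays full rank."""
    counts = {a: 0 for a in alleles}
    for allele in genotype:
        counts[allele] += 1
    return [counts[a] for a in alleles[:-1]]

alleles = ["A1", "A2", "A3"]
genotypes = list(combinations_with_replacement(alleles, 2))   # 6 genotypes
design = {g: allele_count_coding(g, alleles) for g in genotypes}
# ('A1', 'A1') -> [2, 0]; ('A1', 'A2') -> [1, 1]; ('A3', 'A3') -> [0, 0]
```

    Feeding such coded columns into an ordinary least-squares fit yields additive-effect estimates relative to the reference allele, and alternative codings change only the interpretation of the coefficients, not the fitted genotypic values.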

  11. The genetic code as a periodic table: algebraic aspects.

    Science.gov (United States)

    Bashford, J D; Jarvis, P D

    2000-01-01

    The systematics of indices of physico-chemical properties of codons and amino acids across the genetic code are examined. Using a simple numerical labelling scheme for nucleic acid bases, A=(-1,0), C=(0,-1), G=(0,1), U=(1,0), data can be fitted as low order polynomials of the six coordinates in the 64-dimensional codon weight space. The work confirms and extends the recent studies by Siemion et al. (1995. BioSystems 36, 231-238) of the conformational parameters. Fundamental patterns in the data such as codon periodicities, and related harmonics and reflection symmetries, are here associated with the structure of the set of basis monomials chosen for fitting. Results are plotted using the Siemion one-step mutation ring scheme, and variants thereof. The connections between the present work, and recent studies of the genetic code structure using dynamical symmetry algebras, are pointed out.
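    The labelling scheme described here maps each codon to six coordinates. A direct sketch:

```python
# Numerical labels for the bases, as given in the abstract.
BASE_COORDS = {"A": (-1, 0), "C": (0, -1), "G": (0, 1), "U": (1, 0)}

def codon_coordinates(codon):
    """Map a codon to its six coordinates in the 64-codon weight space."""
    coords = []
    for base in codon.upper():
        coords.extend(BASE_COORDS[base])
    return tuple(coords)

codon_coordinates("AUG")   # -> (-1, 0, 1, 0, 0, 1)
```

    The mapping is injective over the 64 codons, so any codon-indexed property can be fitted as a polynomial in these six coordinates, which is the setting of the paper's analysis.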

  12. The "Wow! signal" of the terrestrial genetic code

    Science.gov (United States)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope of SETI, and one of the suggested alternatives to radio is biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, the genetic code is stronger in noise immunity. The code is a flexible mapping between codons and amino acids, and this flexibility allows the code to be modified artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. It therefore represents an exceptionally reliable storage medium for an intelligent signature, provided that the signature conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such a scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10^-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of

  13. A symbiotic liaison between the genetic and epigenetic code

    Directory of Open Access Journals (Sweden)

    Holger Heyn

    2014-05-01

    Full Text Available With rapid advances in sequencing technologies, we are undergoing a paradigm shift from hypothesis- to data-driven research. Genome-wide profiling efforts have given informative insights into biological processes; however, considering the wealth of variation, the major challenge remains their meaningful interpretation. Sequence variation in non-coding contexts in particular is often challenging to interpret. Here, data integration approaches for the identification of functional genetic variability represent a likely solution. For example, functional linkage analyses integrating genotype and expression data have determined regulatory quantitative trait loci (QTLs) and proposed causal relationships. In addition to gene expression, epigenetic regulation, and specifically DNA methylation, has been established as a highly valuable surrogate mark for the functional variance of the genetic code. Epigenetic modification has served as a powerful mediator trait to elucidate mechanisms forming phenotypes in health and disease. In particular, integrative studies of genetic and DNA methylation data have guided interpretation strategies for risk genotypes and also proved their value for physiological traits, such as natural human variation and aging. This Perspective seeks to illustrate the power of data integration in the genomic era, exemplified by DNA methylation quantitative trait loci (meQTLs). However, the model is further extendable to virtually all traceable molecular traits.

  14. On Francis Crick, the genetic code, and a clever kid.

    Science.gov (United States)

    Goldstein, Bob

    2018-04-02

    A few years ago, Francis Crick's son told me a story that I can't get out of my mind. I had contacted Michael Crick by email while digging through the background of the researchers who had cracked the genetic code in the 1960s. Francis had died in 2004, and I was contacting some of the people who knew him when he was struggling to decipher the code. Francis didn't appear to struggle often - he is known mostly for his successes - and, as it turns out, this one well-known struggle may have had a clue sitting just barely out of sight. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Quantum control using genetic algorithms in quantum communication: superdense coding

    International Nuclear Information System (INIS)

    Domínguez-Serna, Francisco; Rojas, Fernando

    2015-01-01

    We present a physical example model of how quantum control with genetic algorithms is applied to implement the quantum superdense coding protocol. We studied a model consisting of two quantum dots containing an electron with spin, including spin-orbit interaction. The electron hybridizes with the sites, acquiring two degrees of freedom: spin and charge. The system has tunneling and site energies as time-dependent control parameters that are optimized by means of genetic algorithms to prepare a hybrid Bell-like state used as a transmission channel. This state is transformed to obtain any of the four Bell basis states, as required by the superdense protocol to transmit two bits of classical information. The control protocol is equivalent to implementing one of the quantum gates in the charge subsystem. Fidelities larger than 99.5% are achieved for both the hybrid entangled-state preparation and the superdense operations. (paper)
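    The superdense coding step itself, encoding two classical bits by applying one of four Pauli operations to Alice's half of a shared Bell state, can be sketched with ideal qubits (no quantum-dot physics or spin-orbit terms):

```python
import math

# Single-qubit Pauli matrices; in superdense coding Alice encodes two
# classical bits by applying one of I, X, Z, XZ to her half of a Bell pair.
I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply_to_first_qubit(U, state):
    """Apply the 2x2 operator U to the first qubit of a 2-qubit state vector."""
    out = [0.0] * 4
    for a in range(2):
        for b in range(2):
            for ap in range(2):
                out[2 * a + b] += U[a][ap] * state[2 * ap + b]
    return out

def inner(u, v):
    return sum(x * y for x, y in zip(u, v))

bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]   # (|00> + |11>)/sqrt(2)
encodings = {"00": I, "01": X, "10": Z, "11": matmul(X, Z)}
states = {bits: apply_to_first_qubit(U, bell) for bits, U in encodings.items()}
# The four encoded states form an orthogonal (Bell) basis, so a Bell
# measurement on both qubits lets Bob recover the two classical bits.
```

    In the paper's setting, the GA-optimised pulse sequences play the role of these ideal Pauli gates on the charge subsystem.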

  16. Systems level analysis of systemic sclerosis shows a network of immune and profibrotic pathways connected with genetic polymorphisms.

    Directory of Open Access Journals (Sweden)

    J Matthew Mahoney

    2015-01-01

    Full Text Available Systemic sclerosis (SSc is a rare systemic autoimmune disease characterized by skin and organ fibrosis. The pathogenesis of SSc and its progression are poorly understood. The SSc intrinsic gene expression subsets (inflammatory, fibroproliferative, normal-like, and limited are observed in multiple clinical cohorts of patients with SSc. Analysis of longitudinal skin biopsies suggests that a patient's subset assignment is stable over 6-12 months. Genetically, SSc is multi-factorial with many genetic risk loci for SSc generally and for specific clinical manifestations. Here we identify the genes consistently associated with the intrinsic subsets across three independent cohorts, show the relationship between these genes using a gene-gene interaction network, and place the genetic risk loci in the context of the intrinsic subsets. To identify gene expression modules common to three independent datasets from three different clinical centers, we developed a consensus clustering procedure based on mutual information of partitions, an information theory concept, and performed a meta-analysis of these genome-wide gene expression datasets. We created a gene-gene interaction network of the conserved molecular features across the intrinsic subsets and analyzed their connections with SSc-associated genetic polymorphisms. The network is composed of distinct, but interconnected, components related to interferon activation, M2 macrophages, adaptive immunity, extracellular matrix remodeling, and cell proliferation. The network shows extensive connections between the inflammatory- and fibroproliferative-specific genes. The network also shows connections between these subset-specific genes and 30 SSc-associated polymorphic genes including STAT4, BLK, IRF7, NOTCH4, PLAUR, CSK, IRAK1, and several human leukocyte antigen (HLA genes. Our analyses suggest that the gene expression changes underlying the SSc subsets may be long-lived, but mechanistically interconnected

  17. Amino acid fermentation at the origin of the genetic code

    Directory of Open Access Journals (Sweden)

    de Vladar Harold P

    2012-02-01

    Full Text Available Abstract There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments.

  18. Amino acid fermentation at the origin of the genetic code.

    Science.gov (United States)

    de Vladar, Harold P

    2012-02-10

    There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments.

  19. Amino acid fermentation at the origin of the genetic code

    Science.gov (United States)

    2012-01-01

    There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments.
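    The randomisation test described in this record can be sketched as a permutation test. The codons, amino acid labels, and per-amino-acid "ATP yield" scores below are illustrative stand-ins rather than the paper's values, and the GC-weighted statistic is a toy choice picked so that the identity assignment is optimal:

```python
import random

def gc(codon):
    """GC fraction of a codon."""
    return sum(base in "GC" for base in codon) / 3

def randomised_code_test(codons, amino_acids, statistic, observed_map,
                         n_perm=2000, seed=1):
    """Permutation test: shuffle which amino acid each codon is assigned to
    and count how often a random code matches or beats the observed code."""
    rng = random.Random(seed)
    observed = statistic(observed_map)
    hits = 0
    for _ in range(n_perm):
        shuffled = list(amino_acids)
        rng.shuffle(shuffled)
        if statistic(dict(zip(codons, shuffled))) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)    # permutation p-value

# Illustrative per-amino-acid "ATP yield" scores and codons (toy values).
yield_of = {"aa1": 4, "aa2": 3, "aa3": 2, "aa4": 1}
codons = ["GGG", "GCA", "CAA", "AUA"]             # descending GC content
canonical = dict(zip(codons, yield_of))           # high yield on GC-rich codons
stat = lambda code: sum(gc(c) * yield_of[aa] for c, aa in code.items())
obs, p = randomised_code_test(codons, list(yield_of), stat, canonical)
# A small p indicates the observed assignment scores unusually well
# compared with random codon-to-amino-acid assignments.
```

    In the paper's actual test, the statistic would instead be the ATP yield implied by the Stickland pairing of complementary anticodons.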

  20. Decoding the non-coding genome: elucidating genetic risk outside the coding genome.

    Science.gov (United States)

    Barr, C L; Misener, V L

    2016-01-01

    Current evidence emerging from genome-wide association studies indicates that the genetic underpinnings of complex traits are likely attributable to genetic variation that changes gene expression, rather than (or in combination with) variation that changes protein-coding sequences. This is particularly compelling with respect to psychiatric disorders, as genetic changes in regulatory regions may result in differential transcriptional responses to developmental cues and environmental/psychosocial stressors. Until recently, however, the link between transcriptional regulation and psychiatric genetic risk has been understudied. Multiple obstacles have contributed to the paucity of research in this area, including challenges in identifying the positions of remote (distal from the promoter) regulatory elements (e.g. enhancers) and their target genes and the underrepresentation of neural cell types and brain tissues in epigenome projects - the availability of high-quality brain tissues for epigenetic and transcriptome profiling, particularly for the adolescent and developing brain, has been limited. Further challenges have arisen in the prediction and testing of the functional impact of DNA variation with respect to multiple aspects of transcriptional control, including regulatory-element interaction (e.g. between enhancers and promoters), transcription factor binding and DNA methylation. Further, the brain has uncommon DNA-methylation marks with unique genomic distributions not found in other tissues - current evidence suggests the involvement of non-CG methylation and 5-hydroxymethylation in neurodevelopmental processes but much remains unknown. We review here knowledge gaps as well as both technological and resource obstacles that will need to be overcome in order to elucidate the involvement of brain-relevant gene-regulatory variants in genetic risk for psychiatric disorders. © 2015 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  1. Mapping the Plasticity of the E. coli Genetic Code with Orthogonal Pair Directed Sense Codon Reassignment.

    Science.gov (United States)

    Schmitt, Margaret A; Biddle, Wil; Fisk, John Domenic

    2018-04-18

    The relative quantitative importance of the factors that determine the fidelity of translation is largely unknown, which makes predicting the extent to which the degeneracy of the genetic code can be broken challenging. Our strategy of using orthogonal tRNA/aminoacyl tRNA synthetase pairs to precisely direct the incorporation of a single amino acid in response to individual sense and nonsense codons provides a suite of related data with which to examine the plasticity of the code. Each directed sense codon reassignment measurement is an in vivo competition experiment between the introduced orthogonal translation machinery and the natural machinery in E. coli. This report discusses 20 new, related genetic codes, in which a targeted E. coli wobble codon is reassigned to tyrosine utilizing the orthogonal tyrosine tRNA/aminoacyl tRNA synthetase pair from Methanocaldococcus jannaschii. One at a time, reassignment of each targeted sense codon to tyrosine is quantified in cells by measuring the fluorescence of GFP variants in which the essential tyrosine residue is encoded by a non-tyrosine codon. Significantly, every wobble codon analyzed may be partially reassigned with efficiencies ranging from 0.8% to 41%. The accumulation of the suite of data enables a qualitative dissection of the relative importance of the factors affecting the fidelity of translation. While some correlation was observed between sense codon reassignment and either competing endogenous tRNA abundance or changes in aminoacylation efficiency of the altered orthogonal system, no single factor appears to predominately drive translational fidelity. Evaluation of relative cellular fitness in each of the 20 quantitatively-characterized proteome-wide tyrosine substitution systems suggests that at a systems level, E. coli is robust to missense mutations.
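    The reported reassignment efficiencies are fluorescence ratios; a minimal sketch with hypothetical numbers (arbitrary units, simple background correction assumed, not the authors' analysis pipeline):

```python
def reassignment_efficiency(variant_fluor, wildtype_fluor, background=0.0):
    """Fractional sense-codon reassignment, estimated as background-corrected
    GFP fluorescence of the variant relative to a wild-type control."""
    return (variant_fluor - background) / (wildtype_fluor - background)

# Hypothetical readings: a variant at 4100 a.u. against a 10000 a.u.
# wild-type control with 100 a.u. background gives ~40% reassignment,
# near the top of the 0.8%-41% range reported in the abstract.
eff = reassignment_efficiency(4100, 10000, background=100)
```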

  2. Arbitrariness is not enough: towards a functional approach to the genetic code.

    Science.gov (United States)

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code, and consequently it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than the chains of nucleic bases. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we will show that the current model of the genetic code is not the only one possible, and we will propose a more appropriate model from a semiotic point of view.

  3. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments.

    Science.gov (United States)

    Santos, José; Monteagudo, Angel

    2011-02-21

    As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization, or level of adaptation, of the canonical genetic code was measured taking into account the harmful consequences of point mutations leading to the replacement of one amino acid by another. There are two basic approaches to measuring the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Here we used a genetic algorithm to search for better-adapted hypothetical codes and as a method to gauge the difficulty of finding such alternative codes, allowing us to clearly situate the canonical code in the fitness landscape. This novel use of evolutionary computing provides a new perspective in the open debate between the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment, and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum under both models, when the more realistic model of codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Simulated evolution clearly reveals that the canonical genetic code is far from optimal. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account in the two models, as indicated by the fact that the best possible codes show the patterns of the

  4. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments

    Directory of Open Access Journals (Sweden)

    Monteagudo Ángel

    2011-02-01

    Full Text Available Abstract Background As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization, or level of adaptation, of the canonical genetic code was measured taking into account the harmful consequences of point mutations leading to the replacement of one amino acid by another. There are two basic approaches to measuring the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Results Here we used a genetic algorithm to search for better-adapted hypothetical codes and as a method to gauge the difficulty of finding such alternative codes, allowing us to clearly situate the canonical code in the fitness landscape. This novel use of evolutionary computing provides a new perspective in the open debate between the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment, and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum under both models, when the more realistic model of codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Conclusions Simulated evolution clearly reveals that the canonical genetic code is far from optimal. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account in the two models, as indicated by the
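    The fitness used in such studies, the average harm of point mutations, can be sketched on a toy 16-codon doublet code. The amino acid property values and the block structure below are illustrative assumptions, not the paper's model:

```python
import itertools
import random

def mutation_cost(code, prop):
    """Mean squared change in an amino acid property over all single-nucleotide
    substitutions between coding codons: the average harm of a point mutation."""
    costs = []
    for codon, aa in code.items():
        for pos in range(len(codon)):
            for base in "ACGU":
                if base == codon[pos]:
                    continue
                neighbour = codon[:pos] + base + codon[pos + 1:]
                if neighbour in code:
                    costs.append((prop[aa] - prop[code[neighbour]]) ** 2)
    return sum(costs) / len(costs)

# Toy doublet code: 16 codons, 4 amino acids, first base decides the block;
# the property values are illustrative, not real physicochemical data.
prop = {"aa1": 1.0, "aa2": 2.0, "aa3": 3.0, "aa4": 4.0}
codons = ["".join(p) for p in itertools.product("ACGU", repeat=2)]
block_of = dict(zip("ACGU", prop))          # first base -> amino acid name
toy_code = {c: block_of[c[0]] for c in codons}

canonical_cost = mutation_cost(toy_code, prop)
rng = random.Random(0)
rand_costs = []
for _ in range(200):                        # statistical-approach baseline
    labels = [toy_code[c] for c in codons]
    rng.shuffle(labels)
    rand_costs.append(mutation_cost(dict(zip(codons, labels)), prop))
# The block-structured toy code is more mutation-robust than typical
# random assignments of the same amino acids to the same codons.
```

    The paper's genetic algorithm searches over such assignments for codes with lower cost than the canonical one, under the engineering rather than the statistical comparison.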

  5. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results extend previous work on the Stirling numbers of the second kind and triplet codes to the cases of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and of hypothetical doublet codes. Extending previous results, we find that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it may also have been for a primeval doublet code. We also study the integer partitions associated with patterns of synonymous codons and show, for the canonical code, that the standard deviation within an integer partition is among the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, while keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
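As a rough illustration of the combinatorics in this record, the sketch below computes Stirling numbers of the second kind, S(n, k), which count partitions of an n-set into k non-empty blocks, and finds the most probable block count for a uniformly random partition of 60 sense codons (the four-stop-codon case mentioned above). The uniform-partition model is an assumption here; the paper's exact probabilistic setup may differ:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling number of the second kind: number of partitions of an
    n-element set into exactly k non-empty blocks."""
    if n == k:
        return 1          # includes S(0, 0) = 1
    if k == 0 or k > n:
        return 0
    # standard recurrence: element n either forms its own block
    # or joins one of the k blocks of a partition of n-1 elements
    return stirling2(n - 1, k - 1) + k * stirling2(n - 1, k)

n = 60  # sense codons when four codons are stops (mitochondrial-like code)
bell = sum(stirling2(n, k) for k in range(n + 1))  # Bell number B(60)
mode = max(range(1, n + 1), key=lambda k: stirling2(n, k))
print(f"most probable block count for a random partition of {n} codons: {mode}")
```

Under the uniform model, the probability of observing k blocks is S(n, k) / B(n), so the mode of S(n, k) over k is the most probable number of synonymous-codon blocks.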

  6. Genetic and systems level analysis of Drosophila sticky/citron kinase and dFmr1 mutants reveals common regulation of genetic networks

    Directory of Open Access Journals (Sweden)

    Zarnescu Daniela C

    2008-11-01

    Full Text Available Abstract Background In Drosophila, the genes sticky and dFmr1 have both been shown to regulate cytoskeletal dynamics and chromatin structure. These genes also genetically interact with Argonaute family microRNA regulators. Furthermore, in mammalian systems, both genes have been implicated in neuronal development. Given these genetic and functional similarities, we tested Drosophila sticky and dFmr1 for a genetic interaction and measured whole genome expression in both mutants to assess similarities in gene regulation. Results We found that sticky mutations can dominantly suppress a dFmr1 gain-of-function phenotype in the developing eye, while phenotypes produced by RNAi knock-down of sticky were enhanced by dFmr1 RNAi and a dFmr1 loss-of-function mutation. We also identified a large number of transcripts that were misexpressed in both mutants suggesting that sticky and dFmr1 gene products similarly regulate gene expression. By integrating gene expression data with a protein-protein interaction network, we found that mutations in sticky and dFmr1 resulted in misexpression of common gene networks, and consequently predicted additional specific phenotypes previously not known to be associated with either gene. Further phenotypic analyses validated these predictions. Conclusion These findings establish a functional link between two previously unrelated genes. Microarray analysis indicates that sticky and dFmr1 are both required for regulation of many developmental genes in a variety of cell types. The diversity of transcripts regulated by these two genes suggests a clear cause of the pleiotropy that sticky and dFmr1 mutants display and provides many novel, testable hypotheses about the functions of these genes. As both of these genes are implicated in the development and function of the mammalian brain, these results have relevance to human health as well as to understanding more general biological processes.

  7. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

    Full Text Available BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (Ser-tRNA(CAG)). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.

  8. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
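The optimality criterion in this record, algebraic connectivity, is the second-smallest eigenvalue of the graph Laplacian L = D - A. A minimal sketch (assuming NumPy is available; the toy path graph stands in for the paper's phenotypic graphs, whose exact construction is not given here):

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A,
    for an undirected graph given by its adjacency matrix."""
    adj = np.asarray(adj, dtype=float)
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))[1]

# Toy example: the path graph on 3 vertices (Laplacian eigenvalues 0, 1, 3).
path3 = [[0, 1, 0],
         [1, 0, 1],
         [0, 1, 0]]
print(algebraic_connectivity(path3))  # 1.0
```

A disconnected graph has algebraic connectivity 0, and lower values indicate a more loosely connected graph, which is the sense in which the record says an error-correcting optimal code minimizes it.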

  9. The "periodic table" of the genetic code: A new way to look at the code and the decoding process.

    Science.gov (United States)

    Komar, Anton A

    2016-01-01

    Henri Grosjean and Eric Westhof recently presented an information-rich, alternative view of the genetic code, which takes into account current knowledge of the decoding process, including the complex nature of interactions between mRNA, tRNA and rRNA that take place during protein synthesis on the ribosome, and it also better reflects the evolution of the code. The new asymmetrical circular genetic code has a number of advantages over the traditional codon table and the previous circular diagrams (with a symmetrical/clockwise arrangement of the U, C, A, G bases). Most importantly, all sequence co-variances can be visualized and explained based on the internal logic of the thermodynamics of codon-anticodon interactions.

  10. The coevolution of genes and genetic codes: Crick's frozen accident revisited.

    Science.gov (United States)

    Sella, Guy; Ardell, David H

    2006-09-01

    The standard genetic code is the nearly universal system for the translation of genes into proteins. The code exhibits two salient structural characteristics: it possesses a distinct organization that makes it extremely robust to errors in replication and translation, and it is highly redundant. The origin of these properties has intrigued researchers since the code was first discovered. One suggestion, which is the subject of this review, is that the code's organization is the outcome of the coevolution of genes and genetic codes. In 1968, Francis Crick explored the possible implications of coevolution at different stages of code evolution. Although he argued that coevolution was likely to have influenced the evolution of the code, he concluded that it falls short of explaining the organization of the code we see today. The recent application of mathematical modeling to study the effects of errors on the course of coevolution suggests a different conclusion. It shows that coevolution readily generates genetic codes that are highly redundant and similar in their error-correcting organization to the standard code. We review this recent work and suggest that further affirmation of the role of coevolution can be attained by investigating the extent to which the outcome of coevolution is robust to other influences that were present during the evolution of the code.

  11. Codon size reduction as the origin of the triplet genetic code.

    Directory of Open Access Journals (Sweden)

    Pavel V Baranov

    Full Text Available The genetic code appears to be optimized in its robustness to missense errors and frameshift errors. In addition, the genetic code is near-optimal in terms of its ability to carry information in addition to the sequences of encoded proteins. As evolution has no foresight, optimality of the modern genetic code suggests that it evolved from less optimal code variants. The length of codons in the genetic code is also optimal, as three is the minimal nucleotide combination that can encode the twenty standard amino acids. The apparent impossibility of transitions between codon sizes in a discontinuous manner during evolution has resulted in an unbending view that the genetic code was always triplet. Yet, recent experimental evidence on quadruplet decoding, as well as the discovery of organisms with ambiguous and dual decoding, suggest that the possibility of the evolution of triplet decoding from living systems with non-triplet decoding merits reconsideration and further exploration. To explore this possibility we designed a mathematical model of the evolution of primitive digital coding systems which can decode nucleotide sequences into protein sequences. These coding systems can evolve their nucleotide sequences via genetic events of Darwinian evolution, such as point-mutations. The replication rates of such coding systems depend on the accuracy of the generated protein sequences. Computer simulations based on our model show that decoding systems with codons of length greater than three spontaneously evolve into predominantly triplet decoding systems. Our findings suggest a plausible scenario for the evolution of the triplet genetic code in a continuous manner. This scenario suggests an explanation of how protein synthesis could be accomplished by means of long RNA-RNA interactions prior to the emergence of the complex decoding machinery, such as the ribosome, that is required for stabilization and discrimination of otherwise weak triplet codon

  12. Critical roles for a genetic code alteration in the evolution of the genus Candida.

    Science.gov (United States)

    Silva, Raquel M; Paredes, João A; Moura, Gabriela R; Manadas, Bruno; Lima-Costa, Tatiana; Rocha, Rita; Miranda, Isabel; Gomes, Ana C; Koerkamp, Marian J G; Perrot, Michel; Holstege, Frank C P; Boucherie, Hélian; Santos, Manuel A S

    2007-10-31

    During the last 30 years, several alterations to the standard genetic code have been discovered in various bacterial and eukaryotic species. Sense and nonsense codons have been reassigned or reprogrammed to expand the genetic code to selenocysteine and pyrrolysine. These discoveries highlight unexpected flexibility in the genetic code, but do not elucidate how the organisms survived the proteome chaos generated by codon identity redefinition. In order to shed new light on this question, we have reconstructed a Candida genetic code alteration in Saccharomyces cerevisiae and used a combination of DNA microarrays, proteomics and genetics approaches to evaluate its impact on gene expression, adaptation and sexual reproduction. This genetic manipulation blocked mating, locked yeast in a diploid state, remodelled gene expression and created stress cross-protection that generated adaptive advantages under environmentally challenging conditions. This study highlights unanticipated roles for codon identity redefinition during the evolution of the genus Candida, and strongly suggests that genetic code alterations create genetic barriers that speed up speciation.

  13. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    Science.gov (United States)

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

    Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code, which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA code type I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or via the latter (type II), respectively. Biologically, the Extended RNA code type I consists of all codons of the type RNY plus codons obtained by considering the RNA code but in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II comprises all codons of the type RNY plus codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR type) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integer numbers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or the stop signals is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of the existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.
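The codon types named in this record are easy to enumerate explicitly. A small sketch (DNA alphabet; the frame-shift reading of types NYR and YRN follows the abstract's description of the Extended RNA code type I):

```python
from itertools import product

PUR, PYR, ANY = "AG", "CT", "ACGT"  # purines, pyrimidines, any base

# RNY: purine - any base - pyrimidine (the primeval RNA code)
RNY = {r + n + y for r, n, y in product(PUR, ANY, PYR)}
# Reading an RNY message in the shifted frames yields the NYR and YRN types
NYR = {n + y + r for n, y, r in product(ANY, PYR, PUR)}
YRN = {y + r + n for y, r, n in product(PYR, PUR, ANY)}

print(len(RNY), len(NYR), len(YRN))  # 16 16 16
# The three types are pairwise disjoint, so Extended RNA code type I
# reaches 48 of the 64 codons.
print(len(RNY | NYR | YRN))  # 48
```

Disjointness follows from the base classes at fixed positions: RNY and NYR disagree on position 3 (pyrimidine vs. purine), RNY and YRN on position 1, and NYR and YRN on position 2.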

  14. How American Nurses Association Code of Ethics informs genetic/genomic nursing.

    Science.gov (United States)

    Tluczek, Audrey; Twal, Marie E; Beamer, Laura Curr; Burton, Candace W; Darmofal, Leslie; Kracun, Mary; Zanni, Karen L; Turner, Martha

    2018-01-01

    Members of the Ethics and Public Policy Committee of the International Society of Nurses in Genetics prepared this article to assist nurses in interpreting the American Nurses Association (2015) Code of Ethics for Nurses with Interpretive Statements (Code) within the context of genetics/genomics. The Code explicates the nursing profession's norms and responsibilities in managing ethical issues. The nearly ubiquitous application of genetic/genomic technologies in healthcare poses unique ethical challenges for nursing. Therefore, authors conducted literature searches that drew from various professional resources to elucidate implications of the code in genetic/genomic nursing practice, education, research, and public policy. We contend that the revised Code coupled with the application of genomic technologies to healthcare creates moral obligations for nurses to continually refresh their knowledge and capacities to translate genetic/genomic research into evidence-based practice, assure the ethical conduct of scientific inquiry, and continually develop or revise national/international guidelines that protect the rights of individuals and populations within the context of genetics/genomics. Thus, nurses have an ethical responsibility to remain knowledgeable about advances in genetics/genomics and incorporate emergent evidence into their work.

  15. Junk DNA and the long non-coding RNA twist in cancer genetics

    NARCIS (Netherlands)

    H. Ling (Hui); K. Vincent; M. Pichler; R. Fodde (Riccardo); I. Berindan-Neagoe (Ioana); F.J. Slack (Frank); G.A. Calin (George)

    2015-01-01

    The central dogma of molecular biology states that the flow of genetic information moves from DNA to RNA to protein. However, in the last decade this dogma has been challenged by new findings on non-coding RNAs (ncRNAs) such as microRNAs (miRNAs). More recently, long non-coding RNAs

  16. National Society of Genetic Counselors Code of Ethics: Explication of 2017 Revisions.

    Science.gov (United States)

    Senter, Leigha; Bennett, Robin L; Madeo, Anne C; Noblin, Sarah; Ormond, Kelly E; Schneider, Kami Wolfe; Swan, Kelli; Virani, Alice

    2018-02-01

    The Code of Ethics (COE) of the National Society of Genetic Counselors (NSGC) was adopted in 1992 and was later revised and adopted in 2006. In 2016, the NSGC Code of Ethics Review Task Force (COERTF) was convened to review the COE. The COERTF reviewed ethical codes written by other professional organizations and suggested changes that would better reflect the current and evolving nature of the genetic counseling profession. The COERTF received input from the society's legal counsel, Board of Directors, and members-at-large. A revised COE was proposed to the membership and approved and adopted in April 2017. The revisions and rationale for each are presented.

  17. Efficient Dual Domain Decoding of Linear Block Codes Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed Azouaoui

    2012-01-01

    Full Text Available A computationally efficient algorithm for decoding block codes is developed using a genetic algorithm (GA). The proposed algorithm uses the dual code, in contrast to the existing genetic decoders in the literature, which use the code itself; this new approach reduces the complexity of decoding high-rate codes. We simulated our algorithm over various transmission channels. The performance of this algorithm is investigated and compared with competing decoding algorithms, including those of Maini and Shakeel. The results show that the proposed algorithm yields large gains over the Chase-2 decoding algorithm and reaches the performance of OSD-3 for some quadratic residue (QR) codes. Further, we define a new crossover operator that exploits domain-specific information and compare it with uniform and two-point crossover. The complexity of this algorithm is also discussed and compared to other algorithms.

  18. Open Genetic Code: on open source in the life sciences

    OpenAIRE

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first ...

  19. Symmetries in Genetic Systems and the Concept of Geno-Logical Coding

    Directory of Open Access Journals (Sweden)

    Sergey V. Petoukhov

    2016-12-01

    Full Text Available The genetic code of amino acid sequences in proteins does not allow understanding and modeling of inherited processes such as inborn coordinated motions of living bodies, innate principles of sensory information processing, quasi-holographic properties, etc. To be able to model these phenomena, the concept of geno-logical coding, which is connected with logical functions and Boolean algebra, is put forward. The article describes basic pieces of evidence in favor of the existence of the geno-logical code, which exists in parallel with the known genetic code of amino acid sequences but which serves to transfer inherited processes along chains of generations. These pieces of evidence have been received due to the analysis of symmetries in structures of molecular-genetic systems. The analysis has revealed a close connection of the genetic system with dyadic groups of binary numbers and with other mathematical objects related to dyadic groups: Walsh functions (which are algebraic characters of dyadic groups), bit-reversal permutations, logical holography, etc. These results provide a new approach for mathematical modeling of genetic structures, which uses known mathematical formalisms from technological fields of noise-immunity coding of information, binary analysis, logical holography, and digital devices of artificial intelligence. Some opportunities for the development of algebraic-logical biology are thereby opened.

  20. The Graph, Geometry and Symmetries of the Genetic Code with Hamming Metric

    Directory of Open Access Journals (Sweden)

    Reijer Lenstra

    2015-07-01

    Full Text Available The similarity patterns of the genetic code result from similar codons encoding similar messages. We develop a new mathematical model to analyze these patterns. The physicochemical characteristics of amino acids objectively quantify their differences and similarities; the Hamming metric does the same for the 64 codons of the codon set. (Hamming distances equal the number of different codon positions: AAA and AAC are at 1-distance; codons are maximally at 3-distance.) The CodonPolytope, a 9-dimensional geometric object, is spanned by 64 vertices that represent the codons, and the Euclidean distances between these vertices correspond one-to-one with intercodon Hamming distances. The CodonGraph represents the vertices and edges of the polytope; each edge equals a Hamming 1-distance. The mirror reflection symmetry group of the polytope is isomorphic to the largest permutation symmetry group of the codon set that preserves Hamming distances. These groups contain 82,944 symmetries. Many polytope symmetries coincide with the degeneracy and similarity patterns of the genetic code. These code symmetries are strongly related to the face structure of the polytope, with smaller faces displaying stronger code symmetries. Splitting the polytope stepwise into smaller faces models an early evolution of the code that generates this hierarchy of code symmetries. The canonical code represents a class of 41,472 codes with equivalent symmetries; a single class among an astronomical number of symmetry classes comprising all possible codes.
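A minimal sketch of the CodonGraph's combinatorics: 64 codon vertices, with an edge wherever two codons differ at exactly one position. Each codon has 3 × 3 = 9 Hamming-1 neighbours (3 positions, 3 alternative bases each), giving 64 · 9 / 2 = 288 edges:

```python
from itertools import product

# All 64 codons over the RNA alphabet
CODONS = ["".join(p) for p in product("UCAG", repeat=3)]

def hamming(c1, c2):
    """Number of positions at which two codons differ."""
    return sum(a != b for a, b in zip(c1, c2))

# CodonGraph edges: unordered pairs of codons at Hamming distance 1
edges = [(c1, c2) for i, c1 in enumerate(CODONS)
         for c2 in CODONS[i + 1:] if hamming(c1, c2) == 1]

degree = {c: 0 for c in CODONS}
for c1, c2 in edges:
    degree[c1] += 1
    degree[c2] += 1

print(len(CODONS), len(edges))               # 64 288
assert all(d == 9 for d in degree.values())  # 3 positions x 3 alternatives
```

This 9-regular graph on 64 vertices is exactly the edge skeleton of the 9-dimensional CodonPolytope described in the record.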

  1. Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?

    Science.gov (United States)

    Kubyshkin, Vladimir; Budisa, Nediljko

    2017-08-01

    The main goal of synthetic biology (SB) is the creation of biodiversity applicable for biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcomes the limitation of the canonical amino acid repertoire of 20 (+2) amino acids prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e., an orthogonality that prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from the recent proposal of Hartman and Smith about the establishment of the genetic code in the RNA world, the authors here map possible biotechnological invasion points for the engineering of bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. The Impact of Diagnostic Code Misclassification on Optimizing the Experimental Design of Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Steven J. Schrodi

    2017-01-01

    Full Text Available Diagnostic codes within electronic health record systems can vary widely in accuracy. It has been noted that the number of instances of a particular diagnostic code monotonically increases with the accuracy of disease phenotype classification. As a growing number of health system databases become linked with genomic data, it is critically important to understand the effect of this misclassification on the power of genetic association studies. Here, I investigate the impact of this diagnostic code misclassification on the power of genetic association studies with the aim of better informing experimental designs that use health informatics data. The trade-off between (i) reduced misclassification rates from utilizing additional instances of a diagnostic code per individual and (ii) the resulting smaller sample size is explored, and general rules are presented to improve experimental designs.
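The trade-off the abstract describes can be sketched with a simple normal-approximation power calculation in which a fraction of labelled cases are actually controls, attenuating the observed allele-frequency difference. The function name, the misclassification model, and the α = 0.05 threshold (z ≈ 1.96) are illustrative assumptions, not the paper's exact framework:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_prop(p1, p0, n_case, n_ctrl, misclass=0.0, z_alpha=1.96):
    """Approximate power of a two-proportion z-test for an allele-frequency
    difference between cases (p1) and controls (p0), when a fraction
    `misclass` of labelled cases are really controls (non-differential
    phenotype misclassification)."""
    p1_obs = (1 - misclass) * p1 + misclass * p0   # attenuated case frequency
    se = sqrt(p1_obs * (1 - p1_obs) / n_case + p0 * (1 - p0) / n_ctrl)
    return phi(abs(p1_obs - p0) / se - z_alpha)

# Power shrinks as the misclassification rate grows, at fixed sample size.
for m in (0.0, 0.2, 0.4):
    print(m, round(power_two_prop(0.25, 0.20, 2000, 2000, misclass=m), 3))
```

Requiring more instances of a diagnostic code per individual lowers `misclass` but also shrinks `n_case`; the abstract's design question is where that trade-off maximizes power.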

  3. A Novel Real-coded Quantum-inspired Genetic Algorithm and Its Application in Data Reconciliation

    Directory of Open Access Journals (Sweden)

    Gao Lin

    2012-06-01

    Full Text Available Traditional quantum-inspired genetic algorithms (QGA) have drawbacks such as premature convergence, heavy computational cost, and complicated coding and decoding processes. In this paper, a novel real-coded quantum-inspired genetic algorithm is proposed based on an interval-division scheme. Detailed comparisons with similar approaches on standard benchmark functions validate the proposed algorithm. In addition, the proposed algorithm is applied to two typical nonlinear data reconciliation problems (a distillation process and an extraction process), and simulation results show its efficiency in nonlinear data reconciliation problems.

  4. Open Genetic Code: on open source in the life sciences.

    Science.gov (United States)

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  5. Frozen Accident Pushing 50: Stereochemistry, Expansion, and Chance in the Evolution of the Genetic Code.

    Science.gov (United States)

    Koonin, Eugene V

    2017-05-23

    Nearly 50 years ago, Francis Crick propounded the frozen accident scenario for the evolution of the genetic code, along with the hypothesis that the early translation system consisted primarily of RNA. Under the frozen accident perspective, the code is universal among modern life forms because any change in codon assignment would be highly deleterious. The frozen accident can be considered the default theory of code evolution because it does not imply any specific interactions between amino acids and the cognate codons or anticodons, or any particular properties of the code. The subsequent 49 years of code studies have elucidated notable features of the standard code, such as high robustness to errors, but failed to develop a compelling explanation for codon assignments. In particular, stereochemical affinity between amino acids and the cognate codons or anticodons does not seem to account for the origin and evolution of the code. Here, I expand Crick's hypothesis on the RNA-only translation system by presenting evidence that this early translation system already attained high fidelity, allowing protein evolution. I outline an experimentally testable scenario for the evolution of the code that combines a distinct version of the stereochemical hypothesis, in which amino acids are recognized via unique sites in the tertiary structure of proto-tRNAs, rather than by anticodons, expansion of the code via proto-tRNA duplication, and the frozen accident.

  6. [Direct genetic manipulation and criminal code in Venezuela: absolute criminal law void?].

    Science.gov (United States)

    Cermeño Zambrano, Fernando G De J

    2002-01-01

    The judicial regulation of genetic biotechnology applied to the human genome is currently of great relevance in Venezuela, owing to the drafting of an innovative bioethics law in the country's parliament. This article highlights the constitutional provisions of Venezuela's 1999 Constitution regarding this subject, as they establish the framework within which the matter will be legally regulated. The article approaches genetic biotechnology applied to the human genome from the standpoint of Venezuelan penal law, highlighting the violent genetic manipulations that have criminal relevance. Genetic biotechnology applied to the human genome has gained further relevance as a consequence of the reformulation of the Venezuelan Penal Code under discussion in the country's National Assembly. Therefore, a concise study of the country's Penal Code is made in this article to better understand which legal interests have been protected by Venezuelan penal legislation. This last step enables us to identify the punitive tools Venezuela has available to confront direct genetic manipulations. We equally indicate the existing punitive loophole, which should be closed by the penal legislator. In conclusion, this essay concerns criminal policy regarding direct genetic manipulations of the human genome that have not been typified in Venezuelan law, thereby revealing a genetic-biotechnology paradise.

  7. Unassigned Codons, Nonsense Suppression, and Anticodon Modifications in the Evolution of the Genetic Code

    NARCIS (Netherlands)

    P.T.S. van der Gulik (Peter); W.D. Hoff (Wouter)

    2011-01-01

    The origin of the genetic code is a central open problem regarding the early evolution of life. Here, we consider two undeveloped but important aspects of possible scenarios for the evolutionary pathway of the translation machinery: the role of unassigned codons in early stages

  8. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method in which a real-coded quantum-inspired genetic algorithm (RQGA) optimizes the weights and thresholds of a BP neural network is proposed to overcome the defect that the gradient-descent method easily falls into local optima during learning. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the coding and decoding processes slow down the computation. Therefore, RQGA is introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm rapidly converges to solutions that satisfy the constraint conditions.

  9. System level ESD protection

    CERN Document Server

    Vashchenko, Vladislav

    2014-01-01

    This book addresses key aspects of analog integrated circuits and systems design related to system level electrostatic discharge (ESD) protection.  It is an invaluable reference for anyone developing systems-on-chip (SoC) and systems-on-package (SoP), integrated with system-level ESD protection. The book focuses on both the design of semiconductor integrated circuit (IC) components with embedded, on-chip system level protection and IC-system co-design. The readers will be enabled to bring the system level ESD protection solutions to the level of integrated circuits, thereby reducing or completely eliminating the need for additional, discrete components on the printed circuit board (PCB) and meeting system-level ESD requirements. The authors take a systematic approach, based on IC-system ESD protection co-design. A detailed description of the available IC-level ESD testing methods is provided, together with a discussion of the correlation between IC-level and system-level ESD testing methods. The IC-level ESD...

  10. Towards A Genetic Business Code For Growth in the South African Transport Industry

    Directory of Open Access Journals (Sweden)

    J.H. Vermeulen

    2003-11-01

    Full Text Available As with each living organism, it is proposed that an organisation possesses a genetic code. In the fast-changing business environment it would be invaluable to know what constitutes organisational growth and success in terms of such a code. To identify this genetic code a quantitative methodological framework, supplemented by a qualitative approach, was used and the views of top management in the Transport Industry were solicited. The Repertory Grid was used as the primary data-collection method. Through a phased data-analysis process an integrated profile of first- and second-order constructs, and opposite poles, was compiled. By utilising deductive and inductive strategies three strands of a Genetic Business Growth Code were identified, namely a Leadership Strand, Organisational Architecture Strand and Internal Orientation Strand. The study confirmed the value of a Genetic Business Code for growth in the Transport Industry. Summary (translated from Afrikaans): It is proposed that an organisation, like every living organism, possesses a genetic code. In the fast-changing business environment it would be invaluable to know what causes organisational growth and success. A quantitative methodological framework, supplemented by a qualitative approach, was used to identify this genetic code, and the views of top management in the Transport Industry were gathered with the Repertory Grid as the principal data-collection method. An integrated profile of first- and second-order constructs, with their opposite poles, was compiled. Three strands of a Genetic Business Growth Code, namely a Leadership Strand, an Organisational Architecture Strand and an Internal Orientation Strand, were identified using deductive and inductive strategies. The study confirms the value of a Genetic Business Code for growth in the Transport Industry.

  11. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  12. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  13. Interdependence, Reflexivity, Fidelity, Impedance Matching, and the Evolution of Genetic Coding

    Science.gov (United States)

    Carter, Charles W; Wills, Peter R

    2018-01-01

    Abstract Genetic coding is generally thought to have required ribozymes whose functions were taken over by polypeptide aminoacyl-tRNA synthetases (aaRS). Two discoveries about aaRS and their interactions with tRNA substrates now furnish a unifying rationale for the opposite conclusion: that the key processes of the Central Dogma of molecular biology emerged simultaneously and naturally from simple origins in a peptide•RNA partnership, eliminating the epistemological utility of a prior RNA world. First, the two aaRS classes likely arose from opposite strands of the same ancestral gene, implying a simple genetic alphabet. The resulting inversion symmetries in aaRS structural biology would have stabilized the initial and subsequent differentiation of coding specificities, rapidly promoting diversity in the proteome. Second, amino acid physical chemistry maps onto tRNA identity elements, establishing reflexive, nanoenvironmental sensing in protein aaRS. Bootstrapping of increasingly detailed coding is thus intrinsic to polypeptide aaRS, but impossible in an RNA world. These notions underline the following concepts that contradict gradual replacement of ribozymal aaRS by polypeptide aaRS: 1) aaRS enzymes must be interdependent; 2) reflexivity intrinsic to polypeptide aaRS production dynamics promotes bootstrapping; 3) takeover of RNA-catalyzed aminoacylation by enzymes will necessarily degrade specificity; and 4) the Central Dogma’s emergence is most probable when replication and translation error rates remain comparable. These characteristics are necessary and sufficient for the essentially de novo emergence of a coupled gene–replicase–translatase system of genetic coding that would have continuously preserved the functional meaning of genetically encoded protein genes whose phylogenetic relationships match those observed today. PMID:29077934

  14. Analysis of genetic code ambiguity arising from nematode-specific misacylated tRNAs.

    Directory of Open Access Journals (Sweden)

    Kiyofumi Hamashima

    Full Text Available The faithful translation of the genetic code requires the highly accurate aminoacylation of transfer RNAs (tRNAs). However, it has been shown that nematode-specific V-arm-containing tRNAs (nev-tRNAs) are misacylated with leucine in vitro in a manner that transgresses the genetic code. nev-tRNA-Gly(CCC) and nev-tRNA-Ile(UAU), which are the major nev-tRNA isotypes, could theoretically decode the glycine (GGG) codon and the isoleucine (AUA) codon as leucine, causing GGG and AUA codon ambiguity in nematode cells. To test this hypothesis, we investigated the functionality of nev-tRNAs and their impact on the proteome of Caenorhabditis elegans. Analysis of the nucleotide sequences in the 3' end regions of the nev-tRNAs showed that they had matured correctly, with the addition of CCA, which is a crucial posttranscriptional modification required for tRNA aminoacylation. The nuclear export of nev-tRNAs was confirmed with an analysis of their subcellular localization. These results show that nev-tRNAs are processed to their mature forms like common tRNAs and are available for translation. However, a whole-cell proteome analysis found no detectable level of nev-tRNA-induced mistranslation in C. elegans cells, suggesting that the genetic code is not ambiguous, at least under normal growth conditions. Our findings indicate that the translational fidelity of the nematode genetic code is strictly maintained, contrary to our expectations, although deviant tRNAs with misacylation properties are highly conserved in the nematode genome.

  15. Systems-level approaches reveal conservation of trans-regulated genes in the rat and genetic determinants of blood pressure in humans

    Czech Academy of Sciences Publication Activity Database

    Langley, S. R.; Bottolo, L.; Kuneš, Jaroslav; Zicha, Josef; Zídek, Václav; Hubner, N.; Cook, S.A.; Pravenec, Michal; Aitman, T. J.; Petretto, E.

    2013-01-01

    Roč. 97, č. 4 (2013), s. 653-665 ISSN 0008-6363 R&D Projects: GA MŠk(CZ) LH11049; GA MŠk(CZ) LL1204; GA MŠk(CZ) 7E10067 Institutional support: RVO:67985823 Keywords : integrative genomics * expression QTLs * time series analysis * trans-acting regulation * genome-wide association studies Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 5.808, year: 2013

  16. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
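
    As a rough illustration of the core computation such a tool performs, the sketch below expands an IUPAC degenerate codon into its concrete codons and the amino acids they encode under the standard code (transl_table=1). The tables are standard; the function names are ours, not ANT's API:

```python
# Expand IUPAC degenerate codons and list the amino acids they encode.
# Illustrative sketch of what a degenerate-codon tool computes; not ANT's API.
from itertools import product

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

# Standard genetic code (NCBI transl_table=1), codons ordered T,C,A,G per position.
BASES = "TCAG"
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {b1 + b2 + b3: aa for (b1, b2, b3), aa
               in zip(product(BASES, repeat=3), AAS)}

def expand(degenerate_codon):
    """All concrete codons matched by a degenerate codon such as 'NNK'."""
    return ["".join(c) for c in product(*(IUPAC[b] for b in degenerate_codon))]

def encoded_amino_acids(degenerate_codon):
    """Sorted set of amino acid symbols ('*' = stop) the codon can encode."""
    return sorted({CODON_TABLE[c] for c in expand(degenerate_codon)})

nnk_codons = expand("NNK")            # the classic library-design codon
nnk_aas = encoded_amino_acids("NNK")  # all 20 amino acids plus one stop
```

    Supporting an expanded or user-defined genetic code then amounts to swapping `CODON_TABLE` for a different translation table.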

  17. Inclusion of the fitness sharing technique in an evolutionary algorithm to analyze the fitness landscape of the genetic code adaptability.

    Science.gov (United States)

    Santos, José; Monteagudo, Ángel

    2017-03-27

    The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but how it evolved towards its current form has not been clearly determined. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code lies in an area corresponding to a deep local minimum to be easily determined, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature assumed by the error minimization theory does not explain why the canonical code ended its evolution in a location which is not a localized deep minimum of the huge fitness landscape.
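
    The fitness sharing technique mentioned above can be illustrated with a minimal sketch: raw fitness is divided by a niche count, so individuals in crowded regions of the landscape are penalized and the population spreads over multiple optima. The kernel and parameter values are illustrative assumptions, not the paper's settings:

```python
# Fitness sharing: divide raw fitness by the niche count, i.e. the summed
# similarity to all population members within a sharing radius.

def sharing(distance, sigma_share=1.0, alpha=1.0):
    """Triangular sharing kernel: 1 at d=0, falling to 0 beyond sigma_share."""
    return 1 - (distance / sigma_share) ** alpha if distance < sigma_share else 0.0

def shared_fitness(population, raw_fitness, distance):
    out = []
    for i in population:
        niche = sum(sharing(distance(i, j)) for j in population)  # >= 1 (self term)
        out.append(raw_fitness(i) / niche)
    return out

# Toy example: three clustered points and one isolated point, equal raw fitness.
pop = [0.0, 0.1, 0.2, 5.0]
fits = shared_fitness(pop, raw_fitness=lambda x: 1.0,
                      distance=lambda a, b: abs(a - b))
# The isolated individual keeps its full fitness; the clustered ones share theirs.
```

    In the paper's setting, the "distance" would be a distance between hypothetical genetic codes, which is what lets the algorithm probe whether the canonical code sits in a deep, isolated minimum.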

  18. A novel nuclear genetic code alteration in yeasts and the evolution of codon reassignment in eukaryotes.

    Science.gov (United States)

    Mühlhausen, Stefanie; Findeisen, Peggy; Plessmann, Uwe; Urlaub, Henning; Kollmar, Martin

    2016-07-01

    The genetic code is the cellular translation table for the conversion of nucleotide sequences into amino acid sequences. Changes to the meaning of sense codons would introduce errors into almost every translated message and are expected to be highly detrimental. However, reassignment of single or multiple codons in mitochondria and nuclear genomes, although extremely rare, demonstrates that the code can evolve. Several models for the mechanism of alteration of nuclear genetic codes have been proposed (including "codon capture," "genome streamlining," and "ambiguous intermediate" theories), but with little resolution. Here, we report a novel sense codon reassignment in Pachysolen tannophilus, a yeast related to the Pichiaceae. By generating proteomics data and using tRNA sequence comparisons, we show that Pachysolen translates CUG codons as alanine and not as the more usual leucine. The Pachysolen tRNA(CAG) is an anticodon-mutated tRNA(Ala) containing all major alanine tRNA recognition sites. The polyphyly of the CUG-decoding tRNAs in yeasts is best explained by a tRNA-loss-driven codon reassignment mechanism. Loss of the CUG-tRNA in the ancient yeast is followed by gradual decrease of the respective codons and subsequent codon capture by tRNAs whose anticodon is not part of the aminoacyl-tRNA synthetase recognition region. Our hypothesis applies to all nuclear genetic code alterations and provides several testable predictions. We anticipate more codon reassignments to be uncovered in existing and upcoming genome projects. © 2016 Mühlhausen et al.; Published by Cold Spring Harbor Laboratory Press.

  19. Advanced Design of Dumbbell-shaped Genetic Minimal Vectors Improves Non-coding and Coding RNA Expression.

    Science.gov (United States)

    Jiang, Xiaoou; Yu, Han; Teo, Cui Rong; Tan, Genim Siu Xian; Goh, Sok Chin; Patel, Parasvi; Chua, Yiqiang Kevin; Hameed, Nasirah Banu Sahul; Bertoletti, Antonio; Patzel, Volker

    2016-09-01

    Dumbbell-shaped DNA minimal vectors lacking nontherapeutic genes and bacterial sequences are considered a stable, safe alternative to viral, nonviral, and naked plasmid-based gene-transfer systems. We investigated novel molecular features of dumbbell vectors aiming to reduce vector size and to improve the expression of noncoding or coding RNA. We minimized small hairpin RNA (shRNA) or microRNA (miRNA) expressing dumbbell vectors in size down to 130 bp generating the smallest genetic expression vectors reported. This was achieved by using a minimal H1 promoter with integrated transcriptional terminator transcribing the RNA hairpin structure around the dumbbell loop. Such vectors were generated with high conversion yields using a novel protocol. Minimized shRNA-expressing dumbbells showed accelerated kinetics of delivery and transcription leading to enhanced gene silencing in human tissue culture cells. In primary human T cells, minimized miRNA-expressing dumbbells revealed higher stability and triggered stronger target gene suppression as compared with plasmids and miRNA mimics. Dumbbell-driven gene expression was enhanced up to 56- or 160-fold by implementation of an intron and the SV40 enhancer compared with control dumbbells or plasmids. Advanced dumbbell vectors may represent one option to close the gap between durable expression that is achievable with integrating viral vectors and short-term effects triggered by naked RNA.

  20. An Order Coding Genetic Algorithm to Optimize Fuel Reloads in a Nuclear Boiling Water Reactor

    International Nuclear Information System (INIS)

    Ortiz, Juan Jose; Requena, Ignacio

    2004-01-01

    A genetic algorithm is used to optimize the nuclear fuel reload for a boiling water reactor; an order coding is proposed for the chromosomes, together with appropriate crossover and mutation operators. The fitness function was designed so that the genetic algorithm creates fuel reloads that, on the one hand, satisfy the constraints on the radial power peaking factor, the minimum critical power ratio, and the maximum linear heat generation rate while optimizing the effective multiplication factor at the beginning and end of the cycle. The values of these variables are predicted by a neural network trained on the behavior of a reactor simulator, which greatly decreases the computation time in the search process. We validated this method with data from five cycles of the Laguna Verde Nuclear Power Plant in Mexico
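
    An order coding represents each reload candidate as a permutation of fuel-assembly identifiers, so the crossover and mutation operators must preserve permutation validity. The sketch below uses the classic order crossover (OX) and a swap mutation as generic stand-ins; the paper's exact operators may differ:

```python
# Order (permutation) coding for reload chromosomes: genes stand for
# fuel-assembly identifiers, and operators must keep each gene unique.
import random

def order_crossover(p1, p2, rng):
    """OX: copy a random slice from p1, fill the rest in p2's relative order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = p1[i:j + 1]
    fill = [g for g in p2 if g not in child]  # p2's order, minus the copied slice
    k = 0
    for pos in range(n):
        if child[pos] is None:
            child[pos] = fill[k]
            k += 1
    return child

def swap_mutation(perm, rng):
    """Exchange two positions; the result is still a permutation."""
    a, b = rng.sample(range(len(perm)), 2)
    perm = perm[:]
    perm[a], perm[b] = perm[b], perm[a]
    return perm

rng = random.Random(7)
parent1 = list(range(10))            # 10 assembly positions (illustrative)
parent2 = rng.sample(range(10), 10)
child = order_crossover(parent1, parent2, rng)
```

    Because both operators return valid permutations, every chromosome the algorithm produces decodes directly to a candidate loading pattern.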

  1. The standard genetic code and its relation to mutational pressure: robustness and equilibrium criteria

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Martinez Ortiz, Carlos; Sautie Castellanos, Miguel; Valdes, Kiria; Guevara Erra, Ramon

    2004-10-01

    Under the assumption of even point mutation pressure on the DNA strand, rates for transitions from one amino acid into another were assessed. Nearly 25% of all mutations were silent. About 48% of the mutations from a given amino acid lead either into the same amino acid or into an amino acid of the same class. These results suggest a great stability of the Standard Genetic Code with respect to mutation load. Concepts from chemical equilibrium theory are applicable to this case provided that mutation rate constants are given. We found that unequal synonymous codon usage may lead to changes in the equilibrium concentrations. Data from real biological species showed that several amino acids are close to their respective equilibrium concentrations. However, in all the cases the concentration of leucine nearly doubled its equilibrium concentration, whereas for the stop command (Term) it was about 10 times lower. The overall distance from equilibrium for a set of species suggests that eukaryotes are closer to equilibrium than prokaryotes, and the HIV virus was closest to equilibrium among 15 species. We also found that contemporary species are closer to the equilibrium than the Last Universal Common Ancestor (LUCA) was. Similarly, nonpreserved regions in proteins are closer to equilibrium than the preserved ones. We suggest that this approach can be useful for exploring some aspects of biological evolution in the framework of Standard Genetic Code properties. (author)
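
    The "nearly 25% silent" figure can be checked directly from the standard codon table by enumerating all 64 × 3 × 3 = 576 single-point mutations. This is a sketch of ours, not the authors' code, with stop codons treated as a symbol of their own:

```python
# Count silent (synonymous) single-point mutations in the standard code.
from itertools import product

BASES = "TCAG"
# NCBI transl_table=1, codons ordered T,C,A,G per position.
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AAS)}

silent = total = 0
for codon, aa in CODE.items():
    for pos in range(3):
        for b in BASES:
            if b == codon[pos]:
                continue
            mutant = codon[:pos] + b + codon[pos + 1:]
            total += 1
            if CODE[mutant] == aa:  # same amino acid (or stop -> stop)
                silent += 1

fraction_silent = silent / total  # close to one quarter, as the abstract states
```

    Under uniform mutation pressure this enumeration is exactly the "even point mutation" assumption of the abstract; weighting transitions and transversions differently would refine the estimate.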

  2. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    The fuel reload of a Pressurized Water Reactor is made whenever the burnup of the fuel assemblies in the reactor core reaches a certain value such that it is no longer possible to keep the reactor critical while producing energy at nominal power. The fuel reload optimization problem consists in determining the positioning of the fuel assemblies within the core in a way that minimizes the ratio of fuel-assembly cost to maximum burnup while satisfying symmetry and safety constraints. The difficulty of the problem grows exponentially with the number of fuel assemblies in the core. For decades the problem was solved manually by experts who used their knowledge and experience to build configurations of the core and tested them to verify that the plant's safety constraints were satisfied. To reduce this burden, several optimization techniques have been used, including the binary-coded genetic algorithm. In this work we show the use of a real-coded genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys that converts the real values of the genes of each chromosome into a combination of discrete fuel assemblies for evaluation of the reload. Four recombination methods were tested: discrete recombination, intermediate recombination, linear recombination and extended linear recombination. For each of the four recombination methods, 10 tests were conducted using different seeds for the random number generator, totaling 40 tests. The results of applying the real-coded genetic algorithm to the reload of the Angra 1 PWR plant are shown. Since the best results in the literature for this problem were found by parallel PSO, we use it for comparison
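
    The random-keys mechanism described above decodes a real-valued chromosome into a permutation by ranking its genes, so any real-valued recombination always yields a valid loading pattern. A minimal sketch, with the slot count, the recombination constant, and all names as illustrative assumptions:

```python
# Random keys: a chromosome of real numbers decodes to a permutation of
# fuel-assembly slots by ranking the genes, so real-valued crossover
# never produces an invalid (duplicated) loading pattern.
import random

def decode_random_keys(keys):
    """Rank the real-valued genes; ties are broken by position."""
    return [i for i, _ in sorted(enumerate(keys), key=lambda t: t[1])]

def intermediate_recombination(a, b, rng, d=0.25):
    """One classic real-valued scheme: each child gene lies on the segment
    between the parent genes, extended by a factor d on each side."""
    return [ga + rng.uniform(-d, 1 + d) * (gb - ga) for ga, gb in zip(a, b)]

rng = random.Random(3)
mom = [rng.random() for _ in range(8)]   # 8 fuel-assembly slots (illustrative)
dad = [rng.random() for _ in range(8)]
child = intermediate_recombination(mom, dad, rng)
loading_order = decode_random_keys(child)  # always a valid permutation
```

    The other recombination schemes the abstract lists differ only in how the child's real genes are combined; the random-keys decoding step is the same for all of them.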

  3. Genetic coding and united-hypercomplex systems in the models of algebraic biology.

    Science.gov (United States)

    Petoukhov, Sergey V

    2017-08-01

    Structured alphabets of DNA and RNA, in their matrix form of representation, are connected with Walsh functions and a new type of system of multidimensional numbers. This type generalizes the systems of complex numbers and hypercomplex numbers, which serve as the basis of mathematical natural sciences and many technologies. The new systems of multidimensional numbers have interesting mathematical properties and are called, in the general case, "systems of united-hypercomplex numbers" (or briefly "U-hypercomplex numbers"). They can be widely used in models of multi-parametrical systems in the field of algebraic biology, artificial life, devices of biologically inspired artificial intelligence, etc. In particular, an application of U-hypercomplex numbers reveals hidden properties of genetic alphabets under cyclic permutations in their doublets and triplets. Special attention is devoted to the author's hypothesis of multi-linguistics in DNA sequences in relation to an ensemble of U-numerical sub-alphabets. Genetic multi-linguistics is considered an important factor in providing the noise-immunity properties of multi-channel genetic coding. Our results attest to the conformity of the algebraic properties of the U-numerical systems with the phenomenological properties of the DNA alphabets and with the complementary structure of the double DNA helix. It seems that, in the modeling field of algebraic biology, the genetic-informational organization of living bodies can be considered as a set of united-hypercomplex numbers, in some association with the famous slogan of Pythagoras, "the numbers rule the world". Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Photoactivatable Mussel-Based Underwater Adhesive Proteins by an Expanded Genetic Code.

    Science.gov (United States)

    Hauf, Matthias; Richter, Florian; Schneider, Tobias; Faidt, Thomas; Martins, Berta M; Baumann, Tobias; Durkin, Patrick; Dobbek, Holger; Jacobs, Karin; Möglich, Andreas; Budisa, Nediljko

    2017-09-19

    Marine mussels exhibit potent underwater adhesion abilities under hostile conditions by employing 3,4-dihydroxyphenylalanine (DOPA)-rich mussel adhesive proteins (MAPs). However, their recombinant production is a major biotechnological challenge. Herein, a novel strategy based on genetic code expansion has been developed by engineering efficient aminoacyl-transfer RNA synthetases (aaRSs) for the photocaged noncanonical amino acid ortho-nitrobenzyl DOPA (ONB-DOPA). The engineered ONB-DOPARS enables in vivo production of MAP type 5 site-specifically equipped with multiple instances of ONB-DOPA to yield photocaged, spatiotemporally controlled underwater adhesives. Upon exposure to UV light, these proteins feature elevated wet adhesion properties. This concept offers new perspectives for the production of recombinant bioadhesives. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. From chemical metabolism to life: the origin of the genetic coding process

    Directory of Open Access Journals (Sweden)

    Antoine Danchin

    2017-06-01

    Full Text Available Looking for origins is so rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question in accounting for life is to understand how a chemical metabolism that began with amino acids progressively shaped itself into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces as carriers of the basic metabolic pathways that drive the pursuit of life.

  6. Chromatin remodeling: the interface between extrinsic cues and the genetic code?

    Science.gov (United States)

    Ezzat, Shereen

    2008-10-01

    The successful completion of the human genome project ushered in a new era of hope and skepticism. However, the promise of finding the fundamental basis of human traits and diseases appears less than fulfilled. The original premise was that the DNA sequence of every gene would allow precise characterization of the critical differences responsible for altered cellular functions. The characterization of intragenic mutations in cancers paved the way for early screening and the design of targeted therapies. However, it has also become evident that unmasking genetic codes alone cannot explain the diversity of disease phenotypes within a population. Further, classic genetics has not been able to explain the differences that have been observed among identical twins or even cloned animals. This new reality has re-ignited interest in the field of epigenetics. While traditionally defined as heritable changes that can alter gene expression without affecting the corresponding DNA sequence, this definition has come into question. The extent to which epigenetic change can also be acquired in response to chemical stimuli represents an exciting dimension in the "nature vs nurture" debate. In this review I will describe a series of studies in my laboratory that illustrate the significance of epigenetics and its potential clinical implications.

  7. Use of fluorescent proteins and color-coded imaging to visualize cancer cells with different genetic properties.

    Science.gov (United States)

    Hoffman, Robert M

    2016-03-01

    Fluorescent proteins are very bright and available in spectrally distinct colors, enabling the imaging of color-coded cancer cells growing in vivo and therefore the distinction of cancer cells with different genetic properties. Non-invasive and intravital imaging of cancer cells with fluorescent proteins allows the visualization of distinct genetic variants of cancer cells down to the cellular level in vivo. Cancer cells with increased or decreased ability to metastasize can be distinguished in vivo. Gene exchange in vivo, which enables low-metastatic cancer cells to convert to high-metastatic ones, can be imaged in vivo with color coding. Cancer stem-like and non-stem cells can be distinguished in vivo by color-coded imaging. These properties also demonstrate the vast superiority of imaging cancer cells in vivo with fluorescent proteins over photon counting of luciferase-labeled cancer cells.

  8. Maximization Network Throughput Based on Improved Genetic Algorithm and Network Coding for Optical Multicast Networks

    Science.gov (United States)

    Wei, Chengying; Xiong, Cuilian; Liu, Huanlin

    2017-12-01

    Maximal multicast stream algorithms based on network coding (NC) can improve throughput in wavelength-division multiplexing (WDM) networks, but the result still falls far short of the network's theoretical maximum throughput, and the existing multicast stream algorithms do not determine the information distribution pattern and the routing at the same time. In this paper, an improved genetic algorithm is put forward to maximize the optical multicast throughput by NC and to determine the multicast stream distribution through hybrid chromosome construction for multicast with a single source and multiple destinations. The proposed hybrid chromosomes are constructed from binary chromosomes and integer chromosomes: the binary chromosomes represent the optical multicast routing, while the integer chromosomes indicate the multicast stream distribution. A fitness function is designed to guarantee that each destination can receive the maximum number of decodable multicast streams. The simulation results showed that the proposed method is far superior to the typical maximal multicast stream algorithms based on NC in terms of network throughput in WDM networks.

  9. Physicochemical basis for the origin of the genetic code - Lecture 3

    International Nuclear Information System (INIS)

    Ponnamperuma, C.

    1992-01-01

    A study of the association of homocodonic amino acids and selected heterocodonic amino acids with selected nucleotides in aqueous solution was undertaken to examine a possible physical basis for the origin of codon assignments. These interactions were studied using 1H nuclear magnetic resonance (NMR) spectroscopy. Association constants for the various interactions were determined by fitting the changes in the chemical shifts of the anomeric and ring protons of the nucleoside moieties, as a function of amino acid concentration, to an isotherm which described the binding interaction. The strongest associations of all homocodonic amino acids were with their respective anticodonic nucleotide sequences. The strength of association was seen to increase with increasing chain length of the anticodonic nucleotide. The association of these amino acids with different phosphate esters of nucleotides suggests that a definite isomeric structure is required for association with a specified amino acid; the 5'-mononucleotides and (3'-5')-linked dinucleotides are the favored geometries for strong associations. Use of heterocodonic amino acids and nonprotein amino acids supports these findings. We conclude that there is at least a physicochemical, anticodonic contribution to the origin of the genetic code. (author)

  10. Enhancement of combined heat and power economic dispatch using self adaptive real-coded genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Subbaraj, P. [Kalasalingam University, Srivilliputhur, Tamilnadu 626 190 (India); Rengaraj, R. [Electrical and Electronics Engineering, S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India); Salivahanan, S. [S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India)

    2009-06-15

    In this paper, a self-adaptive real-coded genetic algorithm (SARGA) is implemented to solve the combined heat and power economic dispatch (CHPED) problem. The self-adaptation is achieved by means of tournament selection along with simulated binary crossover (SBX). The selection process has a powerful exploration capability: tournaments are created between two solutions, and the better solution is chosen and placed in the mating pool, leading to better convergence and reduced computational burden. The SARGA integrates a penalty-parameterless constraint-handling strategy and handles equality and inequality constraints simultaneously. Population diversity is introduced by making use of the distribution index in the SBX operator to create better offspring. This leads to high diversity in the population, which increases the probability of reaching the global optimum and prevents premature convergence. The SARGA is applied to solve the CHPED problem with a bounded feasible operating region, which has a large number of local minima. The numerical results demonstrate that the proposed method can find a solution close to the global optimum and compares favourably with other recent methods in terms of solution quality, constraint handling and computation time. (author)
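    The two operators named in this abstract are standard in the real-coded GA literature. Below is a minimal sketch of simulated binary crossover and binary tournament selection; the distribution index `eta` is the diversity knob the abstract mentions. Variable names and the demo values are ours:

```python
import random

def sbx(x1, x2, eta=2.0):
    """Simulated binary crossover on one real-valued gene.

    A larger distribution index eta keeps offspring closer to the
    parents; a smaller eta spreads them out (more diversity).
    """
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

def tournament(pop, cost):
    # Binary tournament: the better (lower-cost) of two randomly drawn
    # solutions enters the mating pool.
    a, b = random.sample(pop, 2)
    return a if cost(a) < cost(b) else b

random.seed(0)
c1, c2 = sbx(1.0, 3.0)
print(c1, c2)  # offspring are symmetric about the parents' mean
```

    A useful sanity check on SBX is that it is mean-preserving: `c1 + c2` always equals `x1 + x2`, whatever `u` is drawn.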

  11. A Stress-Induced Bias in the Reading of the Genetic Code in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Adi Oron-Gottesman

    2016-11-01

    Full Text Available Escherichia coli mazEF is an extensively studied stress-induced toxin-antitoxin (TA) system. The toxin MazF is an endoribonuclease that cleaves RNAs at ACA sites. Thereby, under stress, the induced MazF generates a stress-induced translation machinery (STM), composed of MazF-processed mRNAs and selective ribosomes that specifically translate the processed mRNAs. Here, we further characterized the STM system, finding that MazF cleaves only ACA sites located in frame in the open reading frames of processed mRNAs, while out-of-frame ACAs are resistant. This in-frame ACA cleavage by MazF seems to depend on MazF binding to an extracellular-death-factor (EDF)-like element in ribosomal protein bS1 (bacterial S1), apparently causing MazF to be part of STM ribosomes. Furthermore, due to the in-frame MazF cleavage of ACAs under stress, a bias occurs in the reading of the genetic code, causing the amino acid threonine to be encoded only by its synonymous codons ACC, ACU, and ACG, instead of by ACA.
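    The frame dependence described in this abstract is easy to state computationally: an ACA triplet is a target only if its first base falls on a codon boundary of the open reading frame. A toy sketch (the sequence is invented for illustration, not taken from the paper):

```python
def in_frame_aca_sites(orf):
    """0-based positions of ACA triplets aligned with the reading frame."""
    return [i for i in range(0, len(orf) - 2, 3) if orf[i:i+3] == "ACA"]

def out_of_frame_aca_sites(orf):
    """Positions of ACA triplets that straddle codon boundaries."""
    return [i for i in range(len(orf) - 2)
            if orf[i:i+3] == "ACA" and i % 3 != 0]

# ATG ACA GGA ACA TAA: ACA occurs as the 2nd and 4th codon.
orf = "ATGACAGGAACATAA"
print(in_frame_aca_sites(orf))   # → [3, 9]
print(out_of_frame_aca_sites(orf))  # → []
```

    Under the reported bias, the two in-frame ACA codons above would be cleavage-sensitive, so a stress-translated threonine would instead have to be encoded by ACC, ACU, or ACG.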

  12. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    Science.gov (United States)

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

    One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a reasonable time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EA) are one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the costs of amino acid replacement with regard to their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure
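    As a concrete illustration of the kind of operator involved, here is the standard position-based crossover on permutations (the most restricted code model treats a code as an assignment, i.e. a permutation-like structure). This is a sketch of the generic operator, not the authors' exact adaptation:

```python
import random

def position_based_crossover(p1, p2):
    """Standard position-based crossover for permutations.

    A random subset of positions is copied from parent p1; the remaining
    positions are filled with p2's genes in p2's order, skipping genes
    already fixed, so the child is again a valid permutation.
    """
    n = len(p1)
    keep = {i for i in range(n) if random.random() < 0.5}
    child = [p1[i] if i in keep else None for i in range(n)]
    fixed = {p1[i] for i in keep}
    fill = (g for g in p2 if g not in fixed)
    return [g if g is not None else next(fill) for g in child]

random.seed(7)
p1 = list(range(8))
p2 = p1[::-1]
print(position_based_crossover(p1, p2))  # a valid mix of both orderings
```

    The operator never duplicates or drops a gene, which is exactly the invariant a genetic code model with one amino acid per codon block requires.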

  13. Orion: Detecting regions of the human non-coding genome that are intolerant to variation using population genetics.

    Science.gov (United States)

    Gussow, Ayal B; Copeland, Brett R; Dhindsa, Ryan S; Wang, Quanli; Petrovski, Slavé; Majoros, William H; Allen, Andrew S; Goldstein, David B

    2017-01-01

    There is broad agreement that genetic mutations occurring outside of the protein-coding regions play a key role in human disease. Despite this consensus, we are not yet capable of discerning which portions of non-coding sequence are important in the context of human disease. Here, we present Orion, an approach that detects regions of the non-coding genome that are depleted of variation, suggesting that the regions are intolerant of mutations and subject to purifying selection in the human lineage. We show that Orion is highly correlated with known intolerant regions as well as regions that harbor putatively pathogenic variation. This approach provides a mechanism to identify pathogenic variation in the human non-coding genome and will have immediate utility in the diagnostic interpretation of patient genomes and in large case control studies using whole-genome sequences.

  14. Quantum Genetics in terms of Quantum Reversible Automata and Quantum Computation of Genetic Codes and Reverse Transcription

    CERN Document Server

    Baianu,I C

    2004-01-01

    The concepts of quantum automata and quantum computation are studied in the context of quantum genetics and genetic networks with nonlinear dynamics. In previous publications (Baianu,1971a, b) the formal concept of quantum automaton and quantum computation, respectively, were introduced and their possible implications for genetic processes and metabolic activities in living cells and organisms were considered. This was followed by a report on quantum and abstract, symbolic computation based on the theory of categories, functors and natural transformations (Baianu,1971b; 1977; 1987; 2004; Baianu et al, 2004). The notions of topological semigroup, quantum automaton, or quantum computer, were then suggested with a view to their potential applications to the analogous simulation of biological systems, and especially genetic activities and nonlinear dynamics in genetic networks. Further, detailed studies of nonlinear dynamics in genetic networks were carried out in categories of n-valued, Lukasiewicz Logic Algebra...

  15. The aminoacyl-tRNA synthetases had only a marginal role in the origin of the organization of the genetic code: Evidence in favor of the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2017-11-07

    The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules on which, according to this theory, the biosynthetic transformations between amino acids occurred. This mechanism makes a prediction about the role the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules, because amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. Despite the fact that ARSs are responsible for the first interaction between a component of nucleic acids and one of proteins, for the coevolution theory the role of ARSs should have been entirely marginal in the origin of the genetic code. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs and of their subclasses in the genetic code table, in order to perform a falsification test of the coevolution theory. Indeed, if the distribution of ARSs within the genetic code were highly significant, then the coevolution theory would be falsified, since the mechanism on which it is based does not predict a fundamental role for ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal

  16. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    International Nuclear Information System (INIS)

    Binh, Do Quang; Huy, Ngo Quang; Hai, Nguyen Hoang

    2014-01-01

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.
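    The weighted-sum scalarization mentioned in this abstract collapses the two reloading objectives (maximize the effective multiplication factor, minimize the power peaking factor) into one score whose weighting factor the algorithm itself searches over. The form and the numbers below are illustrative assumptions, not the paper's exact normalization:

```python
def weighted_sum_fitness(k_eff, ppf, w):
    """Illustrative weighted-sum score for a loading pattern.

    Higher is better: we reward a high effective multiplication factor
    k_eff and penalize a high power peaking factor ppf. The weighting
    factor w in [0, 1] is itself a search variable in the paper's GA.
    """
    assert 0.0 <= w <= 1.0
    return w * k_eff - (1.0 - w) * ppf

# For any fixed weight, a pattern with higher k_eff and lower peaking
# scores better:
print(weighted_sum_fitness(1.05, 1.3, 0.5)
      > weighted_sum_fitness(1.02, 1.5, 0.5))  # → True
```

    In the paper, operational and safety constraints would additionally filter or penalize candidate patterns before this score is compared.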

  17. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    Energy Technology Data Exchange (ETDEWEB)

    Binh, Do Quang [University of Technical Education Ho Chi Minh City (Viet Nam); Huy, Ngo Quang [University of Industry Ho Chi Minh City (Viet Nam); Hai, Nguyen Hoang [Centre for Research and Development of Radiation Technology, Ho Chi Minh City (Viet Nam)

    2014-12-15

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.

  18. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    Science.gov (United States)

    De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.

    2013-02-01

    We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need for human intervention and bias. The high level of automation makes it an ideal tool to use on larger sets of observed data.

  19. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  20. Space elevator systems level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Laubscher, B. E. (Bryan E.)

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems, so the successful construction of the SE requires a significant amount of development. This in turn implies a high level of risk for the SE. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. This paper presents a systems level analysis of the SE, subdividing its components into their subsystems to determine their level of technological maturity. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  1. Breaking the code: Statistical methods and methodological issues in psychiatric genetics

    NARCIS (Netherlands)

    Stringer, S.

    2015-01-01

    The genome-wide association (GWA) era has confirmed the heritability of many psychiatric disorders, most notably schizophrenia. Thousands of genetic variants with individually small effect sizes cumulatively constitute a large contribution to the heritability of psychiatric disorders. This thesis

  2. [Assisted reproduction and artificial insemination and genetic manipulation in the Criminal Code of the Federal District, Mexico].

    Science.gov (United States)

    Brena Sesma, Ingrid

    2004-01-01

    This article outlines and comments on the recent modifications to the Penal Code for the Federal District of Mexico, which establish, for the first time, crimes related to artificial procreation and genetic manipulation. It also refers to the interaction of the new legal texts with the country's health legislation. As will be shown, in some cases there are conflicts between the penal and the health regulations, and some points related to the lawfulness or unlawfulness of a conduct remain insufficiently developed. These gaps will complicate the application of the new rules of the Penal Code of the Federal District.

  3. Genetic variants in long non-coding RNA MIAT contribute to risk of paranoid schizophrenia in a Chinese Han population.

    Science.gov (United States)

    Rao, Shu-Quan; Hu, Hui-Ling; Ye, Ning; Shen, Yan; Xu, Qi

    2015-08-01

    The heritability of schizophrenia has been reported to be as high as ~80%, but the contribution of the genetic variants identified to this heritability remains to be estimated. Long non-coding RNAs (lncRNAs) are involved in multiple processes critical to normal cellular function, and dysfunction of the lncRNA MIAT may contribute to the pathophysiology of schizophrenia. However, genetic evidence of lncRNA involvement in schizophrenia has not been documented. Here, we conducted a two-stage association analysis on 8 tag SNPs that cover the whole MIAT locus in two independent Han Chinese schizophrenia case-control cohorts (discovery sample from Shanxi Province: 1093 patients with paranoid schizophrenia and 1180 control subjects; replication cohort from Jilin Province: 1255 cases and 1209 healthy controls). In the discovery stage, a significant genetic association with paranoid schizophrenia was observed for rs1894720 (χ(2)=74.20, P=7.1E-18), of which the minor allele (T) had an OR of 1.70 (95% CI=1.50-1.91). This association was confirmed in the replication cohort (χ(2)=22.66, P=1.9E-06, OR=1.32, 95%CI 1.18-1.49). In addition, a weak genotypic association was detected for rs4274 (χ(2)=4.96, df=2, P=0.03); the AA carriers showed increased disease risk (OR=1.30, 95%CI=1.03-1.64). No significant association was found between any haplotype and paranoid schizophrenia. The present study showed that lncRNA MIAT is a novel susceptibility gene for paranoid schizophrenia in the Chinese Han population. Considering that most lncRNAs locate in non-coding regions, our result may explain why most susceptibility loci for schizophrenia identified by genome-wide association studies lie outside coding regions. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. An enhancement of selection and crossover operations in real-coded genetic algorithm for large-dimensionality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Noh Sung; Lee, Jongsoo [Yonsei University, Seoul (Korea, Republic of)

    2016-01-15

    The present study aims to implement a new selection method and a novel crossover operation in a real-coded genetic algorithm. The proposed selection method facilitates the establishment of a successively evolved population by combining several subpopulations: an elitist subpopulation, an offspring subpopulation and a mutated subpopulation. A probabilistic crossover is performed based on a measure of probabilistic distance between the individuals. The concept of 'allowance' is suggested to describe the level of variance in the crossover operation. A number of nonlinear/non-convex functions and engineering optimization problems are explored to verify the capacities of the proposed strategies. The results are compared with those obtained from other genetic and nature-inspired algorithms.
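    The subpopulation-combining selection described above can be sketched generically: the next generation merges an elitist subpopulation, an offspring subpopulation, and a mutated subpopulation. The proportions and the choice to mutate the elites are our assumptions for illustration; the paper's exact scheme may differ:

```python
import random

def next_generation(pop, cost, crossover, mutate, elite_frac=0.2):
    """One generation built from three combined subpopulations."""
    ranked = sorted(pop, key=cost)                 # best (lowest cost) first
    n_elite = max(1, int(elite_frac * len(pop)))
    elite = ranked[:n_elite]                       # elitist subpopulation
    offspring = [crossover(*random.sample(ranked, 2))  # offspring subpopulation
                 for _ in range(len(pop) - 2 * n_elite)]
    mutated = [mutate(x) for x in elite]           # mutated subpopulation
    return elite + offspring + mutated
```

    A call such as `next_generation(pop, cost=f, crossover=lambda a, b: (a + b) / 2, mutate=lambda x: x + eps)` keeps the population size constant while preserving the best solutions verbatim.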

  5. System-level musings about system-level science (Invited)

    Science.gov (United States)

    Liu, W.

    2009-12-01

    In teleology, a system has a purpose. In physics, a system has a tendency. For example, a mechanical system has a tendency to lower its potential energy. A thermodynamic system has a tendency to increase its entropy. Therefore, if geospace is seen as a system, what is its tendency? Surprisingly or not, there is no simple answer to this question. Or, to flip the statement, the answer is complex, or complexity. We can understand generally why complexity arises, as the geospace boundary is open to influences from the solar wind and Earth's atmosphere, and components of the system couple to each other in a myriad of ways to make the systemic behavior highly nonlinear. But this still begs the question: What is the system-level approach to geospace science? A reductionist view might assert that as our understanding of a component or subsystem progresses to a certain point, we can couple some together to understand the system on a higher level. However, in practice, a subsystem can almost never be observed in isolation from the others. Even if it could be, there is no guarantee that the subsystem behavior will not change when coupled to others. Hence, there is no guarantee that a subsystem, such as the ring current, has an innate and intrinsic behavior like a hydrogen atom. An absolutist conclusion from this logic can be sobering, as one would have to trace a flash of aurora to the nucleosynthesis in the solar core. The practical answer, however, is more promising; it is a mix of the common sense we call reductionism and awareness that, especially when strongly coupled, subsystems can experience behavioral changes, breakdowns, and catastrophes. If the stock answer to the systemic tendency of geospace is complexity, the objective of the system-level approach to geospace science is to define, measure, and understand this complexity. I will use the example of magnetotail dynamics to illuminate some key points in this talk.

  6. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code

    Science.gov (United States)

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-01

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNALysUUU with hypermodified 5-methylaminomethyl-2-thiouridine (mnm5s2U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine-pyrimidine mismatches. We show that mnm5s2U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism.

  7. Remediating Viking Origins: Genetic Code as Archival Memory of the Remote Past.

    Science.gov (United States)

    Scully, Marc; King, Turi; Brown, Steven D

    2013-10-01

    This article introduces some early data from the Leverhulme Trust-funded research programme, 'The Impact of the Diasporas on the Making of Britain: evidence, memories, inventions'. One of the interdisciplinary foci of the programme, which incorporates insights from genetics, history, archaeology, linguistics and social psychology, is to investigate how genetic evidence of ancestry is incorporated into identity narratives. In particular, we investigate how 'applied genetic history' shapes individual and familial narratives, which are then situated within macro-narratives of the nation and collective memories of immigration and indigenism. It is argued that the construction of genetic evidence as a 'gold standard' about 'where you really come from' involves a remediation of cultural and archival memory, in the construction of a 'usable past'. This article is based on initial questionnaire data from a preliminary study of those attending DNA collection sessions in northern England. It presents some early indicators of the perceived importance of being of Viking descent among participants, notes some emerging patterns and considers the implications for contemporary debates on migration, belonging and local and national identity.

  8. Innovation of genetic algorithm code GenA for WWER fuel loading optimization

    International Nuclear Information System (INIS)

    Sustek, J.

    2005-01-01

    One of the stochastic search techniques - genetic algorithms - was recently used for optimization of the arrangement of fuel assemblies (FA) in the cores of WWER-440 and WWER-1000 reactors. The basic algorithm was modified by incorporation of the SPEA scheme. Both were enhanced and some results are presented (Authors)

  9. Common and Rare Coding Genetic Variation Underlying the Electrocardiographic PR Interval

    DEFF Research Database (Denmark)

    Lin, Honghuang; van Setten, Jessica; Smith, Albert V

    2018-01-01

    BACKGROUND: Electrical conduction from the cardiac sinoatrial node to the ventricles is critical for normal heart function. Genome-wide association studies have identified more than a dozen common genetic loci that are associated with PR interval. However, it is unclear whether rare and low-frequ...

  10. Hologenomics: Systems-Level Host Biology.

    Science.gov (United States)

    Theis, Kevin R

    2018-01-01

    The hologenome concept of evolution is a hypothesis explaining host evolution in the context of the host microbiomes. As a hypothesis, it needs to be evaluated, especially with respect to the extent of fidelity of transgenerational coassociation of host and microbial lineages and the relative fitness consequences of repeated associations within natural holobiont populations. Behavioral ecologists are in a prime position to test these predictions because they typically focus on animal phenotypes that are quantifiable, conduct studies over multiple generations within natural animal populations, and collect metadata on genetic relatedness and relative reproductive success within these populations. Regardless of the conclusion on the hologenome concept as an evolutionary hypothesis, a hologenomic perspective has applied value as a systems-level framework for host biology, including in medicine. Specifically, it emphasizes investigating the multivarious and dynamic interactions between patient genomes and the genomes of their diverse microbiota when attempting to elucidate etiologies of complex, noninfectious diseases.

  11. Dynamics of genetic variation at gliadin-coding loci in bread wheat cultivars developed in Small Grains Research Center (Kragujevac) during the last 35 years

    Directory of Open Access Journals (Sweden)

    Novosljska-Dragovič Aleksandra

    2005-01-01

    Full Text Available Multiple alleles of gliadin-coding loci are well-known genetic markers of common wheat genotypes. Based on an analysis of gliadin patterns in common wheat cultivars developed at the Small Grains Research Center in Kragujevac, the dynamics of genetic variability at gliadin-coding loci have been surveyed for a period of 35 years. It was shown that long-term breeding of the wheat cultivars involved the gradual replacement of older alleles with alleles widely spread in some regions of the world, which belong to well-known cultivars that donated important traits. Developing cultivars whose pedigrees involved much new foreign genetic material has increased genetic diversity and changed the frequency of alleles at gliadin-coding loci, so we can conclude that the genetic profile of modern Serbian cultivars has changed considerably. A gliadin genetic formula was derived for each cultivar studied. The alleles of gliadin-coding loci most frequent among modern cultivars should be of great interest to breeders, because these alleles are probably linked with genes that confer an advantage on their carriers at present.

  12. Discovery of coding genetic variants influencing diabetes-related serum biomarkers and their impact on risk of type 2 diabetes

    DEFF Research Database (Denmark)

    Ahluwalia, Tarun Veer Singh; Allin, Kristine Højgaard; Sandholt, Camilla Helene

    2015-01-01

    CONTEXT: Type 2 diabetes (T2D) prevalence is spiraling globally, and knowledge of its pathophysiological signatures is crucial for a better understanding and treatment of the disease. OBJECTIVE: We aimed to discover underlying coding genetic variants influencing fasting serum levels of nine......-nucleotide polymorphisms and were tested for association with each biomarker. Identified loci were tested for association with T2D through a large-scale meta-analysis involving up to 17 024 T2D cases and up to 64 186 controls. RESULTS: We discovered 11 associations between single-nucleotide polymorphisms and five distinct......, of which the association with the CELSR2 locus has not been shown previously. CONCLUSION: The identified loci influence processes related to insulin signaling, cell communication, immune function, apoptosis, DNA repair, and oxidative stress, all of which could provide a rationale for novel diabetes...

  13. Numeral series hidden in the distribution of atomic mass of amino acids to codon domains in the genetic code.

    Science.gov (United States)

    Wohlin, Åsa

    2015-03-21

    The distribution of codons in the nearly universal genetic code is a long-discussed issue. At the atomic level, the numeral series 2x² (x = 5–0) lies behind electron shells and orbitals, and numeral series appear in formulas for the spectral lines of hydrogen. The question here was whether some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3, times 10², revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division into atom kinds, further with main 3rd-base groups, backbone chains, and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series may, in a dynamic way, have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x²-series times a factor of 16 appeared as a conceivable underlying level, both for the atomic mass and the charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent-2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
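    The two numeral series named in this abstract can be tabulated directly. The short sketch below is an illustration of my own, not code from the paper; in particular, the reading of the "5 to 0 with exponent 2/3 times 10²" series as x^(2/3) × 100 is an assumption.

    ```python
    # Illustrative only: tabulate the two numeral series mentioned in the
    # abstract. The paper's exact construction may differ.

    # 2x^2 for x = 5..0 -- the series behind electron shell capacities.
    shell_series = [2 * x**2 for x in range(5, -1, -1)]

    # One plausible reading (assumption) of "series 5 to 0 with exponent 2/3
    # times 10^2": x^(2/3) * 100 for x = 5..0.
    exp_series = [round(x ** (2 / 3) * 100, 2) for x in range(5, -1, -1)]

    print(shell_series)  # -> [50, 32, 18, 8, 2, 0]
    print(exp_series)
    ```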

  14. Aminotryptophan-containing barstar: structure-function tradeoff in protein design and engineering with an expanded genetic code.

    Science.gov (United States)

    Rubini, Marina; Lepthien, Sandra; Golbik, Ralph; Budisa, Nediljko

    2006-07-01

    The indole ring of the canonical amino acid tryptophan (Trp) possesses distinctive features, such as steric bulk, hydrophobicity, and a nitrogen atom capable of acting as a hydrogen bond donor. The introduction of an amino group into the indole moiety of Trp yields the structural analogs 4-aminotryptophan ((4-NH₂)Trp) and 5-aminotryptophan ((5-NH₂)Trp). Their hydrophobicity and spectral properties are substantially different from those of Trp. They resemble the purine bases of DNA and share their capacity for pH-sensitive intramolecular charge transfer. The Trp → aminotryptophan substitution in proteins during ribosomal translation is expected to yield protein variants that acquire these features. These expectations were fulfilled by incorporating (4-NH₂)Trp and (5-NH₂)Trp into barstar, an intracellular inhibitor of the ribonuclease barnase from Bacillus amyloliquefaciens. The crystal structure of (4-NH₂)Trp-barstar is similar to that of the parent protein, whereas its spectral and thermodynamic behavior is remarkably different. The Tm value of (4-NH₂)Trp- and (5-NH₂)Trp-barstar is lowered by about 20 degrees Celsius, and they exhibit strongly reduced unfolding cooperativity and a substantial loss of free energy of folding. Furthermore, a folding kinetics study of (4-NH₂)Trp-barstar revealed that the denatured state is even preferred over the native one. The combination of structural and thermodynamic analyses clearly shows how structures of substituted barstar display a typical structure-function tradeoff: the acquisition of unique pH-sensitive charge transfer as a novel function is achieved at the expense of protein stability. These findings provide new insight into the evolution of the amino acid repertoire of the universal genetic code and highlight possible problems in protein engineering and design with an expanded genetic code.

  15. Origin of an alternative genetic code in the extremely small and GC-rich genome of a bacterial symbiont.

    Directory of Open Access Journals (Sweden)

    John P McCutcheon

    2009-07-01

    Full Text Available The genetic code relates nucleotide sequence to amino acid sequence and is shared across all organisms, with the rare exceptions of lineages in which one or a few codons have acquired novel assignments. Recoding of UGA from stop to tryptophan has evolved independently in certain reduced bacterial genomes, including those of the mycoplasmas and some mitochondria. Small genomes typically exhibit low guanine plus cytosine (GC) content, and this bias in base composition has been proposed to drive UGA stop-to-tryptophan (Stop→Trp) recoding. Using a combination of genome sequencing and high-throughput proteomics, we show that an alpha-proteobacterial symbiont of cicadas has the unprecedented combination of an extremely small genome (144 kb), a GC-biased base composition (58.4%), and a coding reassignment of UGA Stop→Trp. Although it is not clear why this tiny genome lacks the low GC content typical of other small bacterial genomes, these observations support a role of genome reduction rather than base composition as a driver of codon reassignment.
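    The effect of a Stop→Trp reassignment can be sketched with a toy translator: recoding UGA from stop to tryptophan amounts to swapping a single codon-table entry. This is a minimal illustration using a tiny subset of the codon table, not the analysis pipeline of the study.

    ```python
    # Minimal sketch of UGA Stop->Trp recoding (illustrative codon subset only).
    STANDARD = {"AUG": "M", "GAA": "E", "UUU": "F", "UGG": "W", "UGA": "*"}

    def translate(rna, table):
        """Translate an RNA sequence codon by codon, halting at a stop codon."""
        protein = []
        for i in range(0, len(rna) - 2, 3):
            aa = table[rna[i:i + 3]]
            if aa == "*":        # stop codon terminates translation
                break
            protein.append(aa)
        return "".join(protein)

    recoded = dict(STANDARD, UGA="W")   # the single change: UGA now reads Trp

    rna = "AUGGAAUGAUUU"
    print(translate(rna, STANDARD))  # -> "ME"   (translation stops at UGA)
    print(translate(rna, recoded))   # -> "MEWF" (UGA read through as Trp)
    ```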

  16. Stochastic optimization of GeantV code by use of genetic algorithms

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter, and a geometrical modeler library for describing the detector, locating the particles, and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and massively parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching for the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massively parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in the case of resource-expensive or time-consuming evaluations of fitness functions, in order to speed up the convergence of the black-box optimization problem.
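    The kind of black-box tuning described above can be sketched with a toy genetic algorithm that touches the objective only through point-wise evaluations. This is an illustration under stated assumptions, not the GeantV tuning code: the `fitness` function here is a cheap stand-in for what would in practice be an expensive simulation-throughput measurement.

    ```python
    import random

    random.seed(1)

    def fitness(params):
        """Stand-in for an expensive simulation throughput measurement.
        Here: squared distance from a hypothetical optimum (lower is better)."""
        return sum((p - 0.7) ** 2 for p in params)

    def evolve(dim=4, pop_size=20, generations=30, mut_sigma=0.1):
        pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
        history = []
        for _ in range(generations):
            pop.sort(key=fitness)
            history.append(fitness(pop[0]))
            elite = pop[: pop_size // 4]      # elitism: keep the best quarter
            children = []
            while len(elite) + len(children) < pop_size:
                a, b = random.sample(elite, 2)
                # blend crossover plus Gaussian mutation
                child = [(x + y) / 2 + random.gauss(0, mut_sigma)
                         for x, y in zip(a, b)]
                children.append(child)
            pop = elite + children
        pop.sort(key=fitness)
        return pop[0], history

    best, history = evolve()
    print(history[0], "->", history[-1])  # best fitness improves, or at worst holds
    ```

    Because the elite survives unchanged each generation, the best fitness in `history` never worsens, which is the minimal guarantee one wants before spending real simulation time on evaluations.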

  17. Human growth hormone-related iatrogenic Creutzfeldt-Jakob disease: Search for a genetic susceptibility by analysis of the PRNP coding region

    Energy Technology Data Exchange (ETDEWEB)

    Jaegly, A.; Boussin, F.; Deslys, J.P. [CEA/CRSSA/DSV/DPTE, Fontenay-aux-Roses (France)] [and others]

    1995-05-20

    The human PRNP gene encoding PrP is located on chromosome 20 and consists of two exons and a single intron. The open reading frame is entirely fitted into the second exon. Genetic studies indicate that all of the familial and several sporadic forms of TSSEs are associated with mutations in the PRNP 759-bp coding region. Moreover, homozygosity at codon 129, a locus harboring a polymorphism among the general population, was proposed as a genetic susceptibility marker for both sporadic and iatrogenic CJD. To assess whether additional genetic predisposition markers exist in the PRNP gene, the authors sequenced the PRNP coding region of 17 of the 32 French patients who developed a hGH-related CJD.

  18. Partitioning of genetic variation between regulatory and coding gene segments: the predominance of software variation in genes encoding introvert proteins.

    Science.gov (United States)

    Mitchison, A

    1997-01-01

    In considering genetic variation in eukaryotes, a fundamental distinction can be made between variation in regulatory (software) and coding (hardware) gene segments. For quantitative traits the bulk of variation, particularly that near the population mean, appears to reside in regulatory segments. The main exceptions to this rule concern proteins which handle extrinsic substances, here termed extrovert proteins. The immune system includes an unusually large proportion of this exceptional category, but even so its chief source of variation may well be polymorphism in regulatory gene segments. The main evidence for this view emerges from genome scanning for quantitative trait loci (QTL), which in the case of the immune system points to a major contribution of pro-inflammatory cytokine genes. Further support comes from sequencing of major histocompatibility complex (Mhc) class II promoters, where a high level of polymorphism has been detected. These Mhc promoters appear to act, in part at least, by gating the back-signal from T cells into antigen-presenting cells. Both these forms of polymorphism are likely to be sustained by the need for flexibility in the immune response. Future work on promoter polymorphism is likely to benefit from the input from genome informatics.

  19. Role of horizontal gene transfer as a control on the coevolution of ribosomal proteins and the genetic code

    Energy Technology Data Exchange (ETDEWEB)

    Woese, Carl R.; Goldenfeld, Nigel; Luthey-Schulten, Zaida

    2011-03-31

    Our main goal is to develop the conceptual and computational tools necessary to understand the evolution of the universal processes of translation and replication and to identify events of horizontal gene transfer that occurred within the components. We will attempt to uncover the major evolutionary transitions that accompanied the development of protein synthesis by the ribosome and associated components of the translation apparatus. Our project goes beyond standard genomic approaches to explore homologs that are represented at both the structure and sequence level. Accordingly, use of structural phylogenetic analysis allows us to probe further back into deep evolutionary time than competing approaches, permitting greater resolution of primitive folds and structures. Specifically, our work focuses on the elements of translation, ranging from the emergence of the canonical genetic code to the evolution of specific protein folds, mediated by the predominance of horizontal gene transfer in early life. A unique element of this study is the explicit accounting for the impact of phenotype selection on translation, through a coevolutionary control mechanism. Our work contributes to DOE mission objectives through: (1) sophisticated computer simulation of protein dynamics and evolution, and the further refinement of techniques for structural phylogeny, which complement sequence information, leading to improved annotation of genomic databases; (2) development of evolutionary approaches to exploring cellular function and machinery in an integrated way; and (3) documentation of the phenotype interaction with translation over evolutionary time, reflecting the system response to changing selection pressures through horizontal gene transfer.

  20. Genetic Recombination Between Stromal and Cancer Cells Results in Highly Malignant Cells Identified by Color-Coded Imaging in a Mouse Lymphoma Model.

    Science.gov (United States)

    Nakamura, Miki; Suetsugu, Atsushi; Hasegawa, Kousuke; Matsumoto, Takuro; Aoki, Hitomi; Kunisada, Takahiro; Shimizu, Masahito; Saji, Shigetoyo; Moriwaki, Hisataka; Hoffman, Robert M

    2017-12-01

    The tumor microenvironment (TME) promotes tumor growth and metastasis. We previously established the color-coded EL4 lymphoma TME model with red fluorescent protein (RFP)-expressing EL4 cells implanted in transgenic C57BL/6 green fluorescent protein (GFP) mice. Color-coded imaging of the lymphoma TME suggested an important role of stromal cells in lymphoma progression and metastasis. In the present study, we used color-coded imaging of RFP lymphoma cells and GFP stromal cells to identify yellow-fluorescent genetically recombinant cells appearing only during metastasis. The EL4-RFP lymphoma cells were injected subcutaneously in C57BL/6-GFP transgenic mice and formed subcutaneous tumors 14 days after cell transplantation. The subcutaneous tumors were harvested and transplanted to the abdominal cavity of nude mice. Metastases to the liver, perigastric lymph node, ascites, bone marrow, and primary tumor were imaged. In addition to EL4-RFP cells and GFP host cells, genetically recombinant yellow-fluorescent cells were observed only in the ascites and bone marrow. These results indicate genetic exchange between the stromal and cancer cells. Possible mechanisms of genetic exchange are discussed, as well as its ramifications for metastasis. J. Cell. Biochem. 118: 4216-4221, 2017. © 2017 Wiley Periodicals, Inc.

  1. Cameroonian fruit bats harbor divergent viruses, including rotavirus H, bastroviruses, and picobirnaviruses using an alternative genetic code.

    Science.gov (United States)

    Yinda, Claude Kwe; Ghogomu, Stephen Mbigha; Conceição-Neto, Nádia; Beller, Leen; Deboutte, Ward; Vanhulle, Emiel; Maes, Piet; Van Ranst, Marc; Matthijnssens, Jelle

    2018-01-01

    Most human emerging infectious diseases originate from wildlife, and bats are a major reservoir of viruses, a few of which have been highly pathogenic to humans. In some regions of Cameroon, bats are hunted and eaten as a delicacy. This close proximity between humans and bats provides ample opportunity for zoonotic events. To elucidate the viral diversity of Cameroonian fruit bats, we collected and metagenomically screened eighty-seven fecal samples of Eidolon helvum and Epomophorus gambianus fruit bats. The results showed a plethora of known and novel viruses. Phylogenetic analyses of the eleven gene segments of the first complete bat rotavirus H genome showed clearly separated clusters of human, porcine, and bat rotavirus H strains, not indicating any recent interspecies transmission events. Additionally, we identified and analyzed a bat bastrovirus genome (bastroviruses are a novel group of recently described viruses related to astroviruses and hepatitis E viruses), confirming their recombinant nature, and provide further evidence of additional recombination events among bat bastroviruses. Interestingly, picobirnavirus-like RNA-dependent RNA polymerase gene segments were identified using an alternative mitochondrial genetic code, and further principal component analyses suggested that they may have a similar lifestyle to mitoviruses, a group of virus-like elements known to infect the mitochondria of fungi. Although the identified bat coronavirus, parvovirus, and cyclovirus strains belong to established genera, most of the identified partitiviruses and densoviruses constitute putative novel genera in their respective families. Finally, the results of the phage community analyses of these bats indicate a very diverse, geographically distinct bat phage population, probably reflecting different diets and gut bacterial ecosystems.

  2. An efficient genetic algorithm for structural RNA pairwise alignment and its application to non-coding RNA discovery in yeast

    Directory of Open Access Journals (Sweden)

    Taneda Akito

    2008-12-01

    Full Text Available Abstract Background Aligning RNA sequences with low sequence identity has been a challenging problem, since such a computation essentially needs an algorithm of high complexity to take structural conservation into account. Although many sophisticated algorithms for this purpose have been proposed to date, further improvement in efficiency is necessary to accelerate large-scale applications, including non-coding RNA (ncRNA) discovery. Results We developed a new genetic algorithm, Cofolga2, for simultaneously computing pairwise RNA sequence alignment and consensus folding, and benchmarked it using BRAliBase 2.1. The benchmark results showed that our new algorithm is accurate and efficient in both time and memory usage. Then, combining it with an originally trained SVM, we applied the new algorithm to novel ncRNA discovery, where we compared the S. cerevisiae genome with six related genomes in a pairwise manner. By focusing our search on the relatively short regions (50 bp to 2,000 bp) sandwiched by conserved sequences, we successfully predicted 714 intergenic and 1,311 sense or antisense ncRNA candidates, which were found in the pairwise alignments with stable consensus secondary structure and low sequence identity (≤ 50%). By comparing with previous predictions, we found that > 92% of the candidates are novel. The estimated rate of false positives in the predicted candidates is 51%. Twenty-five percent of the intergenic candidates have support for expression in the cell, i.e., their genomic positions overlap those of experimentally determined transcripts in the literature. By manual inspection of the results, moreover, we obtained four multiple alignments with low sequence identity which reveal consensus structures shared by three species/sequences. Conclusion The present method gives an efficient tool complementary to sequence-alignment-based ncRNA finders.

  3. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

    The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem level analyses and models reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  4. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  5. Instance-based Policy Learning by Real-coded Genetic Algorithms and Its Application to Control of Nonholonomic Systems

    Science.gov (United States)

    Miyamae, Atsushi; Sakuma, Jun; Ono, Isao; Kobayashi, Shigenobu

    The stabilization control of nonholonomic systems has been extensively studied because it is essential for nonholonomic robot control problems. The difficulty in this problem is that the theoretical derivation of a control policy is not necessarily guaranteed to be achievable. In this paper, we present a reinforcement learning (RL) method with an instance-based policy (IBP) representation, in which control policies for this class are optimized with respect to user-defined cost functions. Direct policy search (DPS) is an approach to RL: the policy is represented by parametric models, and the model parameters are searched directly by optimization techniques, including genetic algorithms (GAs). In the IBP representation, an instance consists of a state-action pair, and a policy consists of a set of instances. Several DPSs with IBP have been proposed previously; these methods sometimes fail to obtain optimal control policies when the state-action variables are continuous. In this paper, we present a real-coded GA for DPSs with IBP, specifically designed for continuous domains. Optimization of an IBP presents three difficulties: high dimensionality, epistasis, and multi-modality, and our solution is designed to overcome all three. Policy search with the IBP representation appears to be a high-dimensional optimization problem; however, the instances that can improve the fitness are often limited to active instances (instances used in the evaluation), and in fact the number of active instances is small. Therefore, we treat the search problem as a low-dimensional one by restricting the search variables to active instances only. It is commonly known that functions with epistasis can be efficiently optimized with crossovers that satisfy the inheritance of statistics. For efficient search of an IBP, we propose extended crossover-like mutation (extended XLM), which generates a new instance around an existing instance while satisfying the inheritance of statistics.
For overcoming multi-modality, we
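    The active-instance idea can be sketched in a few lines. This is my own minimal illustration, not the paper's representation or operators: the policy answers a query state with the action of its nearest stored instance, so only the instances actually selected during a rollout need to enter the search, keeping the effective dimensionality low.

    ```python
    import math

    class InstancePolicy:
        """Instance-based policy: each instance is a (state, action) pair, and a
        query state receives the action of the nearest stored instance."""
        def __init__(self, instances):
            self.instances = instances          # list of ((x, y), action)

        def act(self, state):
            # index of the nearest instance (Euclidean distance in state space)
            i = min(range(len(self.instances)),
                    key=lambda k: math.dist(self.instances[k][0], state))
            return i, self.instances[i][1]

    def active_instances(policy, rollout_states):
        """Indices of instances actually used on a rollout: only these are
        worth mutating, since the others never influence the evaluation."""
        return {policy.act(s)[0] for s in rollout_states}

    policy = InstancePolicy([((0, 0), "left"),
                             ((10, 10), "right"),
                             ((100, 100), "up")])
    rollout = [(1, 1), (2, 0), (9, 11)]
    print(policy.act((1, 1))[1])              # -> "left"
    print(active_instances(policy, rollout))  # instance 2 is never active
    ```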

  6. The Future of Genetics in Psychology and Psychiatry: Microarrays, Genome-Wide Association, and Non-Coding RNA

    Science.gov (United States)

    Plomin, Robert; Davis, Oliver S. P.

    2009-01-01

    Background: Much of what we thought we knew about genetics needs to be modified in light of recent discoveries. What are the implications of these advances for identifying genes responsible for the high heritability of many behavioural disorders and dimensions in childhood? Methods: Although quantitative genetics such as twin studies will continue…

  7. System level ESD co-design

    CERN Document Server

    Gossner, Harald

    2015-01-01

    An effective and cost-efficient protection of electronic systems against ESD stress pulses specified by IEC 61000-4-2 is paramount for any system design. This pioneering book presents the collective knowledge of system designers and system testing experts and state-of-the-art techniques for achieving efficient system-level ESD protection with minimum impact on system performance. All categories of system failures, ranging from ‘hard’ to ‘soft’ types, are considered in reviewing the simulation and tool applications that can be used. The principal focus of System Level ESD Co-Design is defining and establishing the importance of co-design efforts from both the IC supplier and system builder perspectives. ESD designers often face challenges in meeting customers' system-level ESD requirements and, therefore, a clear understanding of the techniques presented here will facilitate effective simulation approaches leading to better solutions without compromising system performance. With contributions from Robert Asht...

  8. Genic non-coding microsatellites in the rice genome: characterization, marker design and use in assessing genetic and evolutionary relationships among domesticated groups

    Directory of Open Access Journals (Sweden)

    Singh Nagendra

    2009-03-01

    Full Text Available Abstract Background Completely sequenced plant genomes provide scope for designing a large number of microsatellite markers, which are useful in various aspects of crop breeding and genetic analysis. With the objective of developing genic but non-coding microsatellite (GNMS) markers for the rice (Oryza sativa L.) genome, we characterized the frequency and relative distribution of microsatellite repeat-motifs in 18,935 predicted protein-coding genes, including 14,308 putative promoter sequences. Results We identified 19,555 perfect GNMS repeats, with densities ranging from 306.7/Mb in chromosome 1 to 450/Mb in chromosome 12 and an average of 357.5 GNMS per Mb. The average microsatellite density was maximum in the 5' untranslated regions (UTRs), followed by those in introns, promoters, and 3'UTRs, and minimum in the coding sequences (CDS). Primers were designed for 17,966 (92%) GNMS repeats, including 4,288 (94%) hypervariable class I types, which were bin-mapped on the rice genome. The GNMS markers were most polymorphic in the intronic region (73.3%), followed by markers in the promoter region (53.3%), and least polymorphic in the CDS (26.6%). The robust polymerase chain reaction (PCR) amplification efficiency and high polymorphic potential of GNMS markers over genic coding and random genomic microsatellite markers suggest their immediate use in efficient genotyping applications in rice. A set of these markers could assess genetic diversity and establish phylogenetic relationships among domesticated rice cultivar groups. We also demonstrated the usefulness of orthologous and paralogous conserved non-coding microsatellite (CNMS) markers, identified in the putative rice promoter sequences, for comparative physical mapping and understanding of evolutionary and gene regulatory complexities among rice and other members of the grass family. The divergence between long-grained aromatics and subspecies japonica was estimated to be more recent (0.004 Mya compared to short
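    The repeat-mining step behind such marker surveys can be sketched with a regular-expression scan. This toy scanner is an illustration only, not the pipeline used in the study: it finds perfect dinucleotide repeats such as (CT)n and (GA)n and expresses their count as a per-megabase density, as in the abstract.

    ```python
    import re

    def find_ssrs(seq, motifs=("CT", "GA"), min_repeats=5):
        """Scan a DNA sequence for perfect dinucleotide microsatellites.
        Returns (motif, start, repeat_count) tuples."""
        hits = []
        for motif in motifs:
            pattern = "(?:%s){%d,}" % (motif, min_repeats)
            for m in re.finditer(pattern, seq):
                hits.append((motif, m.start(), len(m.group()) // len(motif)))
        return hits

    def density_per_mb(hits, seq_len):
        """Repeat density expressed per megabase."""
        return len(hits) / (seq_len / 1e6)

    seq = "AAA" + "CT" * 6 + "GGG" + "GA" * 5 + "TT"
    print(find_ssrs(seq))  # -> [('CT', 3, 6), ('GA', 18, 5)]
    ```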

  9. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we

  10. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  11. Features, Events, and Processes: System Level

    International Nuclear Information System (INIS)

    D. McGregor

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760)

  12. Systems-Level Synthetic Biology for Advanced Biofuel Production

    Energy Technology Data Exchange (ETDEWEB)

    Ruffing, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Travis J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strickland, Lucas Marshall [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tallant, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  13. Genetics

    International Nuclear Information System (INIS)

    Hubitschek, H.E.

    1975-01-01

    Progress is reported on the following research projects: genetic effects of high LET radiations; genetic regulation, alteration, and repair; chromosome replication and the division cycle of Escherichia coli; effects of radioisotope decay in the DNA of microorganisms; initiation and termination of DNA replication in Bacillus subtilis; mutagenesis in mouse myeloma cells; lethal and mutagenic effects of near-uv radiation; effect of 8-methoxypsoralen on photodynamic lethality and mutagenicity in Escherichia coli; DNA repair of the lethal effects of far-uv; and near-uv irradiation of bacterial cells.

  14. Assessment of genetic mutations in the XRCC2 coding region by high resolution melting curve analysis and the risk of differentiated thyroid carcinoma in Iran

    Directory of Open Access Journals (Sweden)

    Shima Fayaz

    2012-01-01

    Full Text Available Homologous recombination (HR) is the major pathway for repairing double strand breaks (DSBs) in eukaryotes, and XRCC2 is an essential component of the HR repair machinery. To evaluate the potential role of mutations in gene repair by HR in individuals susceptible to differentiated thyroid carcinoma (DTC), we used high resolution melting (HRM) analysis, a recently introduced method for detecting mutations, to examine the entire XRCC2 coding region in an Iranian population. HRM analysis was used to screen for mutations in three XRCC2 coding regions in 50 patients and 50 controls. There was no variation in the HRM curves obtained from the analysis of exons 1 and 2 in the case and control groups. In exon 3, an Arg188His polymorphism (rs3218536) was detected as a new melting curve group (OR: 1.46; 95%CI: 0.432-4.969; p = 0.38) compared with the normal melting curve. We also found a new Ser150Arg polymorphism in exon 3 of the control group. These findings suggest that genetic variations in the XRCC2 coding region have no potential effects on susceptibility to DTC. However, further studies with larger populations are required to confirm this conclusion.
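    The association statistic reported above (OR with a Wald confidence interval) can be computed directly from a 2x2 carrier table. The counts below are illustrative values chosen so that they reproduce the reported OR of 1.46 (95% CI 0.432-4.969); they are not asserted to be the paper's actual table.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table
    [[a, b], [c, d]] = [[case carriers, case non-carriers],
                        [control carriers, control non-carriers]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts: 7/50 carriers among cases, 5/50 among controls
or_, lo, hi = odds_ratio_ci(7, 43, 5, 45)
```

    With these counts the function returns an OR of about 1.465 with CI (0.432, 4.970), matching the abstract's figures.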

  15. Genetics

    DEFF Research Database (Denmark)

    Christensen, Kaare; McGue, Matt

    2016-01-01

    The sequenced genomes of individuals aged ≥80 years, who were highly educated, self-referred volunteers and with no self-reported chronic diseases were compared to young controls. In these data, healthy ageing is a distinct phenotype from exceptional longevity and genetic factors that protect...

  16. Use of PRIM code to analyze potential radiation-induced genetic and somatic effects to man from Jackpile-Paguate mines

    International Nuclear Information System (INIS)

    Momeni, M.H.

    1983-01-01

    Potential radiation-induced effects from inhalation and ingestion of, and external exposure to, radioactive materials at the Jackpile-Paguate uranium mine complex near Paguate, New Mexico, were analyzed. The Uranium Dispersion and Dosimetry (UDAD) computer code developed at Argonne National Laboratory was used to calculate the dose rates and the time-integrated doses to tissues at risk as a function of age and time for the population within 80 km of the mines. The ANL computer code Potential Radiation-Induced Biological Effects on Man (PRIM) then was used to calculate the potential radiation-induced somatic and genetic effects among the same population on the basis of absolute and relative risk models as a function of duration of exposure and age at time of exposure. The analyses were based on the recommendations in BEIR II and WASH-1400 and the lifetable method. The death rates were calculated for radiation exposure from the mines and for naturally induced effects for 19 age cohorts, 20 time intervals, and for each sex. The results indicated that under present conditions of the radiation environment at the mines, the number of potential fatal radiation-induced neoplasms that could occur among the regional population over the next 85 years would be 95 using the absolute risk model, and 243 using the relative risk model. Over the same period, there would be less than two radiation-induced genetic effects (dominant and multifactorials). After decommissioning of the mine site, these risks would decrease to less than 1 and less than 3 potential radiation-induced deaths under the relative and absolute risk models, respectively, and 0.001 genetic disorders. Because of various sources of error, the uncertainty in these predicted risks could be a factor of five.
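    The two risk models differ in how excess cases scale: the absolute model scales with collective dose alone, while the relative model multiplies the baseline cancer burden. A minimal sketch, with purely hypothetical coefficients (the BEIR-style values used by PRIM are not given in the abstract):

```python
def excess_cases_absolute(collective_dose_person_sv, cases_per_person_sv):
    """Absolute-risk model: excess cases proportional to collective dose."""
    return cases_per_person_sv * collective_dose_person_sv

def excess_cases_relative(baseline_cases, dose_sv, excess_relative_risk_per_sv):
    """Relative-risk model: excess cases proportional to the baseline rate."""
    return baseline_cases * excess_relative_risk_per_sv * dose_sv
```

    Because the relative model amplifies whatever baseline incidence the lifetable supplies, it generally yields the larger estimate, consistent with 243 versus 95 projected neoplasms above.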

  17. The Poitiers School of Mathematical and Theoretical Biology: Besson-Gavaudan-Schützenberger's Conjectures on Genetic Code and RNA Structures.

    Science.gov (United States)

    Demongeot, J; Hazgui, H

    2016-12-01

    The French school of theoretical biology was mainly initiated in Poitiers during the sixties by scientists like J. Besson, G. Bouligand, P. Gavaudan, M. P. Schützenberger and R. Thom, launching many new research domains on the fractal dimension, the combinatorial properties of the genetic code and related amino-acids, as well as on the genetic regulation of biological processes. Presently, biological science knows that RNA molecules are often involved in the regulation of complex genetic networks as effectors, e.g., activators (small RNAs as transcription factors), inhibitors (micro-RNAs) or hybrids (circular RNAs). Examples of such networks will be given, showing that (1) there exist RNA "relics" that played an important role during evolution and have survived in many genomes, the probability distribution of whose sub-sequences is quantified by the Shannon entropy, and (2) the robustness of the dynamics of the networks they regulate can be characterized by the Kolmogorov-Sinaï dynamic entropy and attractor entropy.
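    The Shannon entropy used to quantify sub-sequence distributions can be sketched as a generic k-mer entropy; this is a standard estimator, not necessarily the authors' exact one.

```python
import math
from collections import Counter

def shannon_entropy(seq, k=1):
    """Shannon entropy (bits) of the k-mer distribution of a sequence."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    n = len(kmers)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

    A sequence using all four bases uniformly attains the 2-bit maximum for k = 1, while a homopolymer scores 0; conserved RNA relics fall somewhere in between.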

  18. Accelerating next generation sequencing data analysis with system level optimizations.

    Science.gov (United States)

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

    Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system level parameters before running the application. We studied GATK-HaplotypeCaller, which is part of common NGS workflows and consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked and the execution time of HaplotypeCaller was optimized by various system level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the CPU frequency governor from the default 'on-demand' mode to 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
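    For intuition, a "reduced by X%" figure translates into a speedup factor of 1/(1 - X/100); the 82.66% reduction for GATK 3.3, for example, corresponds to roughly a 5.8x speedup of HaplotypeCaller:

```python
def speedup_from_reduction(pct_reduction):
    """Convert an 'execution time reduced by X%' figure into a speedup factor."""
    return 1.0 / (1.0 - pct_reduction / 100.0)
```

    A 50% reduction is a 2x speedup; the 42.61% reduction for GATK 3.7 is about 1.74x.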

  19. Social Welfare Improvement by TCSC using Real Code Based Genetic Algorithm in Double-Sided Auction Market

    Directory of Open Access Journals (Sweden)

    MASOUM, M. A. S.

    2011-05-01

    Full Text Available This paper presents a genetic algorithm (GA) to maximize total system social welfare and alleviate congestion by best placement and sizing of a TCSC device in a double-sided auction market. To introduce more accurate modeling, the valve loading effects are incorporated into the conventional quadratic smooth generator cost curves. By adding the valve point effect, the model presents nondifferentiable and nonconvex regions that challenge most gradient-based optimization algorithms. In addition, quadratic consumer benefit functions are integrated into the objective function to guarantee that the locational marginal prices charged at the demand buses are less than or equal to the DisCos' benefit earned by selling that power to retail customers. The proposed approach makes use of the genetic algorithm to optimally schedule GenCos and DisCos and to select the TCSC location and size, while the Newton-Raphson algorithm minimizes the mismatch of the power flow equations. Simulation results on the modified IEEE 14-bus and 30-bus test systems (with/without line flow constraints, before and after the compensation) are used to examine the impact of TCSC on the total system social welfare improvement. Several cases are considered to test and validate the consistency of detecting best solutions. Simulation results are compared to solutions obtained by sequential quadratic programming (SQP) approaches.
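    A minimal real-coded GA on a valve-point cost curve illustrates why a gradient-free method is used here: the |e*sin(f*(Pmin - P))| ripple makes the cost nondifferentiable. This is a generic sketch; all coefficients and GA settings below are illustrative, not the paper's generator data or algorithm parameters.

```python
import math
import random

def valve_point_cost(p, a=0.0, b=2.0, c=0.001, e=300.0, f=0.2, p_min=100.0):
    """Quadratic fuel cost plus the nondifferentiable valve-point ripple term."""
    return a + b * p + c * p * p + abs(e * math.sin(f * (p_min - p)))

def rcga_minimize(fn, lo, hi, pop=40, gens=200, seed=1):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, with children clipped to the [lo, hi] bounds."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            p1 = min(rng.sample(xs, 3), key=fn)   # tournament winner 1
            p2 = min(rng.sample(xs, 3), key=fn)   # tournament winner 2
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2  # blend crossover
            if rng.random() < 0.2:                 # Gaussian mutation
                child += rng.gauss(0, 0.05 * (hi - lo))
            nxt.append(min(max(child, lo), hi))
        xs = nxt
    return min(xs, key=fn)

best = rcga_minimize(valve_point_cost, 100.0, 500.0)
```

    In the paper the GA would search over TCSC placement/size and dispatch jointly, with Newton-Raphson enforcing the power flow equations for each candidate.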

  20. A method to optimize the shield compact and lightweight combining the structure with components together by genetic algorithm and MCNP code.

    Science.gov (United States)

    Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao

    2018-05-17

    To make the shield for neutrons and gamma rays compact and lightweight, a method combining the structure and components together was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235U, which mixes neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.
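    With the spherical geometry adopted above, candidate materials can be compared by the mass of the shell needed for a given thickness. A sketch of that bookkeeping, with illustrative radii and density (not one of the paper's six materials):

```python
import math

def shell_mass_kg(r_inner_cm, thickness_cm, density_g_cm3):
    """Mass of a spherical shield shell of given inner radius and thickness."""
    r_outer = r_inner_cm + thickness_cm
    volume_cm3 = 4.0 / 3.0 * math.pi * (r_outer**3 - r_inner_cm**3)
    return volume_cm3 * density_g_cm3 / 1000.0  # g -> kg
```

    For equal attenuation, a denser material may still win on weight if it needs a much thinner shell, which is the trade-off the GA/MCNP loop explores.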

  1. An RNA Phage Lab: MS2 in Walter Fiers' laboratory of molecular biology in Ghent, from genetic code to gene and genome, 1963-1976.

    Science.gov (United States)

    Pierrel, Jérôme

    2012-01-01

    The importance of viruses as model organisms is well-established in molecular biology and Max Delbrück's phage group set standards in the DNA phage field. In this paper, I argue that RNA phages, discovered in the 1960s, were also instrumental in the making of molecular biology. As part of experimental systems, RNA phages stood for messenger RNA (mRNA), genes and genome. RNA was thought to mediate information transfers between DNA and proteins. Furthermore, RNA was more manageable at the bench than DNA due to the availability of specific RNases, enzymes used as chemical tools to analyse RNA. Finally, RNA phages provided scientists with a pure source of mRNA to investigate the genetic code, genes and even a genome sequence. This paper focuses on Walter Fiers' laboratory at Ghent University (Belgium) and their work on the RNA phage MS2. When setting up his Laboratory of Molecular Biology, Fiers planned a comprehensive study of the virus with a strong emphasis on the issue of structure. In his lab, RNA sequencing, now a little-known technique, evolved gradually from a means to solve the genetic code, to a tool for completing the first genome sequence. Thus, I follow the research pathway of Fiers and his 'RNA phage lab' with their evolving experimental system from 1960 to the late 1970s. This study illuminates two decisive shifts in post-war biology: the emergence of molecular biology as a discipline in the 1960s in Europe and of genomics in the 1990s.

  2. System Level Analysis of LTE-Advanced

    DEFF Research Database (Denmark)

    Wang, Yuanye

    This PhD thesis focuses on system level analysis of Multi-Component Carrier (CC) management for Long Term Evolution (LTE)-Advanced. Cases where multiple CCs are aggregated to form a larger bandwidth are studied. The analysis is performed for both local area and wide area networks. In local area...... reduction. Compared to the case of reuse-1, they achieve a gain of 50∼500% in cell edge user throughput, with small or no loss in average cell throughput. For the wide area network, effort is devoted to the downlink of LTE-Advanced. Such a system is assumed to be backwards compatible to LTE release 8, i...... scheme is recommended. It reduces the CQI by 94% at low load, and 79∼93% at medium to high load, with reasonable loss in downlink performance. To reduce the ACK/NACK feedback, multiple ACK/NACKs can be bundled, with slightly degraded downlink throughput....

  3. An expanded genetic code for probing the role of electrostatics in enzyme catalysis by vibrational Stark spectroscopy.

    Science.gov (United States)

    Völler, Jan-Stefan; Biava, Hernan; Hildebrandt, Peter; Budisa, Nediljko

    2017-11-01

    Finding experimental validation for the electrostatic interactions essential to catalytic reactions represents a challenge due to practical limitations in assessing electric fields within protein structures. This review examines the applications of non-canonical amino acids (ncAAs) as genetically encoded probes for studying the role of electrostatic interactions in enzyme catalysis. ncAAs constitute sensitive spectroscopic probes that detect local electric fields by exploiting the vibrational Stark effect (VSE) and thus have the potential to map protein electrostatics. Mapping the electrostatics in proteins will improve our understanding of natural catalytic processes and, beyond that, will be helpful for biocatalyst engineering. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments". Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
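    The VSE readout is linear: an observed band shift divided by the probe's Stark tuning rate gives the projected electric-field change. The sketch below assumes a nitrile-like tuning rate of about 0.7 cm^-1/(MV/cm), a representative literature value rather than one taken from this review.

```python
def field_change_MV_per_cm(delta_nu_cm1, stark_tuning_cm1_per_MVcm):
    """Linear VSE: delta_nu = -|dmu| * F_proj, so the projected field change
    is F_proj = -delta_nu / |dmu| (tuning rate in cm^-1 per MV/cm)."""
    return -delta_nu_cm1 / stark_tuning_cm1_per_MVcm

# A red shift of 1.4 cm^-1 with a 0.7 cm^-1/(MV/cm) probe implies ~+2 MV/cm.
f_proj = field_change_MV_per_cm(-1.4, 0.7)
```

    Fields of a few MV/cm are the magnitude typically discussed for enzyme active sites, which is why shifts of order 1 cm^-1 are resolvable with these probes.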

  4. A Real-Coded Genetic Algorithm with System Reduction and Restoration for Rapid and Reliable Power Flow Solution of Power Systems

    Directory of Open Access Journals (Sweden)

    Hassan Abdullah Kubba

    2015-05-01

    Full Text Available The paper presents a highly accurate power flow solution, reducing the possibility of ending at local minima, by using a Real-Coded Genetic Algorithm (RCGA) with system reduction and restoration. The proposed method is modified to reduce the total computing time by reducing the system in size to the generator buses, which, for any realistic system, will be smaller in number, while the load buses are eliminated. The power flow problem is then solved for the generator buses only by the real-coded GA to calculate the voltage phase angles, the voltage magnitudes being specified, which reduces the computation time of the solution. The system is then restored by calculating the voltages of the load buses in terms of the calculated generator-bus voltages, after deriving equations for the load-busbar voltages. The proposed method was demonstrated on the 14-bus IEEE test system and the practical 362-busbar IRAQI NATIONAL GRID (ING). The proposed method has reliable convergence, a highly accurate solution, and less computing time for on-line applications. The method can conveniently be applied for on-line analysis and planning studies of large power systems.
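    The reduction-and-restoration idea (eliminate load buses, solve only for generator buses, then recover the load-bus voltages) resembles Kron reduction of the bus admittance matrix. A single-bus elimination sketch on a toy network, not the authors' exact formulation:

```python
def kron_reduce_one(Y, load):
    """Eliminate one zero-injection (load) bus from admittance matrix Y:
    Y_red[i][j] = Y[i][j] - Y[i][load] * Y[load][j] / Y[load][load]."""
    keep = [i for i in range(len(Y)) if i != load]
    return [[Y[i][j] - Y[i][load] * Y[load][j] / Y[load][load] for j in keep]
            for i in keep]

# Two generator buses (0, 1) tied through load bus 2 by unit-admittance lines.
Y = [[1.0, 0.0, -1.0],
     [0.0, 1.0, -1.0],
     [-1.0, -1.0, 2.0]]
Y_red = kron_reduce_one(Y, 2)  # 2x2 matrix over the generator buses only
```

    The reduced matrix correctly shows the two unit admittances in series (0.5 between the generator buses); repeating the elimination over all load buses yields the generator-only system the RCGA searches.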

  5. Genetic Predictions of Prion Disease Susceptibility in Carnivore Species Based on Variability of the Prion Gene Coding Region

    Science.gov (United States)

    Stewart, Paula; Campbell, Lauren; Skogtvedt, Susan; Griffin, Karen A.; Arnemo, Jon M.; Tryland, Morten; Girling, Simon; Miller, Michael W.; Tranulis, Michael A.; Goldmann, Wilfred

    2012-01-01

    Mammalian species vary widely in their apparent susceptibility to prion diseases. For example, several felid species developed prion disease (feline spongiform encephalopathy or FSE) during the bovine spongiform encephalopathy (BSE) epidemic in the United Kingdom, whereas no canine BSE cases were detected. Whether either of these or other groups of carnivore species can contract other prion diseases (e.g. chronic wasting disease or CWD) remains an open question. Variation in the host-encoded prion protein (PrPC) largely explains observed disease susceptibility patterns within ruminant species, and may explain interspecies differences in susceptibility as well. We sequenced and compared the open reading frame of the PRNP gene encoding PrPC protein from 609 animal samples comprising 29 species from 22 genera of the Order Carnivora; amongst these samples were 15 FSE cases. Our analysis revealed that FSE cases did not encode an identifiable disease-associated PrP polymorphism. However, all canid PrPs contained aspartic acid or glutamic acid at codon 163 which we propose provides a genetic basis for observed susceptibility differences between canids and felids. Among other carnivores studied, wolverine (Gulo gulo) and pine marten (Martes martes) were the only non-canid species to also express PrP-Asp163, which may impact on their prion diseases susceptibility. Populations of black bear (Ursus americanus) and mountain lion (Puma concolor) from Colorado showed little genetic variation in the PrP protein and no variants likely to be highly resistant to prions in general, suggesting that strain differences between BSE and CWD prions also may contribute to the limited apparent host range of the latter. PMID:23236380

  6. Genetic predictions of prion disease susceptibility in carnivore species based on variability of the prion gene coding region.

    Directory of Open Access Journals (Sweden)

    Paula Stewart

    Full Text Available Mammalian species vary widely in their apparent susceptibility to prion diseases. For example, several felid species developed prion disease (feline spongiform encephalopathy or FSE) during the bovine spongiform encephalopathy (BSE) epidemic in the United Kingdom, whereas no canine BSE cases were detected. Whether either of these or other groups of carnivore species can contract other prion diseases (e.g. chronic wasting disease or CWD) remains an open question. Variation in the host-encoded prion protein (PrP(C)) largely explains observed disease susceptibility patterns within ruminant species, and may explain interspecies differences in susceptibility as well. We sequenced and compared the open reading frame of the PRNP gene encoding PrP(C) protein from 609 animal samples comprising 29 species from 22 genera of the Order Carnivora; amongst these samples were 15 FSE cases. Our analysis revealed that FSE cases did not encode an identifiable disease-associated PrP polymorphism. However, all canid PrPs contained aspartic acid or glutamic acid at codon 163, which we propose provides a genetic basis for observed susceptibility differences between canids and felids. Among other carnivores studied, wolverine (Gulo gulo) and pine marten (Martes martes) were the only non-canid species to also express PrP-Asp163, which may impact on their prion disease susceptibility. Populations of black bear (Ursus americanus) and mountain lion (Puma concolor) from Colorado showed little genetic variation in the PrP protein and no variants likely to be highly resistant to prions in general, suggesting that strain differences between BSE and CWD prions also may contribute to the limited apparent host range of the latter.

  7. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding

    Directory of Open Access Journals (Sweden)

    Charlotte D’Hulst

    2016-07-01

    Full Text Available Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion, and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these "MouSensors." In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction.

  8. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding.

    Science.gov (United States)

    D'Hulst, Charlotte; Mina, Raena B; Gershon, Zachary; Jamet, Sophie; Cerullo, Antonio; Tomoiaga, Delia; Bai, Li; Belluscio, Leonardo; Rogers, Matthew E; Sirotin, Yevgeniy; Feinstein, Paul

    2016-07-26

    Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these "MouSensors." In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient behavior; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flowrate in parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety-rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  10. All about Genetics (For Parents)

    Science.gov (United States)


  11. Improved Transient Performance of a Fuzzy Modified Model Reference Adaptive Controller for an Interacting Coupled Tank System Using Real-Coded Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Asan Mohideen Khansadurai

    2014-01-01

    Full Text Available The main objective of the paper is to design a model reference adaptive controller (MRAC) with improved transient performance. A modification of the standard direct MRAC, called fuzzy modified MRAC (FMRAC), is used in the paper. The FMRAC uses a proportional control based Mamdani-type fuzzy logic controller (MFLC) to improve the transient performance of a direct MRAC. The paper proposes the application of a real-coded genetic algorithm (RGA) to tune the membership function parameters of the proposed FMRAC offline, so that the transient performance of the FMRAC is improved further. In this study, a GA based modified MRAC (GAMMRAC), an FMRAC, and a GA based FMRAC (GAFMRAC) are designed for a coupled tank setup in a hybrid tank process and their transient performances are compared. The results show that the proposed GAFMRAC gives a better transient performance than the GAMMRAC or the FMRAC. It is concluded that the proposed controller can be used to obtain very good transient performance for the control of nonlinear processes.
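    A toy direct MRAC with the MIT rule shows the adaptation loop that the FMRAC builds on: the controller gain is adjusted until the plant tracks a reference model. The plant, gains, and reference model below are hypothetical, not the paper's coupled-tank model.

```python
def simulate_mrac(k=2.0, gamma=1.0, dt=0.01, steps=5000, r=1.0):
    """First-order direct MRAC with the (simplified) MIT rule.
    Plant:  dy/dt  = -y  + k*theta*r   (k is the unknown plant gain)
    Model:  dym/dt = -ym + r           (desired closed-loop behavior)
    MIT:    dtheta/dt = -gamma * e * ym, where e = y - ym."""
    y = ym = theta = 0.0
    for _ in range(steps):  # forward-Euler integration
        e = y - ym
        theta += dt * (-gamma * e * ym)
        y += dt * (-y + k * theta * r)
        ym += dt * (-ym + r)
    return y, ym, theta

y_f, ym_f, theta_f = simulate_mrac()
```

    For k = 2 the adapted gain settles near 1/k = 0.5 and the tracking error decays; the FMRAC's fuzzy block and the RGA tuning are refinements aimed at exactly this transient phase.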

  12. A bacterial genetic screen identifies functional coding sequences of the insect mariner transposable element Famar1 amplified from the genome of the earwig, Forficula auricularia.

    Science.gov (United States)

    Barry, Elizabeth G; Witherspoon, David J; Lampe, David J

    2004-02-01

    Transposons of the mariner family are widespread in animal genomes and have apparently infected them by horizontal transfer. Most species carry only old defective copies of particular mariner transposons that have diverged greatly from their active horizontally transferred ancestor, while a few contain young, very similar, and active copies. We report here the use of a whole-genome screen in bacteria to isolate somewhat diverged Famar1 copies from the European earwig, Forficula auricularia, that encode functional transposases. Functional and nonfunctional coding sequences of Famar1 and nonfunctional copies of Ammar1 from the European honey bee, Apis mellifera, were sequenced to examine their molecular evolution. No selection for sequence conservation was detected in any clade of a tree derived from these sequences, not even on branches leading to functional copies. This agrees with the current model for mariner transposon evolution that expects neutral evolution within particular hosts, with selection for function occurring only upon horizontal transfer to a new host. Our results further suggest that mariners are not finely tuned genetic entities and that a greater amount of sequence diversification than had previously been appreciated can occur in functional copies in a single host lineage. Finally, this method of isolating active copies can be used to isolate other novel active transposons without resorting to reconstruction of ancestral sequences.

  13. Evidence for systems-level molecular mechanisms of tumorigenesis

    Directory of Open Access Journals (Sweden)

    Capellá Gabriel

    2007-06-01

    Full Text Available Abstract Background Cancer arises from the consecutive acquisition of genetic alterations. Increasing evidence suggests that as a consequence of these alterations, molecular interactions are reprogrammed in the context of highly connected and regulated cellular networks. Coordinated reprogramming would allow the cell to acquire the capabilities for malignant growth. Results Here, we determine the coordinated function of cancer gene products (i.e., proteins encoded by differentially expressed genes in tumors relative to healthy tissue counterparts, hereafter referred to as "CGPs"), defined as their topological properties and organization in the interactome network. We show that CGPs are central to information exchange and propagation and that they are specifically organized to promote tumorigenesis. Centrality is identified by both local (degree) and global (betweenness and closeness) measures, and systematically appears in down-regulated CGPs. Up-regulated CGPs do not consistently exhibit centrality, but both types of cancer products determine the overall integrity of the network structure. In addition to centrality, down-regulated CGPs show topological association that correlates with common biological processes and pathways involved in tumorigenesis. Conclusion Given the current limited coverage of the human interactome, this study proposes that tumorigenesis takes place in a specific and organized way at the molecular systems-level and suggests a model that comprises the precise down-regulation of groups of topologically-associated proteins involved in particular functions, orchestrated with the up-regulation of specific proteins.
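    Two of the centrality measures used above (degree, a local measure; closeness, a global one) are easy to state on an adjacency list. A BFS-based sketch on a toy "hub" network; betweenness is omitted for brevity, and the graph is illustrative, not interactome data.

```python
from collections import deque

def degree(adj, v):
    """Local centrality: number of direct interaction partners."""
    return len(adj[v])

def closeness(adj, v):
    """Global centrality: (n - 1) / sum of shortest-path distances from v,
    computed by breadth-first search over an unweighted graph."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return (len(adj) - 1) / sum(dist.values())

# Toy network: hub protein "a" interacting with "b", "c", "d".
adj = {"a": ["b", "c", "d"], "b": ["a"], "c": ["a"], "d": ["a"]}
```

    The hub scores highest on both measures (degree 3, closeness 1.0), which mirrors the paper's observation that down-regulated CGPs occupy such central positions.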

  14. A symmetry model for genetic coding via a wallpaper group composed of the traditional four bases and an imaginary base E: towards category theory-like systematization of molecular/genetic biology.

    Science.gov (United States)

    Sawamura, Jitsuki; Morishita, Shigeru; Ishigooka, Jun

    2014-05-07

    methodology, there is fertile ground to consider a symmetry model for genetic coding based on our specific wallpaper group. A more integrated formulation containing "central dogma" for future molecular/genetic biology remains to be explored.

  15. The lack of foundation in the mechanism on which are based the physico-chemical theories for the origin of the genetic code is counterposed to the credible and natural mechanism suggested by the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2016-06-21

    I analyze the mechanism on which the majority of theories are based that put the physico-chemical properties of amino acids at the center of the origin of the genetic code. As this mechanism relies on excessive mutational steps, I conclude that it could not have been operative, or, if operative, it would not have allowed a full realization of the predictions of these theories, because this mechanism contained, evidently, a high indeterminacy. I do this by disproving the four-column theory of the origin of the genetic code (Higgs, 2009) and by replying to the criticism that was directed towards the coevolution theory of the origin of the genetic code. In this context, I suggest a new hypothesis that clarifies the mechanism by which the codon domains of the precursor amino acids would have evolved, as predicted by the coevolution theory. This mechanism would have used particular elongation factors that would have constrained the evolution of all amino acids belonging to a given biosynthetic family to the progenitor pre-tRNA that first recognized the first codons that evolved in a certain codon domain of a determined precursor amino acid. This happened because the elongation factors recognized two characteristics of the progenitor pre-tRNAs of precursor amino acids, which prevented the elongation factors from recognizing the pre-tRNAs belonging to biosynthetic families of different precursor amino acids. Finally, I analyze, by means of Fisher's exact test, the distribution within the genetic code of the biosynthetic classes of amino acids and that of the polarity values of amino acids. This analysis would seem to support the biosynthetic classes of amino acids over the polarity values as the main factor that led to the structuring of the genetic code, with the physico-chemical properties of amino acids playing only a subsidiary role in this evolution. As a whole, the full analysis brings to the conclusion that the coevolution theory of the origin of the
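    The Fisher's exact test mentioned above can be written out directly from the hypergeometric distribution for a 2x2 contingency table; the table used in the test below is illustrative, not the author's amino-acid data.

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same margins
    that are no more likely than the observed table."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of a table with top-left cell x
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)
```

    Applied to a cross-classification of codons by biosynthetic class versus polarity class, a small p-value for the former and not the latter would support the biosynthetic-class association, as the abstract argues.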

  16. A genetic polymorphism in the coding region of the gastric intrinsic factor gene (GIF) is associated with congenital intrinsic factor deficiency.

    Science.gov (United States)

    Gordon, Marilyn M; Brada, Nancy; Remacha, Angel; Badell, Isabel; del Río, Elisabeth; Baiget, Montserrat; Santer, René; Quadros, Edward V; Rothenberg, Sheldon P; Alpers, David H

    2004-01-01

    Congenital intrinsic factor (IF) deficiency is a disorder characterized by megaloblastic anemia due to the absence of gastric IF (GIF, GenBank NM_005142) and GIF antibodies, with probable autosomal recessive inheritance. Most of the reported patients are isolated cases without genetic studies of the parents or siblings. Complete exonic sequences were determined from the PCR products generated from genomic DNA of five affected individuals. All probands had the identical variant (g.68A>G) in the second position of the fifth codon in the coding sequence of the gene that introduces a restriction enzyme site for Msp I and predicts a change in the mature protein from glutamine(5) (CAG) to arginine(5) (CGG). Three subjects were homozygous for this base exchange and two subjects were heterozygous, one of which was apparently a compound heterozygote at positions 1 and 2 of the fifth codon ([g.67C>G] + [g.68A>G]). The other patient, heterozygous for position 2, had one heterozygous unaffected parent. Most parents were heterozygous for this base exchange, confirming the pattern of autosomal recessive inheritance for congenital IF deficiency. cDNA encoding GIF was mutated at base pair g.68 (A>G) and expressed in COS-7 cells. The apparent size, secretion rate, and sensitivity to pepsin hydrolysis of the expressed IF were similar to native IF. The allelic frequency of g.68A>G was 0.067 and 0.038 in two control populations. This sequence aberration is not the cause of the phenotype, but is associated with the genotype of congenital IF deficiency and could serve as a marker for inheritance of this disorder. Copyright 2003 Wiley-Liss, Inc.

  17. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code: the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  18. A novel pseudoderivative-based mutation operator for real-coded adaptive genetic algorithms [v2; ref status: indexed, http://f1000r.es/1td

    OpenAIRE

    Maxinder S Kanwal; Avinash S Ramesh; Lauren A Huang

    2013-01-01

    Recent development of large databases, especially those in genetics and proteomics, is pushing the development of novel computational algorithms that implement rapid and accurate search strategies. One successful approach has been to use artificial intelligence methods, including pattern recognition (e.g. neural networks) and optimization techniques (e.g. genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate that ...

  19. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, makes it possible to recover ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, makes it possible to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies this hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
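    The unique-decipherability condition that coding partitions generalize can be checked with the classical Sardinas-Patterson procedure; a minimal sketch (this illustrates the UD baseline, not the paper's coding-partition algorithm):

    ```python
    def is_uniquely_decipherable(code):
        """Sardinas-Patterson test: return True iff `code` is UD.

        A code fails to be uniquely decipherable exactly when some
        dangling suffix generated from it is itself a codeword.
        """
        code = set(code)

        def residuals(A, B):
            # Suffixes left when a word of A is a proper prefix of a word of B.
            return {v[len(u):] for u in A for v in B
                    if u != v and v.startswith(u)}

        S = residuals(code, code)   # first set of dangling suffixes
        seen = set()
        while S:
            if S & code:            # a dangling suffix equals a codeword
                return False
            if S <= seen:           # no new suffixes: test terminates, code is UD
                return True
            seen |= S
            S = residuals(code, S) | residuals(S, code)
        return True
    ```

    For example, the prefix code {"0", "10", "110"} passes, while {"0", "01", "10"} fails because "010" parses both as 0.10 and as 01.0.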

  20. A novel pseudoderivative-based mutation operator for real-coded adaptive genetic algorithms [v2; ref status: indexed, http://f1000r.es/1td

    Directory of Open Access Journals (Sweden)

    Maxinder S Kanwal

    2013-11-01

    Full Text Available Recent development of large databases, especially those in genetics and proteomics, is pushing the development of novel computational algorithms that implement rapid and accurate search strategies. One successful approach has been to use artificial intelligence methods, including pattern recognition (e.g. neural networks) and optimization techniques (e.g. genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate that is derived from comparing the fitness values of successive generations. We propose a novel pseudoderivative-based mutation rate operator designed to allow a genetic algorithm to escape local optima and successfully continue to the global optimum. Once proven successful, this algorithm can be implemented to solve real problems in neurology and bioinformatics. As a first step towards this goal, we tested our algorithm on two 3-dimensional surfaces with multiple local optima, but only one global optimum, as well as on the N-queens problem, an applied problem in which the function that maps the curve is implicit. For all tests, the adaptive mutation rate allowed the genetic algorithm to find the global optimal solution, performing significantly better than other search methods, including genetic algorithms that implement fixed mutation rates.
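    The general idea of a mutation rate driven by the finite difference ("pseudoderivative") of best fitness between successive generations can be sketched as below. The update rule and its constants are illustrative assumptions, not the paper's exact operator:

    ```python
    import random

    def adaptive_mutation_rate(prev_best, curr_best, base=0.05, lo=0.01, hi=0.5):
        """Raise the mutation rate when fitness stalls, lower it when improving.

        The 'pseudoderivative' is the finite difference of best fitness between
        successive generations; the scaling constants are assumptions.
        """
        delta = curr_best - prev_best
        if delta <= 1e-12:          # stagnation -> push exploration
            return min(hi, base * 4)
        return max(lo, base / (1 + delta))

    def mutate(individual, rate, sigma=0.1, bounds=(-5.0, 5.0)):
        """Gaussian mutation of a real-coded individual, gene by gene."""
        lo, hi = bounds
        return [min(hi, max(lo, g + random.gauss(0, sigma)))
                if random.random() < rate else g
                for g in individual]
    ```

    Inside a GA loop, the rate computed from the last two generations' best fitness would be passed to `mutate`, so stalled populations mutate aggressively and improving ones fine-tune.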

  1. An approach based on genetic algorithms with coding in real for the solution of a DC OPF to hydrothermal systems; Uma abordagem baseada em algoritmos geneticos com codificacao em real para a solucao de um FPO DC para sistemas hidrotermicos

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, Diego R.; Silva, Alessandro L. da; Luciano, Edson Jose Rezende; Nepomuceno, Leonardo [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. de Engenharia Eletrica], Emails: diego_eng.eletricista@hotmail.com, alessandrolopessilva@uol.com.br, edson.joserl@uol.com.br, leo@feb.unesp.br

    2009-07-01

    Problems of DC Optimal Power Flow (OPF) have been solved by various conventional optimization methods. When the modeling of DC OPF involves discontinuous or non-differentiable functions, the use of solution methods based on conventional optimization is often not possible because of the difficulty in calculating gradient vectors at points of discontinuity/non-differentiability of these functions. This paper proposes a method for solving the DC OPF based on Genetic Algorithms (GA) with real coding. The proposed GA has specific genetic operators to improve the quality and viability of the solution. The results are analyzed for an IEEE test system, and its solutions are compared, when possible, with those obtained by a primal-dual interior-point method with logarithmic barrier. The results highlight the robustness of the method and the feasibility of obtaining the solution for real systems.
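    The abstract does not detail its "specific genetic operators"; a common operator for real-coded GAs of this kind is whole arithmetic crossover, sketched here as an assumption:

    ```python
    import random

    def whole_arithmetic_crossover(parent1, parent2, alpha=None):
        """Blend two real-coded parents gene by gene.

        alpha in [0, 1] is the blending weight, drawn at random if not given.
        Children stay feasible whenever the feasible region is convex, which
        is one reason this operator suits real-coded decision variables.
        """
        a = random.random() if alpha is None else alpha
        child1 = [a * x + (1 - a) * y for x, y in zip(parent1, parent2)]
        child2 = [(1 - a) * x + a * y for x, y in zip(parent1, parent2)]
        return child1, child2
    ```

    With `alpha=0.25`, parents `[0, 0]` and `[1, 2]` yield children `[0.75, 1.5]` and `[0.25, 0.5]`, both lying on the segment between the parents.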

  2. A Distributed Approach to System-Level Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, Indranil

    2012-01-01

    Prognostics, which deals with predicting remaining useful life of components, subsystems, and systems, is a key technology for systems health management that leads to improved safety and reliability with reduced costs. The prognostics problem is often approached from a component-centric view. However, in most cases, it is not specifically component lifetimes that are important, but, rather, the lifetimes of the systems in which these components reside. The system-level prognostics problem can be quite difficult due to the increased scale and scope of the prognostics problem and the relative lack of scalability and efficiency of typical prognostics approaches. In order to address these issues, we develop a distributed solution to the system-level prognostics problem, based on the concept of structural model decomposition. The system model is decomposed into independent submodels. Independent local prognostics subproblems are then formed based on these local submodels, resulting in a scalable, efficient, and flexible distributed approach to the system-level prognostics problem. We provide a formulation of the system-level prognostics problem and demonstrate the approach on a four-wheeled rover simulation testbed. The results show that the system-level prognostics problem can be accurately and efficiently solved in a distributed fashion.
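    The decomposition step can be illustrated with a toy structural model: variables that share an equation belong to the same submodel, so the independent submodels are the connected components of the variable-sharing graph. The equations and variable names below are invented for illustration, not the paper's rover model:

    ```python
    from collections import defaultdict, deque

    def decompose(equations):
        """Group variables into independent submodels.

        `equations` is a list of variable sets; variables that share an
        equation end up in the same submodel (connected component).
        """
        adj = defaultdict(set)
        for eq in equations:
            eq = list(eq)
            for v in eq:
                adj[v].update(u for u in eq if u != v)

        seen, components = set(), []
        for v in adj:
            if v in seen:
                continue
            comp, queue = set(), deque([v])
            while queue:            # breadth-first sweep of one component
                u = queue.popleft()
                if u in comp:
                    continue
                comp.add(u)
                queue.extend(adj[u] - comp)
            seen |= comp
            components.append(comp)
        return components

    # Toy rover-like model: wheel/motor variables couple, battery stands alone.
    subs = decompose([{"wheel_speed", "motor_torque"},
                      {"motor_torque", "motor_temp"},
                      {"battery_soc"}])
    ```

    Each returned component would then get its own local prognoser, which is what makes the overall scheme scale.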

  3. Genetic diversity of the HLA-G coding region in Amerindian populations from the Brazilian Amazon: a possible role of natural selection.

    Science.gov (United States)

    Mendes-Junior, C T; Castelli, E C; Meyer, D; Simões, A L; Donadi, E A

    2013-12-01

    HLA-G has an important role in the modulation of the maternal immune system during pregnancy, and evidence that balancing selection acts in the promoter and 3'UTR regions has been previously reported. To determine whether selection acts on the HLA-G coding region in the Amazon Rainforest, exons 2, 3 and 4 were analyzed in a sample of 142 Amerindians from nine villages of five isolated tribes that inhabit the Central Amazon. Six previously described single-nucleotide polymorphisms (SNPs) were identified and the Expectation-Maximization (EM) and PHASE algorithms were used to computationally reconstruct SNP haplotypes (HLA-G alleles). A new HLA-G allele, which originated in Amerindian populations by a crossing-over event between two widespread HLA-G alleles, was identified in 18 individuals. Neutrality tests showed that natural selection plays a complex role in the HLA-G coding region. Although balancing selection is the type of selection that shapes variability at a local level (Native American populations), we have also shown that purifying selection may occur on a worldwide scale. Moreover, balancing selection does not seem to act on the coding region as strongly as it acts on the flanking regulatory regions, and this coding signature may actually reflect a hitchhiking effect.

  4. Genetic variants in promoters and coding regions of the muscle glycogen synthase and the insulin-responsive GLUT4 genes in NIDDM

    DEFF Research Database (Denmark)

    Bjørbaek, C; Echwald, Søren Morgenthaler; Hubricht, P

    1994-01-01

    To examine the hypothesis that variants in the regulatory or coding regions of the glycogen synthase (GS) and insulin-responsive glucose transporter (GLUT4) genes contribute to insulin-resistant glucose processing of muscle from non-insulin-dependent diabetes mellitus (NIDDM) patients, promoter...... volunteers. By applying inverse polymerase chain reaction and direct DNA sequencing, 532 base pairs (bp) of the GS promoter were identified and the transcriptional start site determined by primer extension. SSCP scanning of the promoter region detected five single nucleotide substitutions, positioned at 42......'-untranslated region, and the coding region of the GLUT4 gene showed four polymorphisms, all single nucleotide substitutions, positioned at -581, 1, 30, and 582. None of the three changes in the regulatory region of the gene had any major influence on expression of the GLUT4 gene in muscle. The variant at 582...

  5. Design for testability and diagnosis at the system-level

    Science.gov (United States)

    Simpson, William R.; Sheppard, John W.

    1993-01-01

    The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
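    The dependency-model style of fault isolation that STAMP builds on can be illustrated with a toy sketch. This is a generic illustration of dependency-based inference, not STAMP's actual algorithm; the test and component names are invented:

    ```python
    def isolate(dependencies, outcomes):
        """Dependency-model fault isolation (illustrative, not STAMP itself).

        `dependencies` maps each test to the set of components it exercises;
        `outcomes` maps each test to True (pass) or False (fail). A failing
        test implicates the components on its path; a passing test
        exonerates them.
        """
        suspects = set().union(*dependencies.values())
        for test, passed in outcomes.items():
            if passed:
                suspects -= dependencies[test]   # everything exercised works
            else:
                suspects &= dependencies[test]   # fault lies on this path
        return suspects

    # A failing FFT functional test plus passing memory and ADC tests
    # leave only the FFT circuit as a candidate.
    deps = {"t_fft": {"adc", "fft", "mem"},
            "t_mem": {"mem"},
            "t_adc": {"adc"}}
    candidates = isolate(deps, {"t_fft": False, "t_mem": True, "t_adc": True})
    ```

    As in the abstract's radar example, identifying the anomalous functional unit (here the FFT circuit) is sufficient for system-level repair; no gate-level failure mode is needed.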

  6. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  7. System-level modeling of acetone-butanol-ethanol fermentation.

    Science.gov (United States)

    Liao, Chen; Seo, Seung-Oh; Lu, Ting

    2016-05-01

    Acetone-butanol-ethanol (ABE) fermentation is a metabolic process of clostridia that produces bio-based solvents including butanol. It is enabled by an underlying metabolic reaction network and modulated by cellular gene regulation and environmental cues. Mathematical modeling has served as a valuable strategy to facilitate the understanding, characterization and optimization of this process. In this review, we highlight recent advances in system-level, quantitative modeling of ABE fermentation. We begin with an overview of integrative processes underlying the fermentation. Next we survey modeling efforts including early simple models, models with a systematic metabolic description, and those incorporating metabolism through simple gene regulation. Particular focus is given to a recent system-level model that integrates the metabolic reactions, gene regulation and environmental cues. We conclude by discussing the remaining challenges and future directions towards predictive understanding of ABE fermentation. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. System-level Modeling of Wireless Integrated Sensor Networks

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Hansen, Knud; Madsen, Jan

    2005-01-01

    Wireless integrated sensor networks have emerged as a promising infrastructure for a new generation of monitoring and tracking applications. In order to efficiently utilize the extremely limited resources of wireless sensor nodes, accurate modeling of the key aspects of wireless sensor networks...... is necessary so that system-level design decisions can be made about the hardware and the software (applications and real-time operating system) architecture of sensor nodes. In this paper, we present a SystemC-based abstract modeling framework that enables system-level modeling of sensor network behavior...... by modeling the applications, real-time operating system, sensors, processor, and radio transceiver at the sensor node level and environmental phenomena, including radio signal propagation, at the sensor network level. We demonstrate the potential of our modeling framework by simulating and analyzing a small...

  9. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  10. System level modeling and component level control of fuel cells

    Science.gov (United States)

    Xue, Xingjian

    This dissertation investigates the fuel cell systems and the related technologies in three aspects: (1) system-level dynamic modeling of both PEM fuel cell (PEMFC) and solid oxide fuel cell (SOFC); (2) condition monitoring scheme development of PEM fuel cell system using model-based statistical method; and (3) strategy and algorithm development of precision control with potential application in energy systems. The dissertation first presents a system level dynamic modeling strategy for PEM fuel cells. It is well known that water plays a critical role in PEM fuel cell operations. It makes the membrane function appropriately and improves the durability. The low temperature operating conditions, however, impose modeling difficulties in characterizing the liquid-vapor two phase change phenomenon, which becomes even more complex under dynamic operating conditions. This dissertation proposes an innovative method to characterize this phenomenon, and builds a comprehensive model for PEM fuel cell at the system level. The model features the complete characterization of multi-physics dynamic coupling effects with the inclusion of dynamic phase change. The model is validated using Ballard stack experimental result from open literature. The system behavior and the internal coupling effects are also investigated using this model under various operating conditions. Anode-supported tubular SOFC is also investigated in the dissertation. While the Nernst potential plays a central role in characterizing the electrochemical performance, the traditional Nernst equation may lead to incorrect analysis results under dynamic operating conditions due to the current reverse flow phenomenon. This dissertation presents a systematic study in this regard to incorporate a modified Nernst potential expression and the heat/mass transfer into the analysis. The model is used to investigate the limitations and optimal results of various operating conditions; it can also be utilized to perform the

  11. Decoding the codes: A content analysis of the news coverage of genetic cloning by three online news sites and three national daily newspapers, 1996 through 1998

    Science.gov (United States)

    Hyde, Jon E.

    This study compared news coverage of genetic cloning research in three online news sites (CNN.com, ABC.com, and MSNBC.com) and three national daily newspapers (The New York Times, The Washington Post, and USA Today). The study involved the analysis of 230 online and print news articles concerning genetic cloning published from 1996 through 1998. Articles were examined with respect to formats, sources, focus, tone, and assessments about the impact of cloning research. Findings indicated that while print news formats remained relatively constant for the duration of this study, online news formats changed significantly with respect to the kinds of media used to represent the news, the layouts used to represent cloning news, and the emphasis placed on audio-visual content. Online stories were as much as 20 to 70% shorter than print stories. More than 50% of the articles appearing online were composed by outside sources (wire services, guest columnists, etc.). By comparison, nearly 90% of the articles published by print newspapers were written "in-house" by science reporters. Online news sites cited fewer sources and cited a smaller variety of sources than the newspapers examined here. In both news outlets, however, the sources most frequently cited were those with vested interests in furthering cloning research. Both online and print news coverage of cloning tends to focus principally on the technical procedures and on the future benefits of cloning. More than 60% of the articles focused on the techniques and technologies of cloning. Less than 25% of the articles focused on social, ethical, or legal issues associated with cloning. Similarly, articles from all six sources (75%) tended to be both positive and future-oriented. Less than 5% of the total articles examined here had a strongly negative or critical tone. Moreover, both online and print news sources increasingly conveyed a strong sense of acceptance about the possibility of human cloning. 
Data from this study

  12. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
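    The waveform-coding branch described above can be illustrated with mu-law companding, the logarithmic characteristic standardized for 8-bit telephone PCM in ITU-T G.711 (shown here as a floating-point sketch, not a bit-exact G.711 codec):

    ```python
    import math

    MU = 255  # mu-law parameter used by G.711 (North American/Japanese PCM)

    def mu_law_compress(x):
        """Compress a sample x in [-1, 1] with the mu-law characteristic."""
        return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

    def mu_law_expand(y):
        """Invert the mu-law characteristic."""
        return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

    def quantize(y, bits=8):
        """Mid-rise uniform quantizer on [-1, 1] with 2**bits levels."""
        step = 2.0 / (2 ** bits)
        q = (math.floor(y / step) + 0.5) * step
        return max(-1 + step / 2, min(1 - step / 2, q))
    ```

    Companding before the uniform quantizer spends the 8 bits where speech energy concentrates: small amplitudes get fine steps, large ones coarse steps, which is why the round trip `mu_law_expand(quantize(mu_law_compress(x)))` keeps the relative error roughly constant across amplitudes.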

  13. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  14. Genetic classes and genetic categories : Protecting genetic groups through data protection law

    NARCIS (Netherlands)

    Hallinan, Dara; de Hert, Paul; Taylor, L.; Floridi, L.; van der Sloot, B.

    2017-01-01

    Each person shares genetic code with others. Thus, one individual’s genome can reveal information about other individuals. When multiple individuals share aspects of genetic architecture, they form a ‘genetic group’. From a social and legal perspective, two types of genetic group exist: Those which

  15. A two warehouse deterministic inventory model for deteriorating items with a linear trend in time dependent demand over finite time horizon by Elitist Real-Coded Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    A.K. Bhunia

    2013-04-01

    Full Text Available This paper deals with a deterministic inventory model developed for deteriorating items having two separate storage facilities (owned and rented warehouses), due to the limited capacity of the existing storage (owned warehouse), with linearly increasing time-dependent demand over a fixed finite time horizon. The model is formulated with infinite replenishment, and the successive replenishment cycle lengths are in arithmetic progression. Partially backlogged shortages are allowed. The stocks of the rented warehouse (RW) are transported to the owned warehouse (OW) in a continuous release pattern. For this purpose, the model is formulated as a constrained non-linear mixed integer programming problem. For solving the problem, an advanced genetic algorithm (GA) has been developed, based on ranking selection, elitism, whole arithmetic crossover and non-uniform mutation dependent on the age of the population. Our objective is to determine the optimal replenishment number and the lot-sizes of the two warehouses (OW and RW) by maximizing the profit function. The model is illustrated with four numerical examples and sensitivity analyses of the optimal solution are performed with respect to different parameters.
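    The age-dependent non-uniform mutation mentioned above is commonly taken in Michalewicz's form, where the perturbation shrinks as the generation counter approaches its maximum; a sketch under that assumption (the shape parameter `b` is a typical choice, not the paper's value):

    ```python
    import random

    def non_uniform_mutation(gene, lo, hi, gen, max_gen, b=5.0):
        """Michalewicz-style non-uniform mutation of one real-coded gene.

        Perturbations shrink as `gen` approaches `max_gen`, so the search
        moves from coarse exploration to fine local tuning. `b` controls
        how fast the step size decays (an assumed, typical value).
        """
        def delta(y):
            r = random.random()
            return y * (1 - r ** ((1 - gen / max_gen) ** b))

        if random.random() < 0.5:
            return gene + delta(hi - gene)   # move toward the upper bound
        return gene - delta(gene - lo)       # move toward the lower bound
    ```

    Because `delta(y)` never exceeds the distance to the nearest bound, the mutated gene always stays in `[lo, hi]`, and at `gen == max_gen` the perturbation collapses to zero.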

  16. A systems-level approach for investigating organophosphorus pesticide toxicity.

    Science.gov (United States)

    Zhu, Jingbo; Wang, Jing; Ding, Yan; Liu, Baoyue; Xiao, Wei

    2018-03-01

    The full understanding of the single and joint toxicity of the variety of organophosphorus (OP) pesticides is still unavailable because of their extremely complex mechanisms of action. This study established a systems-level approach, based on systems toxicology, to investigate OP pesticide toxicity by incorporating ADME/T properties, protein prediction, and network and pathway analysis. The results showed that most OP pesticides are highly toxic according to the ADME/T parameters, and can interact with significant receptor proteins to cooperatively lead to various diseases, as shown by the established OP pesticide-protein and protein-disease networks. Furthermore, the finding that multiple OP pesticides potentially act on the same receptor proteins and/or on functionally diverse proteins explains how multiple OP pesticides can mutually produce synergistic or additive toxicity at a molecular/systematic level. Finally, the integrated pathways revealed the mechanism of toxicity of the interaction of OP pesticides and elucidated the pathogenesis induced by OP pesticides. This study demonstrates a systems-level approach for investigating OP pesticide toxicity that can be further applied to risk assessments of various toxins, which is of significant interest to food security and environmental protection. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. MEDICARE PAYMENTS AND SYSTEM-LEVEL HEALTH-CARE USE

    Science.gov (United States)

    ROBBINS, JACOB A.

    2015-01-01

    The rapid growth of Medicare managed care over the past decade has the potential to increase the efficiency of health-care delivery. Improvements in care management for some may improve efficiency system-wide, with implications for optimal payment policy in public insurance programs. These system-level effects may depend on local health-care market structure and vary based on patient characteristics. We use exogenous variation in the Medicare payment schedule to isolate the effects of market-level managed care enrollment on the quantity and quality of care delivered. We find that in areas with greater enrollment of Medicare beneficiaries in managed care, the non–managed care beneficiaries have fewer days in the hospital but more outpatient visits, consistent with a substitution of less expensive outpatient care for more expensive inpatient care, particularly at high levels of managed care. We find no evidence that care is of lower quality. Optimal payment policies for Medicare managed care enrollees that account for system-level spillovers may thus be higher than those that do not. PMID:27042687

  18. Measuring healthcare productivity - from unit to system level.

    Science.gov (United States)

    Kämäräinen, Vesa Johannes; Peltokorpi, Antti; Torkki, Paulus; Tallbacka, Kaj

    2016-04-18

    Purpose - Healthcare productivity is a growing issue in most Western countries where healthcare expenditure is rapidly increasing. Therefore, accurate productivity metrics are essential to avoid sub-optimization within a healthcare system. The purpose of this paper is to focus on healthcare production system productivity measurement. Design/methodology/approach - Traditionally, healthcare productivity has been studied and measured independently at the unit, organization and system level. Suggesting that productivity measurement should be done at different levels, while simultaneously linking productivity measurement to incentives, this study presents the challenges of productivity measurement at the different levels. The study introduces different methods to measure productivity in healthcare. In addition, it provides background information on the methods used to measure productivity and the parameters used in these methods. A pilot investigation of productivity measurement is used to illustrate the challenges of measurement, to test the developed measures and to provide practical information for managers. Findings - The study introduces different approaches and methods to measure productivity in healthcare. Practical implications - A pilot investigation of productivity measurement is used to illustrate the challenges of measurement, to test the developed measures and to demonstrate the practical benefits for managers. Originality/value - The authors focus on the measurement of the whole healthcare production system and try to avoid sub-optimization. Additionally, considering an individual patient approach, productivity measurement is examined at the unit level, the organizational level and the system level.

  19. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  20. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, a mix of computer programming syntax and human language. In this sense queer code can...... be understood as both an object and subject of study that intervenes in the world’s ‘becoming’ and how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  1. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model: the governing equations, the formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault and their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual describing simulation procedures, input data preparation, output and example test cases.

  2. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  3. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project1. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association...... Coding Pirates2. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS), from the Institut for Skole og Læring at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period from November 2016 to May 2017...

  4. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  5. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined.

  6. Network Coding

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 15; Issue 7. Network Coding. K V Rashmi Nihar B Shah P Vijay Kumar. General Article Volume 15 Issue 7 July 2010 pp 604-621. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621 ...

  7. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  8. Expander Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 10; Issue 1. Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article Volume 10 ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science Bangalore 560 012, India.

  9. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    International Nuclear Information System (INIS)

    D.L. McGregor

    2000-01-01

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  10. System-level techniques for analog performance enhancement

    CERN Document Server

    Song, Bang-Sup

    2016-01-01

    This book shows readers how to avoid common mistakes in circuit design, and presents classic circuit concepts and design approaches from the transistor to the system level. The discussion is geared to be accessible and optimized for practical designers who want to learn to create circuits without simulations. Topic by topic, the author guides designers to learn classic analog design skills by understanding basic electronics principles correctly, and further prepares them to feel confident in designing high-performance, state-of-the-art CMOS analog systems. The book combines and presents all the in-depth information necessary to perform various design tasks, so that readers can grasp essential material without reading through the entire book. This top-down approach helps readers to build practical design expertise quickly, starting from their understanding of electronics fundamentals.

  11. Process for Selecting System Level Assessments for Human System Technologies

    Science.gov (United States)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  12. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  13. System-level integration of active silicon photonic biosensors

    Science.gov (United States)

    Laplatine, L.; Al'Mrayat, O.; Luan, E.; Fang, C.; Rezaiezadeh, S.; Ratner, D. M.; Cheung, K.; Dattner, Y.; Chrostowski, L.

    2017-02-01

    Biosensors based on silicon photonic integrated circuits have attracted growing interest in recent years. The use of sub-micron silicon waveguides to propagate near-infrared light allows for a drastic reduction of the optical system size, while increasing its complexity and sensitivity. Using silicon as the propagating medium also leverages the fabrication capabilities of CMOS foundries, which offer low-cost mass production. Researchers have deeply investigated photonic sensor devices, such as ring resonators, interferometers and photonic crystals, but the practical integration of silicon photonic biochips as part of a complete system has received less attention. Herein, we present a practical system-level architecture which can be employed to integrate the aforementioned photonic biosensors. We describe a system based on 1 mm2 dies that integrate germanium photodetectors and a single light coupling device. The dies are embedded in a 16x16 mm2 epoxy package to enable microfluidic and electrical integration. First, we demonstrate a simple process to mimic Fan-Out Wafer-Level-Packaging, which enables low-cost mass production. We then characterize the photodetectors in the photovoltaic mode, which exhibit high sensitivity at low optical power. Finally, we present a new grating coupler concept to relax the lateral alignment tolerance down to +/- 50 μm at 1-dB (80%) power penalty, which should permit non-experts to use the biochips in a "plug-and-play" style. The system-level integration demonstrated in this study paves the way towards the mass production of low-cost and highly sensitive biosensors, and can facilitate their wide adoption for biomedical and agro-environmental applications.

  14. Public health preparedness in Alberta: a systems-level study.

    Science.gov (United States)

    Moore, Douglas; Shiell, Alan; Noseworthy, Tom; Russell, Margaret; Predy, Gerald

    2006-12-28

    Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research. We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus. The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.

  15. Promoting system-level learning from project-level lessons

    International Nuclear Information System (INIS)

    Jong, Amos A. de; Runhaar, Hens A.C.; Runhaar, Piety R.; Kolhoff, Arend J.; Driessen, Peter P.J.

    2012-01-01

    A growing number of low and middle income nations (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, many of these EIA systems are characterised by low performance in terms of timely information dissemination, monitoring and enforcement after licensing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by the actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at the EIA system level via project-level donor interventions. This ‘indirect’ learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performance. Our exploratory research in Ghana and the Maldives shows that thus far, ‘indirect’ learning only occurs incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, ‘indirect’ learning seems to flourish best in large projects where donors have achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels, donors should present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options, and stimulate the use of their advisory reports to generate organisational memory and ensure a better information

  16. Public health preparedness in Alberta: a systems-level study

    Directory of Open Access Journals (Sweden)

    Noseworthy Tom

    2006-12-01

    Full Text Available Abstract Background Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research. Methods/Design We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus. Discussion The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.

  17. Promoting system-level learning from project-level lessons

    Energy Technology Data Exchange (ETDEWEB)

    Jong, Amos A. de, E-mail: amosdejong@gmail.com [Innovation Management, Utrecht (Netherlands); Runhaar, Hens A.C., E-mail: h.a.c.runhaar@uu.nl [Section of Environmental Governance, Utrecht University, Utrecht (Netherlands); Runhaar, Piety R., E-mail: piety.runhaar@wur.nl [Organisational Psychology and Human Resource Development, University of Twente, Enschede (Netherlands); Kolhoff, Arend J., E-mail: Akolhoff@eia.nl [The Netherlands Commission for Environmental Assessment, Utrecht (Netherlands); Driessen, Peter P.J., E-mail: p.driessen@geo.uu.nl [Department of Innovation and Environment Sciences, Utrecht University, Utrecht (Netherlands)

    2012-02-15

    A growing number of low and middle income nations (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, many of these EIA systems are characterised by low performance in terms of timely information dissemination, monitoring and enforcement after licensing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by the actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at the EIA system level via project-level donor interventions. This 'indirect' learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performance. Our exploratory research in Ghana and the Maldives shows that thus far, 'indirect' learning only occurs incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, 'indirect' learning seems to flourish best in large projects where donors have achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels, donors should present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options, and stimulate the use of their advisory reports to generate organisational memory and ensure a better

  18. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
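
    The self-complementarity property used throughout the abstract above is simple to state computationally: a code X of trinucleotides is self-complementary when the reverse complement of every trinucleotide in X is also in X. A minimal sketch of that check (the four-trinucleotide set below is a toy example, not the maximal code X identified in genes):

```python
def revcomp(trinucleotide):
    """Reverse complement of a trinucleotide, e.g. AAC -> GTT."""
    comp = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(comp[b] for b in reversed(trinucleotide))

def is_self_complementary(code):
    """A code X is self-complementary iff revcomp(t) is in X for every t in X."""
    return all(revcomp(t) in code for t in code)

X = {"AAC", "GTT", "GAC", "GTC"}  # toy example
print(is_self_complementary(X))        # True
print(is_self_complementary({"AAA"}))  # False: revcomp("AAA") = "TTT" is absent
```

    Checking circularity and the reading-frame property of longest paths requires the graph construction from the paper and is not sketched here.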

  19. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  20. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  1. A Systems-Level Approach to Characterizing Effects of ENMs ...

    Science.gov (United States)

    Engineered nanomaterials (ENMs) represent a new regulatory challenge because of their unique properties and their potential to interact with ecological organisms at various developmental stages, in numerous environmental compartments. Traditional toxicity tests have proven to be unreliable due to their short-term nature and the subtle responses often observed following ENM exposure. In order to fully assess the potential for various ENMs to affect responses in organisms and ecosystems, we are using a systems-level framework to link molecular initiating events with changes in whole-organism responses, and to identify how these changes may translate across scales to disrupt important ecosystem processes. This framework utilizes information from nanoparticle characteristics and exposures to help make linkages across scales. We have used Arabidopsis thaliana as a model organism to identify potential transcriptome changes in response to specific ENMs. In addition, we have focused on plant species of agronomic importance to follow multi-generational changes in physiology and phenology, as well as epigenetic markers to identify possible mechanisms of inheritance. We are employing and developing complementary analytical tools (plasma-based and synchrotron spectroscopies, microscopy, and molecular and stable-isotopic techniques) to follow movement of ENMs and ENM products in plants as they develop. These studies have revealed that changes in gene expression do not a

  2. System level traffic shaping in disk servers with heterogeneous protocols

    International Nuclear Information System (INIS)

    Cano, Eric; Kruse, Daniele Francesco

    2014-01-01

    Disk access and tape migrations compete for network bandwidth in CASTOR's disk servers, over various protocols: RFIO, Xroot, root and GridFTP. As there are a limited number of tape drives, it is important to keep them busy all the time, at their nominal speed. With potentially hundreds of user read streams per server, the bandwidth for the tape migrations has to be guaranteed at a controlled level, rather than the fair share the system gives by default. Xroot provides a prioritization mechanism, but using it implies moving exclusively to the Xroot protocol, which is not possible in the short to mid-term time frame, as users are using all protocols equally. The greatest commonality of all these protocols is no more than the use of TCP/IP. We investigated the Linux kernel traffic shaper to control TCP/IP bandwidth. The performance and limitations of the traffic shaper were characterized in a test environment, and a satisfactory working point was found for production. Notably, the negative impact of TCP offload engines on traffic shaping, and the limitations on the length of the traffic shaping rules, were discovered and measured. The traffic shaping is now successfully deployed in the CASTOR production systems at CERN. This system-level approach could be transposed easily to other environments.
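
    The kernel traffic shaper investigated above is, conceptually, a token-bucket rate limiter applied per traffic class. The sketch below illustrates only the mechanism; the class and parameter names are illustrative, not the CASTOR or Linux tc implementation:

```python
import time

class TokenBucket:
    """Token-bucket shaper: traffic may burst up to `burst_bytes`,
    but the sustained rate is capped at `rate_bytes_per_s`."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes      # start with a full bucket
        self.last = time.monotonic()

    def consume(self, nbytes):
        """Return True if nbytes may be sent now, False if the caller must wait."""
        now = time.monotonic()
        # refill tokens for the elapsed time, capped at the bucket capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

# e.g. guarantee ~100 MB/s to a tape-migration stream, with a 1 MB burst allowance
bucket = TokenBucket(rate_bytes_per_s=100e6, burst_bytes=1e6)
print(bucket.consume(500_000))     # True: within the burst allowance
print(bucket.consume(10_000_000))  # False: exceeds the bucket, must be delayed
```

    In the production setup described above, the equivalent queueing disciplines live in the kernel, so no user-space bookkeeping of this kind is needed.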

  3. A system-level model for the microbial regulatory genome.

    Science.gov (United States)

    Brooks, Aaron N; Reiss, David J; Allard, Antoine; Wu, Wei-Ju; Salvanha, Diego M; Plaisier, Christopher L; Chandrasekaran, Sriram; Pan, Min; Kaur, Amardeep; Baliga, Nitin S

    2014-07-15

    Microbes can tailor transcriptional responses to diverse environmental challenges despite having streamlined genomes and a limited number of regulators. Here, we present data-driven models that capture the dynamic interplay of the environment and genome-encoded regulatory programs of two types of prokaryotes: Escherichia coli (a bacterium) and Halobacterium salinarum (an archaeon). The models reveal how the genome-wide distributions of cis-acting gene regulatory elements and the conditional influences of transcription factors at each of those elements encode programs for eliciting a wide array of environment-specific responses. We demonstrate how these programs partition transcriptional regulation of genes within regulons and operons to re-organize gene-gene functional associations in each environment. The models capture fitness-relevant co-regulation by different transcriptional control mechanisms acting across the entire genome, to define a generalized, system-level organizing principle for prokaryotic gene regulatory networks that goes well beyond existing paradigms of gene regulation. An online resource (http://egrin2.systemsbiology.net) has been developed to facilitate multiscale exploration of conditional gene regulation in the two prokaryotes. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  4. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  5. Estimating yield gaps at the cropping system level.

    Science.gov (United States)

    Guilpart, Nicolas; Grassini, Patricio; Sadras, Victor O; Timsina, Jagadish; Cassman, Kenneth G

    2017-05-01

    Yield gap analyses of individual crops have been used to estimate opportunities for increasing crop production at local to global scales, thus providing information crucial to food security. However, increases in crop production can also be achieved by improving cropping system yield through modification of spatial and temporal arrangement of individual crops. In this paper we define the cropping system yield potential as the output from the combination of crops that gives the highest energy yield per unit of land and time, and the cropping system yield gap as the difference between actual energy yield of an existing cropping system and the cropping system yield potential. Then, we provide a framework to identify alternative cropping systems which can be evaluated against the current ones. A proof-of-concept is provided with irrigated rice-maize systems at four locations in Bangladesh that represent a range of climatic conditions in that country. The proposed framework identified (i) realistic alternative cropping systems at each location, and (ii) two locations where expected improvements in crop production from changes in cropping intensity (number of crops per year) were 43% to 64% higher than from improving the management of individual crops within the current cropping systems. The proposed framework provides a tool to help assess food production capacity of new systems (e.g. with increased cropping intensity) arising from climate change, and assess resource requirements (water and N) and associated environmental footprint per unit of land and production of these new systems. By expanding yield gap analysis from individual crops to the cropping system level and applying it to new systems, this framework could also be helpful to bridge the gap between yield gap analysis and cropping/farming system design.
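
    The two definitions above reduce to a direct computation once candidate cropping systems and their energy yields are enumerated. A sketch with hypothetical yields (the system names and GJ/ha/year figures are illustrative, not values from the Bangladesh case study):

```python
# annual energy yield (GJ per ha per year) of candidate cropping systems
# at one location; all values are hypothetical
candidate_systems = {
    "rice-maize": 210.0,        # current system
    "rice-rice": 180.0,
    "rice-maize-maize": 265.0,  # higher cropping intensity
}

current = "rice-maize"

# cropping system yield potential: highest energy yield per unit land and time
potential = max(candidate_systems.values())

# cropping system yield gap: potential minus actual yield of the existing system
gap = potential - candidate_systems[current]
print(potential, gap)  # 265.0 55.0
```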

  6. Quantum algorithms and the genetic code

    Indian Academy of Sciences (India)

    Replication of DNA and synthesis of proteins are studied from the view-point of quantum database search. Identification of a base-pairing with a quantum query gives a natural (and first ever!) explanation of why living organisms have 4 nucleotide bases and 20 amino acids. It is amazing that these numbers arise as ...

  7. George Gamow and the Genetic Code

    Indian Academy of Sciences (India)

cause they were held together by hydrogen bonds formed between adenine and ... To return to our story, on the 8th of July Gamow addressed a letter to Watson and ... "For example, the animal will be a cat if Adenine is always followed by ...

  8. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. 
Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
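The two allele-coding schemes described in this abstract can be sketched as follows (a minimal illustration with invented genotypes; the key property, that breeding values from the two codings differ only by a constant absorbed into the general mean, is the one derived in the paper):

```python
import numpy as np

def raw_coding(genotypes):
    """0/1/2 coding: copies of the second allele carried at each marker."""
    return np.asarray(genotypes, dtype=float)

def centered_coding(genotypes):
    """Subtract each marker's mean so coefficients average zero within a marker."""
    Z = raw_coding(genotypes)
    return Z - Z.mean(axis=0)

# Toy data: 4 individuals x 3 markers (values invented)
Z = raw_coding([[0, 1, 2], [1, 1, 0], [2, 0, 1], [1, 2, 1]])
Zc = centered_coding(Z)

# With fixed marker effects, the two codings give breeding values that
# differ only by the same constant for every individual:
effects = np.array([0.5, -0.2, 0.1])
diff = Z @ effects - Zc @ effects
assert np.allclose(diff, diff[0])
```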

  9. The next generation in optical transport semiconductors: IC solutions at the system level

    Science.gov (United States)

    Gomatam, Badri N.

    2005-02-01

    In this tutorial overview, we survey some of the challenging problems facing Optical Transport and their solutions using new semiconductor-based technologies. Advances in 0.13um CMOS, SiGe/HBT and InP/HBT IC process technologies and mixed-signal design strategies are the fundamental breakthroughs that have made these solutions possible. In combination with innovative packaging and transponder/transceiver architectures IC approaches have clearly demonstrated enhanced optical link budgets with simultaneously lower (perhaps the lowest to date) cost and manufacturability tradeoffs. This paper will describe: *Electronic Dispersion Compensation broadly viewed as the overcoming of dispersion based limits to OC-192 links and extending link budgets, *Error Control/Coding also known as Forward Error Correction (FEC), *Adaptive Receivers for signal quality monitoring for real-time estimation of Q/OSNR, eye-pattern, signal BER and related temporal statistics (such as jitter). We will discuss the theoretical underpinnings of these receiver and transmitter architectures, provide examples of system performance and conclude with general market trends. These Physical layer IC solutions represent a fundamental new toolbox of options for equipment designers in addressing systems level problems. With unmatched cost and yield/performance tradeoffs, it is expected that IC approaches will provide significant flexibility in turn, for carriers and service providers who must ultimately manage the network and assure acceptable quality of service under stringent cost constraints.

  10. System Level Design of Reconfigurable Server Farms Using Elliptic Curve Cryptography Processor Engines

    Directory of Open Access Journals (Sweden)

    Sangook Moon

    2014-01-01

Full Text Available As today’s hardware architecture becomes more and more complicated, it is getting harder to modify or improve the microarchitecture of a design in register transfer level (RTL). Consequently, traditional methods we have used to develop a design are not capable of coping with complex designs. In this paper, we suggest a way of designing complex digital logic circuits with a soft and advanced type of SystemVerilog at an electronic system level. We apply the concept of design-and-reuse with a high level of abstraction to implement elliptic curve crypto-processor server farms. With the concept of the superior level of abstraction to the RTL used with the traditional HDL design, we successfully achieved the soft implementation of the crypto-processor server farms as well as robust test bench code with trivial effort in the same simulation environment. Otherwise, it could have required error-prone Verilog simulations for the hardware IPs and other time-consuming jobs such as C/SystemC verification for the software, sacrificing more time and effort. In the design of the elliptic curve cryptography processor engine, we propose a 3X faster GF(2^m) serial multiplication architecture.

  11. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. The automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment to other programs. Therefore, this program can be used for automation of routine work in the department of radiology
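The two-stage lookup described above can be mimicked in a few lines; the dictionary entries below are purely illustrative stand-ins, not actual ACR dictionary content (only the example code '131.3661' comes from the abstract):

```python
# Hypothetical organ dictionary file: organ name -> organ code
organ_codes = {"chest": "131"}
# Hypothetical pathology files, one per leading digit of the organ code
pathology_codes = {"1": {"abscess": "3661"}}

def acr_code(organ, pathology):
    """Concatenate organ and pathology codes in ACR 'organ.pathology' form."""
    organ_code = organ_codes[organ]
    pathology_file = pathology_codes[organ_code[0]]  # chosen by first digit
    return f"{organ_code}.{pathology_file[pathology]}"

print(acr_code("chest", "abscess"))  # prints 131.3661
```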

  12. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
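The overall rate of such a concatenated system is the product of the inner and outer code rates; for example, with the RS(255,223) outer code named above and a hypothetical rate-1/2 inner code:

```python
from fractions import Fraction

outer_rate = Fraction(223, 255)  # RS(255,223): 223 data symbols per 255
inner_rate = Fraction(1, 2)      # assumed rate-1/2 inner code (illustrative)
overall_rate = outer_rate * inner_rate
print(overall_rate)  # prints 223/510
```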

  13. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
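Static Shannon coding assigns a symbol of probability p a codeword of length ceil(log2(1/p)); a quick sketch of that length rule (the dynamic variant in the paper updates the probability estimates online):

```python
import math

def shannon_lengths(probs):
    """Codeword length ceil(log2(1/p)) for each symbol probability."""
    return [math.ceil(-math.log2(p)) for p in probs]

lengths = shannon_lengths([0.5, 0.25, 0.15, 0.1])
print(lengths)  # prints [1, 2, 3, 4]
# Kraft inequality holds, so a prefix-free code with these lengths exists:
assert sum(2.0 ** -l for l in lengths) <= 1.0
```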

  14. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  15. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  16. Foundations of genetic algorithms 1991

    CERN Document Server

    1991-01-01

Foundations of Genetic Algorithms 1991 (FOGA 1) discusses the theoretical foundations of genetic algorithms (GA) and classifier systems. This book compiles research papers on selection and convergence, coding and representation, problem hardness, deception, classifier system design, variation and recombination, parallelization, and population divergence. Other topics include the non-uniform Walsh-schema transform; spurious correlations and premature convergence in genetic algorithms; and variable default hierarchy separation in a classifier system. The grammar-based genetic algorithm; condition

  17. Female mating preferences determine system-level evolution in a gene network model.

    Science.gov (United States)

    Fierst, Janna L

    2013-06-01

Environmental patterns of directional, stabilizing and fluctuating selection can influence the evolution of system-level properties like evolvability and mutational robustness. Intersexual selection produces strong phenotypic selection and these dynamics may also affect the response to mutation and the potential for future adaptation. In order to assess the influence of mating preferences on these evolutionary properties, I modeled a male trait and female preference determined by separate gene regulatory networks. I studied three sexual selection scenarios: sexual conflict, a Gaussian model of the Fisher process described in Lande (in Proc Natl Acad Sci 78(6):3721-3725, 1981) and a good genes model in which the male trait signalled his mutational condition. I measured the effects these mating preferences had on the potential for traits and preferences to evolve towards new states, and mutational robustness of both the phenotype and the individual's overall viability. All types of sexual selection increased male phenotypic robustness relative to a randomly mating population. The Fisher model also reduced male evolvability and mutational robustness for viability. Under good genes sexual selection, males evolved an increased mutational robustness for viability. Females choosing their mates is a scenario that is sufficient to create selective forces that impact genetic evolution and shape the evolutionary response to mutation and environmental selection. These dynamics will inevitably develop in any population where sexual selection is operating, and affect the potential for future adaptation.

  18. Striatal response to reward anticipation: evidence for a systems-level intermediate phenotype for schizophrenia.

    Science.gov (United States)

    Grimm, Oliver; Heinz, Andreas; Walter, Henrik; Kirsch, Peter; Erk, Susanne; Haddad, Leila; Plichta, Michael M; Romanczuk-Seiferth, Nina; Pöhland, Lydia; Mohnke, Sebastian; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Schäfer, Axel; Cichon, Sven; Nöthen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2014-05-01

Attenuated ventral striatal response during reward anticipation is a core feature of schizophrenia that is seen in prodromal, drug-naive, and chronic schizophrenic patients. Schizophrenia is highly heritable, raising the possibility that this phenotype is related to the genetic risk for the disorder. To examine a large sample of healthy first-degree relatives of schizophrenic patients and compare their neural responses to reward anticipation with those of carefully matched controls without a family psychiatric history. To further support the utility of this phenotype, we studied its test-retest reliability, its potential brain structural contributions, and the effects of a protective missense variant in neuregulin 1 (NRG1) linked to schizophrenia by meta-analysis (ie, rs10503929). Examination of a well-established monetary reward anticipation paradigm during functional magnetic resonance imaging at a university hospital; voxel-based morphometry; test-retest reliability analysis of striatal activations in an independent sample of 25 healthy participants scanned twice with the same task; and imaging genetics analysis of the control group. A total of 54 healthy first-degree relatives of schizophrenic patients and 80 controls matched for demographic, psychological, clinical, and task performance characteristics were studied. Blood oxygen level-dependent response during reward anticipation, analysis of intraclass correlations of functional contrasts, and associations between striatal gray matter volume and NRG1 genotype. Compared with controls, healthy first-degree relatives showed a highly significant decrease in ventral striatal activation during reward anticipation (familywise error-corrected P < .05). This systems-level functional phenotype is reliable (with intraclass correlation coefficients of 0.59-0.73), independent of local gray matter volume (with no corresponding group differences and no correlation to function, and with all uncorrected P values >.05), and affected by

  19. Allele coding in genomic evaluation

    DEFF Research Database (Denmark)

Strandén, Ismo; Christensen, Ole Fredslund

    2011-01-01

    Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker...... effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous...... genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call...

  20. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  1. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ► We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ► We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ► We find and classify all 2D homological stabilizer codes. ► We find optimal codes among the homological stabilizer codes.

  2. Hypothesis of Lithocoding: Origin of the Genetic Code as a "Double Jigsaw Puzzle" of Nucleobase-Containing Molecules and Amino Acids Assembled by Sequential Filling of Apatite Mineral Cellules.

    Science.gov (United States)

    Skoblikow, Nikolai E; Zimin, Andrei A

    2016-05-01

    The hypothesis of direct coding, assuming the direct contact of pairs of coding molecules with amino acid side chains in hollow unit cells (cellules) of a regular crystal-structure mineral is proposed. The coding nucleobase-containing molecules in each cellule (named "lithocodon") partially shield each other; the remaining free space determines the stereochemical character of the filling side chain. Apatite-group minerals are considered as the most preferable for this type of coding (named "lithocoding"). A scheme of the cellule with certain stereometric parameters, providing for the isomeric selection of contacting molecules is proposed. We modelled the filling of cellules with molecules involved in direct coding, with the possibility of coding by their single combination for a group of stereochemically similar amino acids. The regular ordered arrangement of cellules enables the polymerization of amino acids and nucleobase-containing molecules in the same direction (named "lithotranslation") preventing the shift of coding. A table of the presumed "LithoCode" (possible and optimal lithocodon assignments for abiogenically synthesized α-amino acids involved in lithocoding and lithotranslation) is proposed. The magmatic nature of the mineral, abiogenic synthesis of organic molecules and polymerization events are considered within the framework of the proposed "volcanic scenario".

  3. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  4. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  5. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  6. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  7. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  8. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....

  9. Contributions of dopamine-related genes and environmental factors to highly sensitive personality: a multi-step neuronal system-level approach.

    Directory of Open Access Journals (Sweden)

    Chunhui Chen

Full Text Available Traditional behavioral genetic studies (e.g., twin, adoption studies) have shown that human personality has moderate to high heritability, but recent molecular behavioral genetic studies have failed to identify quantitative trait loci (QTL) with consistent effects. The current study adopted a multi-step approach (ANOVA followed by multiple regression and permutation) to assess the cumulative effects of multiple QTLs. Using a system-level (dopamine system) genetic approach, we investigated a personality trait deeply rooted in the nervous system (the Highly Sensitive Personality, HSP). 480 healthy Chinese college students were given the HSP scale and genotyped for 98 representative polymorphisms in all major dopamine neurotransmitter genes. In addition, two environment factors (stressful life events and parental warmth) that have been implicated for their contributions to personality development were included to investigate their relative contributions as compared to genetic factors. In Step 1, using ANOVA, we identified 10 polymorphisms that made statistically significant contributions to HSP. In Step 2, these polymorphisms' main effects and interactions were assessed using multiple regression. This model accounted for 15% of the variance of HSP (p<0.001). Recent stressful life events accounted for an additional 2% of the variance. Finally, permutation analyses ascertained the probability of obtaining these findings by chance to be very low, p ranging from 0.001 to 0.006. Dividing these loci by the subsystems of dopamine synthesis, degradation/transport, receptor and modulation, we found that the modulation and receptor subsystems made the most significant contribution to HSP. The results of this study demonstrate the utility of a multi-step neuronal system-level approach in assessing genetic contributions to individual differences in human behavior. It can potentially bridge the gap between the high heritability estimates based on traditional
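The permutation step in this multi-step approach can be illustrated with a toy version (here a single-predictor correlation stands in for the full multiple-regression model; all data are simulated):

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_pvalue(x, y, n_perm=999):
    """Permutation test on |correlation|: shuffle x to break any association,
    and count how often the shuffled statistic matches or beats the observed one."""
    observed = abs(np.corrcoef(x, y)[0, 1])
    count = sum(
        abs(np.corrcoef(rng.permutation(x), y)[0, 1]) >= observed
        for _ in range(n_perm)
    )
    return (count + 1) / (n_perm + 1)

# Strongly associated toy data should give a small p-value:
x = np.arange(50, dtype=float)
y = x + rng.normal(scale=1.0, size=50)
p = permutation_pvalue(x, y)
assert p <= 0.05
```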

  10. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  11. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  12. DAEDALUS: System-Level Design Methodology for Streaming Multiprocessor Embedded Systems on Chips

    NARCIS (Netherlands)

    Stefanov, T.; Pimentel, A.; Nikolov, H.; Ha, S.; Teich, J.

    2017-01-01

    The complexity of modern embedded systems, which are increasingly based on heterogeneous multiprocessor system-on-chip (MPSoC) architectures, has led to the emergence of system-level design. To cope with this design complexity, system-level design aims at raising the abstraction level of the design

  13. Power monitors: A framework for system-level power estimation using heterogeneous power models

    NARCIS (Netherlands)

    Bansal, N.; Lahiri, K.; Raghunathan, A.; Chakradhar, S.T.

    2005-01-01

Power analysis early in the design cycle is critical for the design of low-power systems. With the move to system-level specifications and design methodologies, there has been significant research interest in system-level power estimation. However, as demonstrated in this paper, the addition of

  14. NASA: A generic infrastructure for system-level MP-SoC design space exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Pimentel, A.D.; Thompson, M.; Bautista, T.; Núñez, A.

    2010-01-01

    System-level simulation and design space exploration (DSE) are key ingredients for the design of multiprocessor system-on-chip (MP-SoC) based embedded systems. The efforts in this area, however, typically use ad-hoc software infrastructures to facilitate and support the system-level DSE experiments.

  15. Exploiting Domain Knowledge in System-level MPSoC Design Space Exploration

    NARCIS (Netherlands)

    Thompson, M.; Pimentel, A.D.

    2013-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded multimedia systems. During system-level DSE, system parameters like, e.g., the number and type of processors, and the mapping of

  16. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    Science.gov (United States)

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  17. Design space pruning through hybrid analysis in system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system archi- tectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size

  18. Interleaving methods for hybrid system-level MPSoC design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.; McAllister, J.; Bhattacharyya, S.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of

  19. Pruning techniques for multi-objective system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.

    2014-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of

  20. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  1. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64 bit on Mac, Linux and Windows.

  2. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low-density parity-check (LDPC) codes. The term "low density" arises from a property of the parity-check matrix defining the code. We will now define this matrix and the role that it plays in decoding. 2. Linear Codes.
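
    As a minimal illustration of the matrix the excerpt refers to, the sketch below (my own, using the small [7,4] Hamming code rather than a genuinely low-density matrix) shows how a parity-check matrix H both detects and, for this code, locates a single bit error:

```python
# For a binary linear code, a word c is a codeword iff H*c = 0 (mod 2).
# H is the parity-check matrix of the [7,4] Hamming code; an LDPC matrix
# plays the same role but is much larger and sparse.

H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    """Compute H * word mod 2; all zeros means the word is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

codeword = [1, 0, 1, 1, 0, 1, 0]       # a valid Hamming(7,4) codeword
assert syndrome(codeword) == [0, 0, 0]

received = codeword[:]
received[4] ^= 1                        # channel flips one bit
s = syndrome(received)
# For this H, the syndrome read as a binary number names the error position.
error_pos = s[0] + 2 * s[1] + 4 * s[2]  # 1-indexed bit position
assert error_pos == 5
```

    LDPC decoders do not use this table-lookup trick; they exploit the sparsity of H with iterative message passing, but the syndrome test itself is the same.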

  3. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried, and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  4. Preimplantation genetic screening.

    Science.gov (United States)

    Harper, Joyce C

    2018-03-01

    Preimplantation genetic diagnosis was first successfully performed in 1989 as an alternative to prenatal diagnosis for couples at risk of transmitting a genetic or chromosomal abnormality, such as cystic fibrosis, to their child. From embryos generated in vitro, biopsied cells are genetically tested. From the mid-1990s, this technology has been employed as an embryo selection tool for patients undergoing in vitro fertilisation, screening as many chromosomes as possible, in the hope that selecting chromosomally normal embryos will lead to higher implantation and decreased miscarriage rates. This procedure, preimplantation genetic screening, was initially performed using fluorescent in situ hybridisation, but 11 randomised controlled trials of screening using this technique showed no improvement in in vitro fertilisation delivery rates. Progress in genetic testing has led to the introduction of array comparative genomic hybridisation, quantitative polymerase chain reaction, and next generation sequencing for preimplantation genetic screening, and three small randomised controlled trials of preimplantation genetic screening using these new techniques indicate a modest benefit. Other trials are still in progress but, regardless of their results, preimplantation genetic screening is now being offered globally. In the near future, it is likely that sequencing will be used to screen the full genetic code of the embryo.

  5. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  6. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  7. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  8. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...
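
    The re-interpretation of Reed-Solomon codes mentioned above, in which codewords are the evaluations of low-degree polynomials at points of a finite field, can be sketched in a few lines (an illustrative toy over GF(7), not taken from the monograph; algebraic-geometric codes generalize the idea from points on a line to points on a curve):

```python
p = 7   # prime field GF(7)
k = 3   # message length = degree bound of the message polynomial

def encode(message):
    """Evaluate the polynomial with coefficients `message` at x = 0..p-1."""
    return [sum(m * pow(x, i, p) for i, m in enumerate(message)) % p
            for x in range(p)]

codeword = encode([2, 5, 1])   # the polynomial 2 + 5x + x^2 over GF(7)
# Two distinct polynomials of degree < k agree on at most k-1 points, so
# distinct messages yield codewords differing in at least p-(k-1) positions.
```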

  9. ASPECT (Automated System-level Performance Evaluation and Characterization Tool), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI has developed a suite of SAA tools and an analysis capability referred to as ASPECT (Automated System-level Performance Evaluation and Characterization Tool)....

  10. Systems-level organization of non-alcoholic fatty liver disease progression network

    Directory of Open Access Journals (Sweden)

    K. Shubham

    2017-10-01

    the coordination of metabolism and inflammation in NAFLD patients. We found that genes of arachidonic acid, sphingolipid and glycosphingolipid metabolism were upregulated and co-expressed with genes of proinflammatory signaling pathways and hypoxia in NASH/NASH with fibrosis. These metabolic alterations might play a role in sustaining VAT inflammation. Further, the inflammation related genes were also co-expressed with genes involved in the ECM degradation. We interlink these cellular processes to obtain a systems-level understanding of NAFLD.

  11. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
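
    The basic concepts can be conveyed with a toy example (my own sketch, not the software tool the abstract describes): bit strings are evolved toward the all-ones optimum by fitness-based selection, crossover, and mutation.

```python
import random

random.seed(0)
BITS, POP, GENS = 8, 20, 60

def fitness(ind):
    # "one-max" problem: fitness is simply the number of 1 bits
    return sum(ind)

def select(pop):
    # tournament selection: the fitter of two random individuals survives
    a, b = random.sample(pop, 2)
    return max(a, b, key=fitness)

def crossover(a, b):
    # single-point crossover
    cut = random.randint(1, BITS - 1)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.05):
    # flip each bit independently with small probability
    return [bit ^ (random.random() < rate) for bit in ind]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

best = max(pop, key=fitness)
```

    After a few dozen generations the best individual is at or near the all-ones string; real applications differ only in the encoding and the fitness function.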

  12. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  13. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  14. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of SEVERO code. This computer code is related to the statistics of extremes = extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  15. Genetic Mapping

    Science.gov (United States)

    ... greatly advanced genetics research. The improved quality of genetic data has reduced the time required to identify a ... cases, a matter of months or even weeks. Genetic mapping data generated by the HGP's laboratories is freely accessible ...

  16. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  17. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
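
    The least-squares combination of related measurements can be illustrated with the simplest case, inverse-variance weighting of independent measurements of one quantity (a hand-rolled sketch, far simpler than FERRET's generalized treatment of correlations):

```python
# Combine independent measurements (value, sigma) of the same quantity by
# weighted least squares, returning the estimate and its uncertainty.
def combine(values_and_sigmas):
    weights = [1.0 / s ** 2 for _, s in values_and_sigmas]
    mean = sum(w * v for (v, _), w in zip(values_and_sigmas, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5   # uncertainty of the combined value
    return mean, sigma

mean, sigma = combine([(10.0, 1.0), (12.0, 2.0)])
# mean = (10/1 + 12/4) / (1 + 1/4) = 10.4; the precise measurement dominates
```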

  18. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, the Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make QR codes more visually pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  19. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response codes open the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, which is followed by comprehensive research on both QR codes themselves and related issues. From the research, a solution taking advantage of cloud and cryptography together with an implementation come af...

  20. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  1. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  2. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  3. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  4. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  5. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  6. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  7. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data are retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys each list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically); 2. with the codes listed alphabetically; and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage memory requirements; and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring book which can be updated by an organised (updating) service. (author)
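
    The code-key idea, storing short codes and translating them back to plain language on retrieval, amounts to a pair of lookup tables (the codes below are invented for illustration and are not NAGRADATA's actual key):

```python
# code -> definition, as in the thematic key (entries are hypothetical)
CODE_KEY = {
    "LST": "limestone",
    "SST": "sandstone",
    "GRN": "granite",
}
DECODE = CODE_KEY                               # decoding key
ENCODE = {v: k for k, v in CODE_KEY.items()}    # reverse key for input coding

stored = [ENCODE["granite"], ENCODE["limestone"]]   # data stored as codes
retrieved = [DECODE[c] for c in stored]             # automatic translation
```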

  8. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  9. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  10. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
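
    The linking pattern described above (write an input file, run the external code, read its outputs back) can be sketched as follows. This is a hypothetical stand-alone sketch in Python, not the DLL itself; `sort -n` is just a stand-in for the external application and assumes a POSIX system:

```python
import pathlib
import subprocess
import tempfile

def run_external(inputs, command=("sort", "-n")):
    """Write one value per line, run `command` on the file, parse its output."""
    workdir = pathlib.Path(tempfile.mkdtemp())
    infile = workdir / "inputs.txt"
    infile.write_text("".join(f"{v}\n" for v in inputs))   # create input file
    result = subprocess.run(list(command) + [str(infile)], # run external code
                            capture_output=True, text=True, check=True)
    return [float(line) for line in result.stdout.split()] # read outputs back

# run_external([3.0, 1.0, 2.0]) returns the values sorted numerically
```

    The real component additionally reads an instructions file telling it how to format the input file and which output files to parse.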

  11. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  12. System-Level Design Methodologies for Networked Multiprocessor Systems-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir

    2008-01-01

    is the first such attempt in the published literature. The second part of the thesis deals with the issues related to the development of system-level design methodologies for networked multiprocessor systems-on-chip at various levels of design abstraction with special focus on the modeling and design...... at the system-level. The multiprocessor modeling framework is then extended to include models of networked multiprocessor systems-on-chip which is then employed to model wireless sensor networks both at the sensor node level as well as the wireless network level. In the third and the final part, the thesis...... to the transaction-level model. The thesis, as a whole makes contributions by describing a design methodology for networked multiprocessor embedded systems at three layers of abstraction from system-level through transaction-level to the cycle accurate level as well as demonstrating it practically by implementing...

  13. System-Level Modelling and Simulation of MEMS-Based Sensors

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Madsen, Jan; Shafique, Mohammad

    2005-01-01

    The growing complexity of MEMS devices and their increased use in embedded systems (e.g., wireless integrated sensor networks) demands a disciplined approach to MEMS design as well as the development of techniques for system-level modeling of these devices, so that a seamless integration with the existing embedded system design methodologies is possible. In this paper, we present a MEMS design methodology that uses a VHDL-AMS based system-level model of a MEMS device as a starting point and combines the top-down and bottom-up design approaches for design, verification, and optimization...

  14. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric quantum codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  15. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  16. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination with code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation from the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation

  17. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  18. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  19. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  20. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  1. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  2. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes. QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  3. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  4. System-Level Sensitivity Analysis of SiNW-bioFET-Based Biosensing Using Lockin Amplification

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Kjærgaard, Claus

    2017-01-01

    carry out for the first time the system-level sensitivity analysis of a generic SiNW-bioFET model coupled to a custom-design instrument based on the lock-in amplifier. By investigating a large parametric space spanning over both sensor and instrumentation specifications, we demonstrate that systemwide...

  5. The Artemis workbench for system-level performance evaluation of embedded systems

    NARCIS (Netherlands)

    Pimentel, A.D.

    2008-01-01

    In this paper, we present an overview of the Artemis workbench, which provides modelling and simulation methods and tools for efficient performance evaluation and exploration of heterogeneous embedded multimedia systems. More specifically, we describe the Artemis system-level modelling methodology,

  6. A system-level modelling perspective of the KwaZulu-Natal Bight ...

    African Journals Online (AJOL)

    Requirements to take the hydrodynamic, biogeochemical and first ecosystem modelling efforts towards a meaningful predictive capability are discussed. The importance of adopting a system-level view of the bight and its connected systems for realistic exploration of global change scenarios is highlighted. Keywords: ...

  7. System-level modelling of dynamic reconfigurable designs using functional programming abstractions

    NARCIS (Netherlands)

    Uchevler, B.N.; Svarstad, Kjetil; Kuper, Jan; Baaij, C.P.R.

    With the increasing size and complexity of designs in electronics, new approaches are required for the description and verification of digital circuits, specifically at the system level. Functional HDLs can appear as an advantageous choice for formal verification and high-level descriptions. In this

  8. A Probabilistic Approach for the System-Level Design of Multi-ASIP Platforms

    DEFF Research Database (Denmark)

    Micconi, Laura

    introduce a system-level Design Space Exploration (DSE) for the very early phases of the design that automatizes part of the multi-ASIP design flow. Our DSE is responsible for assigning the tasks to the different ASIPs exploring different platform alternatives. We perform a schedulability analysis for each...

  9. System-Level Design of an Integrated Receiver Front End for a Wireless Ultrasound Probe

    DEFF Research Database (Denmark)

    di Ianni, Tommaso; Hemmsen, Martin Christian; Llimos Muntal, Pere

    2016-01-01

    In this paper, a system-level design is presented for an integrated receive circuit for a wireless ultrasound probe, which includes analog front ends and beamformation modules. This paper focuses on the investigation of the effects of architectural design choices on the image quality. The point...

  10. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...

  11. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  12. Crucial steps to life: From chemical reactions to code using agents.

    Science.gov (United States)

    Witzany, Guenther

    2016-02-01

    The concepts of the origin of the genetic code and the definitions of life changed dramatically after the RNA world hypothesis. Main narratives in molecular biology and genetics such as the "central dogma," "one gene one protein" and "non-coding DNA is junk" have meanwhile been falsified. RNA moved from transitional intermediate molecule to centre stage. Additionally, the abundance of empirical data concerning non-random genetic change operators, such as the variety of mobile genetic elements, persistent viruses and defectives, does not fit with the dominant narrative of replication errors (mutations) as the main driving force creating genetic novelty and diversity. The reductionistic and mechanistic views on the physico-chemical properties of the genetic code are no longer convincing as appropriate descriptions of the abundance of non-random genetic content operators which are active in natural genetic engineering and natural genome editing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  14. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  15. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.

  16. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  18. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  19. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  20. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Author Affiliations: Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  1. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  2. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes require validation against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
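The quantitative ranking the abstract argues for can be sketched numerically: score each code by an aggregate deviation of its predictions from experiment, then sort. A minimal illustration; the RMS-relative-error metric and all numbers below are hypothetical, not taken from the record:

```python
import math

# Hypothetical measured values and predictions from three competing codes.
measured = [10.2, 11.8, 13.1, 14.9]
predictions = {
    "code_A": [10.0, 12.0, 13.0, 15.2],
    "code_B": [9.5, 12.5, 14.0, 14.0],
    "code_C": [10.1, 11.9, 13.2, 14.8],
}

def rms_relative_error(pred, meas):
    """Root-mean-square relative deviation of prediction from experiment."""
    return math.sqrt(sum(((p - m) / m) ** 2 for p, m in zip(pred, meas)) / len(meas))

# Rank the codes in order of merit: smallest aggregate deviation first.
ranking = sorted(predictions,
                 key=lambda name: rms_relative_error(predictions[name], measured))
print(ranking)  # code_C agrees best with the measurements here
```

Any scalar error metric works in this role; the point is simply that a single number per code turns qualitative "agreement is within 10%" statements into an explicit order of merit.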

  3. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully certified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  4. Genetic modification and genetic determinism

    Science.gov (United States)

    Resnik, David B; Vorhaus, Daniel B

    2006-01-01

    In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions. PMID:16800884

  5. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
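Independent of the FPGA realisation described above, the core Huffman idea (assigning shorter codewords to more frequent symbols so the encoded stream shrinks) can be sketched in a few lines of Python. This is a generic illustration of Huffman coding, not the actual AAC codebooks:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code: frequent symbols get shorter codewords."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak index, {symbol: codeword-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, i, c2 = heapq.heappop(heap)
        # Merge the two lightest subtrees, prefixing their codewords.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, i, merged))
    return heap[0][2]

code = huffman_code("abracadabra")
encoded = "".join(code[s] for s in "abracadabra")
# 'a' (5 of 11 symbols) gets the shortest codeword, so the stream is
# shorter than the 33 bits a fixed 3-bit-per-symbol code would need.
```

The standard's predefined codebooks replace the frequency-counting step, but the encoding loop itself is this same table lookup and bit concatenation.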

  6. System Level Design of a Continuous-Time Delta-Sigma Modulator for Portable Ultrasound Scanners

    DEFF Research Database (Denmark)

    Llimos Muntal, Pere; Færch, Kjartan; Jørgensen, Ivan Harald Holger

    2015-01-01

    In this paper the system level design of a continuous-time ∆Σ modulator for portable ultrasound scanners is presented. The overall required signal-to-noise ratio (SNR) is derived to be 42 dB and the sampling frequency used is 320 MHz for an oversampling ratio of 16. In order to match these requirements, the performance of the ∆Σ modulator versus various block performance parameters is presented as trade-off curves, based on high-level VerilogA simulations. Based on these results, the block specifications are derived.

  7. System-Level Optimization of a DAC for Hearing-Aid Audio Class D Output Stage

    DEFF Research Database (Denmark)

    Pracný, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2013-01-01

    This paper deals with system-level optimization of a digital-to-analog converter (DAC) for a hearing-aid audio Class D output stage. We discuss the ΣΔ modulator system-level design parameters – the order, the oversampling ratio (OSR) and the number of bits in the quantizer. We show that combining a reduction of the OSR with an increase of the order results in considerable power savings while the audio quality is kept. For further savings in the ΣΔ modulator, overdesign and subsequent coarse coefficient quantization are used. A figure of merit (FOM) is introduced to confirm this optimization approach. The proposed optimization has impact on the whole hearing-aid audio back-end system, including less hardware in the interpolation filter and half the switching rate in the digital-pulse-width-modulation (DPWM) block and Class D output stage.

  8. Next Generation Civil Transport Aircraft Design Considerations for Improving Vehicle and System-Level Efficiency

    Science.gov (United States)

    Acosta, Diana M.; Guynn, Mark D.; Wahls, Richard A.; DelRosario, Ruben

    2013-01-01

    The future of aviation will benefit from research in aircraft design and air transportation management aimed at improving efficiency and reducing environmental impacts. This paper presents civil transport aircraft design trends and opportunities for improving vehicle and system-level efficiency. Aircraft design concepts and the emerging technologies critical to reducing thrust specific fuel consumption, reducing weight, and increasing lift to drag ratio currently being developed by NASA are discussed. Advancements in the air transportation system aimed towards system-level efficiency are discussed as well. Finally, the paper describes the relationship between the air transportation system, aircraft, and efficiency. This relationship is characterized by operational constraints imposed by the air transportation system that influence aircraft design, and operational capabilities inherent to an aircraft design that impact the air transportation system.

  9. A system level boundary scan controller board for VME applications [to CERN experiments

    CERN Document Server

    Cardoso, N; Da Silva, J C

    2000-01-01

    This work is the result of a collaboration between INESC and LIP in the CMS experiment being conducted at CERN. The collaboration addresses the application of boundary scan test at system level, namely the development of a VME boundary scan controller (BSC) board prototype and the corresponding software. This prototype uses the MTM bus existing in the VME64* backplane to apply the 1149.1 test vectors to a system composed of nineteen boards, called here units under test (UUTs). A top-down approach is used to describe our work. The paper begins with some insights about the experiment being conducted at CERN, proceeds with system-level considerations concerning our work, and gives some details about the BSC board. The results obtained so far and the proposed work are reviewed at the end of this contribution. (11 refs).

  10. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
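The two-part STRN structure described above (a report code identifying the issuer, program, or document type, followed by a sequential number) can be illustrated with a toy splitter. The hyphen-separator convention and the example number below are assumptions for illustration only, not taken from the standard:

```python
def split_strn(strn):
    """Split a Standard Technical Report Number into its two parts:
    the report code (issuing organization, program, or document type)
    and the sequential number assigned by the issuing entity.

    Assumption: the sequential number is the trailing group after the
    final hyphen; real STRNs may use other separator conventions.
    """
    code, _, number = strn.rpartition("-")
    return code, number

# Hypothetical report number in STRN style:
print(split_strn("ORNL/TM-9876"))  # ('ORNL/TM', '9876')
```

This mirrors how the compilation is organized: Part I indexes on the first component (the report code), while the second component is deliberately omitted from the publication.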

  11. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  12. System-level perturbations of cell metabolism using CRISPR/Cas9

    DEFF Research Database (Denmark)

    Jakociunas, Tadas; Jensen, Michael Krogh; Keasling, Jay

    2017-01-01

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies more advanced and cost-effective. For metabolic engineering purposes, the CRISPR-based tools have been applied...... previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering....

  13. Competition, liquidity and stability: international evidence at the bank and systemic levels

    OpenAIRE

    Nguyen, Thi Ngoc My

    2017-01-01

    This thesis investigates the impact of market power on bank liquidity; the association between competition and systemic liquidity; and whether the associations between liquidity and stability at both bank- and systemic- levels are affected by competition. The first research question is explored in the context of 101 countries over 1996-2013 while the second and the third, which require listed banks, use a smaller sample of 32 nations during 2001-2013. The Panel Least Squares and the system Ge...

  14. System-level modeling for economic evaluation of geological CO2 storage in gas reservoirs

    International Nuclear Information System (INIS)

    Zhang, Yingqi; Oldenburg, Curtis M.; Finsterle, Stefan; Bodvarsson, Gudmundur S.

    2007-01-01

    One way to reduce the effects of anthropogenic greenhouse gases on climate is to inject carbon dioxide (CO2) from industrial sources into deep geological formations such as brine aquifers or depleted oil or gas reservoirs. Research is being conducted to improve understanding of factors affecting particular aspects of geological CO2 storage (such as storage performance, storage capacity, and health, safety and environmental (HSE) issues) as well as to lower the cost of CO2 capture and related processes. However, there has been less emphasis to date on system-level analyses of geological CO2 storage that consider geological, economic, and environmental issues by linking detailed process models to representations of engineering components and associated economic models. The objective of this study is to develop a system-level model for geological CO2 storage, including CO2 capture and separation, compression, pipeline transportation to the storage site, and CO2 injection. Within our system model we are incorporating detailed reservoir simulations of CO2 injection into a gas reservoir and related enhanced production of methane. Potential leakage and associated environmental impacts are also considered. The platform for the system-level model is GoldSim [GoldSim User's Guide. GoldSim Technology Group; 2006, http://www.goldsim.com]. The application of the system model focuses on evaluating the feasibility of carbon sequestration with enhanced gas recovery (CSEGR) in the Rio Vista region of California. The reservoir simulations are performed using a special module of the TOUGH2 simulator, EOS7C, for multicomponent gas mixtures of methane and CO2. Using a system-level modeling approach, the economic benefits of enhanced gas recovery can be directly weighed against the costs and benefits of CO2 injection.

  15. From Genetics to Genetic Algorithms

    Indian Academy of Sciences (India)

    Genetic algorithms (GAs) are computational optimisation schemes with an ... The algorithms solve optimisation problems ..... Genetic Algorithms in Search, Optimisation and Machine. Learning, Addison-Wesley Publishing Company, Inc. 1989.

  16. From Genetics to Genetic Algorithms

    Indian Academy of Sciences (India)

    artificial genetic system) string feature or ... called the genotype whereas it is called a structure in artificial genetic ... assigned a fitness value based on the cost function. Better ..... way it has produced complex, intelligent living organisms capable of ...
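The two snippets above describe genetic algorithms only in fragments: genotype strings, fitness values assigned from a cost function, and selection of better structures. A minimal self-contained sketch of the idea, using the toy "one-max" fitness function (count the 1-bits), with all parameters chosen purely for illustration:

```python
import random

random.seed(1)

def fitness(genotype):
    """Toy fitness: number of 1-bits (the classic 'one-max' problem)."""
    return sum(genotype)

def evolve(length=20, pop_size=30, generations=60, mutation_rate=0.02):
    # Initial population of random bit-string genotypes.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half as parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        # Crossover and mutation refill the population.
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)       # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Real applications differ only in the genotype encoding and the fitness function; the selection-crossover-mutation loop is the same.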

  17. System-level energy efficiency is the greatest barrier to development of the hydrogen economy

    International Nuclear Information System (INIS)

    Page, Shannon; Krumdieck, Susan

    2009-01-01

    Current energy research investment policy in New Zealand is based on assumed benefits of transitioning to hydrogen as a transport fuel and as storage for electricity from renewable resources. The hydrogen economy concept, as set out in recent commissioned research investment policy advice documents, includes a range of hydrogen energy supply and consumption chains for transport and residential energy services. The benefits of research and development investments in these advice documents were not fully analyzed by cost or by improvements in energy efficiency or greenhouse gas emissions reduction. This paper sets out a straightforward method to quantify the system-level efficiency of these energy chains. The method was applied to transportation and stationary heat and power, with hydrogen generated from wind energy, natural gas and coal. The system-level efficiencies for the hydrogen chains were compared to direct use of conventionally generated electricity, and to internal combustion engines operating on gas- or coal-derived fuel. The hydrogen energy chains were shown to provide little or no system-level efficiency improvement over conventional technology. The current research investment policy is aimed at enabling a hydrogen economy without considering the dramatic loss of efficiency that would result from using this energy carrier.
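The system-level efficiency calculation underlying such comparisons is simply the product of the stage efficiencies along each energy chain. A sketch with purely illustrative stage figures, not the paper's numbers:

```python
def chain_efficiency(stages):
    """System-level efficiency of an energy chain: the product of the
    efficiencies of its successive conversion stages."""
    eff = 1.0
    for stage_eff in stages:
        eff *= stage_eff
    return eff

# Hypothetical hydrogen-for-transport chain: electrolysis, compression,
# transport/storage, fuel cell (all figures illustrative).
hydrogen_chain = chain_efficiency([0.70, 0.90, 0.95, 0.50])  # about 0.30
# Direct use of the same electricity via a battery-electric chain:
battery_chain = chain_efficiency([0.95, 0.90])               # about 0.86
```

Because every extra conversion stage multiplies in another factor below one, a chain that detours through hydrogen production and reconversion starts at a structural disadvantage regardless of how each individual stage improves.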

  18. Integrating Omics Technologies to Study Pulmonary Physiology and Pathology at the Systems Level

    Directory of Open Access Journals (Sweden)

    Ravi Ramesh Pathak

    2014-04-01

    Assimilation and integration of “omics” technologies, including genomics, epigenomics, proteomics, and metabolomics, has readily altered the landscape of medical research in the last decade. The vast and complex nature of omics data can only be interpreted by linking molecular information at the organismic level, forming the foundation of systems biology. Research in pulmonary biology/medicine has necessitated integration of omics, network, systems and computational biology data to differentially diagnose, interpret, and prognosticate pulmonary diseases, facilitating improvement in therapy and treatment modalities. This review describes how to leverage this emerging technology in understanding pulmonary diseases at the systems level, called a “systomic” approach. Considering the operational wholeness of cellular and organ systems, the diseased genome, proteome, and metabolome need to be conceptualized at the systems level to understand disease pathogenesis and progression. Currently available omics technology and resources require a certain degree of training and proficiency in addition to dedicated hardware and applications, making them relatively less user friendly for the pulmonary biologist and clinicians. Herein, we discuss the various strategies, computational tools and approaches required to study pulmonary diseases at the systems level for biomedical scientists and clinical researchers.

  19. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    Science.gov (United States)

    Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.

  20. Optimal maintenance policy incorporating system level and unit level for mechanical systems

    Science.gov (United States)

    Duan, Chaoqun; Deng, Chao; Wang, Bingran

    2018-04-01

    The study works on a multi-level maintenance policy combining system level and unit level under soft and hard failure modes. The system experiences system-level preventive maintenance (SLPM) when the conditional reliability of the entire system exceeds the SLPM threshold, and each single unit undergoes a two-level maintenance: one level is initiated when the unit exceeds its preventive maintenance (PM) threshold, and the other is performed simultaneously whenever any other unit is undergoing maintenance. The units experience both periodic inspections and aperiodic inspections triggered by failures of hard-type units. To model practical situations, two types of economic dependence have been taken into account: set-up cost dependence and maintenance expertise dependence, since the same technology and tools/equipment can be utilised. The optimisation problem is formulated and solved in a semi-Markov decision process framework. The objective is to find the optimal system-level threshold and unit-level thresholds by minimising the long-run expected average cost per unit time. A formula for the mean residual life is derived for the proposed multi-level maintenance policy. The method is illustrated by a real case study of the feed subsystem of a boring machine, and a comparison with other policies demonstrates the effectiveness of our approach.
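
    The unit-level threshold logic above can be illustrated with a toy condition-based maintenance simulation. The degradation process, cost figures, and candidate thresholds below are invented for illustration and are far simpler than the paper's semi-Markov formulation; the point is only that an intermediate PM threshold minimizes the long-run average cost.

    ```python
    import random

    # Toy condition-based maintenance model: a unit degrades by a random step
    # each period; preventive maintenance (PM) at a threshold is cheaper than
    # corrective maintenance (CM) after a hard failure.
    def average_cost(pm_threshold, fail_level=10.0, pm_cost=1.0,
                     cm_cost=5.0, periods=100_000, seed=1):
        rng = random.Random(seed)
        level, total = 0.0, 0.0
        for _ in range(periods):
            level += rng.uniform(0.0, 1.0)      # stochastic degradation step
            if level >= fail_level:             # hard failure -> corrective
                total += cm_cost
                level = 0.0
            elif level >= pm_threshold:         # inspection trips PM
                total += pm_cost
                level = 0.0
        return total / periods

    # Sweep candidate thresholds to find the cost-minimising PM trigger:
    # too low wastes PM actions, too close to failure risks costly CM.
    best = min((average_cost(t), t) for t in [4, 6, 8, 9, 9.5])
    print(best)
    ```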

  1. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially, the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  2. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  3. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with the neutron transport equation, covering one-dimensional plane geometry problems, one-dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers the problems which can be solved: eigenvalue problems, the outer iteration loop, the inner iteration loop, and finite-difference solution procedures. The input and output data for ANISN are also discussed. Two-dimensional problems, such as those handled by the DOT code, are then presented. Finally, an overview of Monte-Carlo methods and codes is given

  4. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
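
    The "stabilized linear inverse theory" mentioned above is, in its simplest form, Tikhonov-regularized least squares: solve min ||Gm - d||^2 + a^2 ||m||^2 so that noise in the gravity data does not blow up the recovered model. The sketch below is synthetic; the forward operator, data, and regularization weight are placeholders, not an actual gravity kernel or NTS data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.normal(size=(20, 10))        # placeholder forward operator
    m_true = rng.normal(size=10)         # "true" subsurface model
    d = G @ m_true + 0.01 * rng.normal(size=20)   # noisy Bouguer-like data

    # Stabilized (Tikhonov) solution of the normal equations:
    # (G^T G + a^2 I) m = G^T d
    alpha = 0.1
    m_est = np.linalg.solve(G.T @ G + alpha**2 * np.eye(10), G.T @ d)
    print("recovery error:", np.linalg.norm(m_est - m_true))
    ```

    In the actual code the TREND/INVERT pair iterates a step like this, updating the linearized forward operator around the current topography estimate until convergence.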

  5. Journal of Genetics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics. Amit Katiyar. Articles written in Journal of Genetics. Volume 92 Issue 3 December 2013 pp 363-368 Research Article. Expression profile of genes coding for carotenoid biosynthetic pathway during ripening and their association with accumulation of lycopene in tomato fruits.

  6. About Genetic Counselors

    Science.gov (United States)

    ... clinical care in many areas of medicine. Assisted Reproductive Technology/Infertility Genetics Cancer Genetics Cardiovascular Genetics Cystic Fibrosis Genetics Fetal Intervention and Therapy Genetics Hematology Genetics Metabolic Genetics ...

  7. On fuzzy semantic similarity measure for DNA coding.

    Science.gov (United States)

    Ahmad, Muneer; Jung, Low Tang; Bhuiyan, Md Al-Amin

    2016-02-01

    A coding measure scheme numerically translates the DNA sequence to a time domain signal for protein coding regions identification. A number of coding measure schemes based on numerology, geometry, fixed mapping, statistical characteristics and chemical attributes of nucleotides have been proposed in recent decades. Such coding measure schemes lack the biologically meaningful aspects of nucleotide data and hence do not significantly discriminate coding regions from non-coding regions. This paper presents a novel fuzzy semantic similarity measure (FSSM) coding scheme centering on FSSM codons' clustering and the genetic code context of nucleotides. Certain natural characteristics of nucleotides, i.e. appearance as a unique combination of triplets, preserving special structure and occurrence, and the ability to own and share density distributions in codons, have been exploited in FSSM. The nucleotides' fuzzy behaviors, semantic similarities and defuzzification based on the center of gravity of nucleotides revealed a strong correlation between nucleotides in codons. The proposed FSSM coding scheme attains a significant enhancement in coding regions identification, i.e. 36-133%, as compared to other existing coding measure schemes tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.
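
    The center-of-gravity defuzzification step mentioned above is a standard fuzzy-logic operation: the crisp output is the centroid of the membership function, z* = sum(mu(z)*z) / sum(mu(z)). The sketch below is generic, with an illustrative triangular membership function rather than the paper's nucleotide memberships.

    ```python
    import numpy as np

    def centroid_defuzzify(z, mu):
        """Crisp output z* = sum(mu*z) / sum(mu) over a sampled domain."""
        z = np.asarray(z, dtype=float)
        mu = np.asarray(mu, dtype=float)
        return float(np.sum(mu * z) / np.sum(mu))

    # Illustrative triangular membership function peaking at 0.5.
    z = np.linspace(0.0, 1.0, 101)
    mu = np.maximum(0.0, 1.0 - 2.0 * np.abs(z - 0.5))

    print(round(centroid_defuzzify(z, mu), 6))  # symmetric set -> 0.5
    ```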

  8. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  9. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  10. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  11. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  12. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal-axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  13. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
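
    The levelization property described above, packages forming a directed acyclic graph in which each package uses only lower-level packages, can be checked mechanically: a package's level is one more than the maximum level of the packages it uses, and any cycle means the set cannot be levelized. A small sketch with hypothetical package names:

    ```python
    def levelize(uses):
        """uses: dict mapping package -> set of packages it depends on.
        Returns dict package -> level, or raises ValueError on a cycle."""
        levels = {}

        def level_of(pkg, stack=()):
            if pkg in stack:
                raise ValueError(f"dependency cycle through {pkg}")
            if pkg not in levels:
                deps = uses.get(pkg, set())
                # Level = 1 + max level of dependencies (1 for leaf packages).
                levels[pkg] = 1 + max(
                    (level_of(d, stack + (pkg,)) for d in deps), default=0)
            return levels[pkg]

        for pkg in uses:
            level_of(pkg)
        return levels

    deps = {"app": {"physics", "io"}, "physics": {"mesh"},
            "io": {"mesh"}, "mesh": set()}
    print(levelize(deps))  # mesh=1, physics=io=2, app=3
    ```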

  14. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  15. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  16. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  17. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  18. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  19. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  20. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  1. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  2. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  3. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P_1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P_N scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k_eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  4. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, needs standards. These standards are widely used and the methods for applying them are well established, so radiographic testing is practical only when based on the documented regulations. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, level-one and basic radiographers carry out radiography work based on instructions given by level-two or level-three radiographers. These instructions are produced based on the guidelines mentioned in the documents, and the level-two radiographer must follow the specifications in the standard when writing the instructions. From this scenario, it is clear that radiography is a type of work in which everything must follow the rules. For the code, radiography follows the code of the American Society of Mechanical Engineers (ASME); the only such code in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography must follow the regulated rules and standards.

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  6. The effect of decentralized behavioral decision making on system-level risk.

    Science.gov (United States)

    Kaivanto, Kim

    2014-12-01

    Certain classes of system-level risk depend partly on decentralized lay decision making. For instance, an organization's network security risk depends partly on its employees' responses to phishing attacks. On a larger scale, the risk within a financial system depends partly on households' responses to mortgage sales pitches. Behavioral economics shows that lay decisionmakers typically depart in systematic ways from the normative rationality of expected utility (EU), and instead display heuristics and biases as captured in the more descriptively accurate prospect theory (PT). In turn, psychological studies show that successful deception ploys eschew direct logical argumentation and instead employ peripheral-route persuasion, manipulation of visceral emotions, urgency, and familiar contextual cues. The detection of phishing emails and inappropriate mortgage contracts may be framed as a binary classification task. Signal detection theory (SDT) offers the standard normative solution, formulated as an optimal cutoff threshold, for distinguishing between good/bad emails or mortgages. In this article, we extend SDT behaviorally by rederiving the optimal cutoff threshold under PT. Furthermore, we incorporate the psychology of deception into the determination of SDT's discriminability parameter. With the neo-additive probability weighting function, the optimal cutoff threshold under PT is rendered unique under well-behaved sampling distributions, tractable in computation, and transparent in interpretation. The PT-based cutoff threshold is (i) independent of loss aversion and (ii) more conservative than the classical SDT cutoff threshold. Independently of any possible misalignment between individual-level and system-level misclassification costs, decentralized behavioral decisionmakers are biased toward underdetection, and system-level risk is consequently greater than in analyses predicated upon normative rationality. © 2014 Society for Risk Analysis.
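
    The classical SDT cutoff that the article takes as its normative baseline can be written down directly for the equal-variance Gaussian case: decide "signal" when the likelihood ratio exceeds beta = (p_noise/p_signal) * (false-alarm cost / miss cost), which for N(0,1) noise versus N(d',1) signal gives a cutoff x_c = d'/2 + ln(beta)/d' on the observation axis. The parameter values below are illustrative, not the article's.

    ```python
    import math

    def sdt_cutoff(d_prime, p_signal, cost_fa, cost_miss):
        """Expected-cost-minimising cutoff for equal-variance Gaussian SDT
        (hit and correct-rejection costs taken as zero)."""
        beta = ((1.0 - p_signal) / p_signal) * (cost_fa / cost_miss)
        return d_prime / 2.0 + math.log(beta) / d_prime

    # Phishing-style example: the "signal" (a phish) is rare, which pushes the
    # cutoff up, but misses are costly, which pulls it back down.
    xc = sdt_cutoff(d_prime=2.0, p_signal=0.1, cost_fa=1.0, cost_miss=10.0)
    print(round(xc, 3))  # -> 0.947
    ```

    The article's behavioral extension replaces the objective probabilities in beta with PT-weighted ones, shifting this cutoff.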

  7. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code achieves much better performance than the Hadamard and MFH codes.
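
    The "ideal cross-correlation" property claimed above means that any two distinct code sequences in the family overlap in at most one chip position, which limits multiple-access interference. The check itself is straightforward; the sequences below are illustrative weight-two binary codes, not the actual EDW construction.

    ```python
    import itertools

    def max_cross_correlation(codes):
        """Largest in-phase cross-correlation (number of overlapping '1'
        chips) over all distinct pairs of code sequences."""
        return max(sum(a & b for a, b in zip(c1, c2))
                   for c1, c2 in itertools.combinations(codes, 2))

    # Illustrative weight-2 sequences with at most one overlapping chip.
    codes = [(1, 1, 0, 0, 0, 0),
             (0, 1, 1, 0, 0, 0),
             (0, 0, 0, 1, 1, 0)]
    print(max_cross_correlation(codes))  # -> 1
    ```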

  8. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  9. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
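
    For a linear code, the minimum Hamming distance equals the minimum weight over all nonzero codewords, so for small parameters $d$ can be verified by brute-force enumeration from a generator matrix. The sketch below uses an illustrative $[4,2,3]_3$ generator matrix over $GF(3)$, not one of the paper's 22 new codes.

    ```python
    import itertools

    # Illustrative generator matrix of a small ternary [4,2] linear code.
    G = [[1, 0, 1, 1],
         [0, 1, 1, 2]]

    def min_distance(G, q=3):
        """Minimum distance of the linear code generated by G over GF(q),
        computed as the minimum weight over all nonzero codewords."""
        n = len(G[0])
        k = len(G)
        best = n
        for msg in itertools.product(range(q), repeat=k):
            if not any(msg):
                continue  # skip the all-zero codeword
            word = [sum(m * g for m, g in zip(msg, col)) % q
                    for col in zip(*G)]
            best = min(best, sum(1 for c in word if c))
        return best

    print(min_distance(G))  # -> 3, so G generates a [4,2,3]_3 code
    ```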

  10. Study on the system-level test method of digital metering in smart substation

    Science.gov (United States)

    Zhang, Xiang; Yang, Min; Hu, Juan; Li, Fuchao; Luo, Ruixi; Li, Jinsong; Ai, Bing

    2017-03-01

    Nowadays, the test methods for the digital metering system in a smart substation are used to test and evaluate the performance of a single device. These methods can effectively guarantee the accuracy and reliability of the measurement results of a digital metering device in a single run, but they do not reflect the performance when the devices are combined into a complete system. This paper introduces the shortcomings of the existing test methods, proposes a system-level test method for digital metering in the smart substation, and proves the feasibility of the method by an actual test.

  11. Enhanced Discrete-Time Scheduler Engine for MBMS E-UMTS System Level Simulator

    DEFF Research Database (Denmark)

    Pratas, Nuno; Rodrigues, António

    2007-01-01

    In this paper the design of an E-UMTS system level simulator developed for the study of optimization methods for the MBMS is presented. The simulator uses a discrete event based philosophy, which captures the dynamic behavior of the Radio Network System. This dynamic behavior includes the user...... mobility, radio interfaces and the Radio Access Network. Emphasis is given to the enhancements developed for the simulator core, the Event Scheduler Engine. Two implementations for the Event Scheduler Engine are proposed, one optimized for single-core processors and the other for multi-core ones....

  12. Exploration of a digital audio processing platform using a compositional system level performance estimation framework

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents the application of a compositional simulation based system-level performance estimation framework on a non-trivial industrial case study. The case study is provided by the Danish company Bang & Olufsen ICEpower a/s and focuses on the exploration of a digital mobile audio...... processing platform. A short overview of the compositional performance estimation framework used is given followed by a presentation of how it is used for performance estimation using an iterative refinement process towards the final implementation. Finally, an evaluation in terms of accuracy and speed...

  13. System-level modeling and simulation of the cell culture microfluidic biochip ProCell

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2010-01-01

    Microfluidic biochips offer a promising alternative to a conventional biochemical laboratory. There are two technologies for the microfluidic biochips: droplet-based and flow-based. In this paper we are interested in flow-based microfluidic biochips, where the liquid flows continuously through pre-defined micro-channels using valves and pumps. We present an approach to the system-level modeling and simulation of a cell culture microfluidic biochip called ProCell, Programmable Cell Culture Chip. ProCell contains a cell culture chamber, which is envisioned to run 256 simultaneous experiments (viewed...

  14. Abstract Radio Resource Management Framework for System Level Simulations in LTE-A Systems

    DEFF Research Database (Denmark)

    Fotiadis, Panagiotis; Viering, Ingo; Zanier, Paolo

    2014-01-01

    This paper provides a simple mathematical model of different packet scheduling policies in Long Term Evolution- Advanced (LTE-A) systems, by investigating the performance of Proportional Fair (PF) and the generalized cross-Component Carrier scheduler from a theoretical perspective. For that purpose......, an abstract Radio Resource Management (RRM) framework has been developed and tested for different ratios of users with Carrier Aggregation (CA) capabilities. The conducted system level simulations confirm that the proposed model can satisfactorily capture the main properties of the aforementioned scheduling...
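
    The Proportional Fair policy analyzed above serves, in each transmission interval, the user with the largest ratio of instantaneous rate to exponentially averaged throughput. A minimal single-carrier sketch (the rates, fading model, and filter constant are illustrative, not the paper's LTE-A abstraction):

    ```python
    import random

    def pf_schedule(mean_rates, n_tti=10_000, tc=100, seed=0):
        """Serve argmax_i rate_i(t)/avg_i(t) each TTI; update averages with
        an exponential filter of time constant tc. Returns TTIs served."""
        rng = random.Random(seed)
        n = len(mean_rates)
        avg = [1e-6] * n               # tiny start avoids division by zero
        served = [0] * n
        for _ in range(n_tti):
            # Illustrative multiplicative fading around each user's mean rate.
            rates = [r * rng.uniform(0.5, 1.5) for r in mean_rates]
            u = max(range(n), key=lambda i: rates[i] / avg[i])
            served[u] += 1
            for i in range(n):
                r = rates[i] if i == u else 0.0
                avg[i] = (1 - 1 / tc) * avg[i] + (1 / tc) * r
        return served

    # Users with very different mean rates still receive comparable airtime,
    # because each is scheduled on its own fading peaks.
    print(pf_schedule([1.0, 5.0]))
    ```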

  15. Design of power converter in DFIG wind turbine with enhanced system-level reliability

    DEFF Research Database (Denmark)

    Zhou, Dao; Zhang, Guanguan; Blaabjerg, Frede

    2017-01-01

    With the increasing penetration of wind power, reliable and cost-effective wind energy production is of more and more importance. As one of the promising configurations, the doubly-fed induction generator based partial-scale wind power converter is still dominating in the existing wind farms...... margin. It can be seen that the B1 lifetimes of the grid-side converter and the rotor-side converter deviate considerably when the electrical stresses are considered, while they become more balanced by using an optimized reliable design. The system-level lifetime significantly increases with an appropriate design...

  16. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model system-on-chips (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of network-on-chip and...... SoC design. We show how a hand-held multimedia terminal, consisting of JPEG, MP3 and GSM applications, can be modeled as a multiprocessor SoC in our framework....

  17. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools, and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development, so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifying...

  18. System-level perturbations of cell metabolism using CRISPR/Cas9

    Energy Technology Data Exchange (ETDEWEB)

    Jakočiūnas, Tadas [Technical Univ. of Denmark, Lyngby (Denmark); Jensen, Michael K. [Technical Univ. of Denmark, Lyngby (Denmark); Keasling, Jay D. [Technical Univ. of Denmark, Lyngby (Denmark); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States)

    2017-03-30

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies much more advanced and cost-effective. For metabolic engineering purposes, the CRISPR-based tools have been applied to single and multiplex pathway modifications and transcriptional regulations. The effectiveness of these tools allows researchers to implement genome-wide perturbations, test model-guided genome editing strategies, and perform transcriptional reprogramming perturbations in a more advanced manner than previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering.

  19. System Level Power Optimization of Digital Audio Back End for Hearing Aids

    DEFF Research Database (Denmark)

    Pracny, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2017-01-01

    This work deals with power optimization of the audio processing back end for hearing aids - the interpolation filter (IF), the sigma-delta (SD) modulator and the Class D power amplifier (PA) as a whole. Specifications are derived and insight into the tradeoffs involved is used to optimize the interpolation filter and the SD modulator on the system level so that the switching frequency of the Class D PA - the main power consumer in the back end - is minimized. A figure-of-merit (FOM) which allows judging the power consumption of the digital part of the back end early in the design process is used...

  20. Empirical LTE Smartphone Power Model with DRX Operation for System Level Simulations

    DEFF Research Database (Denmark)

    Lauridsen, Mads; Noël, Laurent; Mogensen, Preben

    2013-01-01

    An LTE smartphone power model is presented to enable academia and industry to evaluate users’ battery life on system level. The model is based on empirical measurements on a smartphone using a second generation LTE chipset, and the model includes functions of receive and transmit data rates and power levels. The first comprehensive Discontinuous Reception (DRX) power consumption measurements are reported together with cell bandwidth, screen and CPU power consumption. The transmit power level and to some extent the receive data rate constitute the overall power consumption, while DRX proves...
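    Empirical handset power models of this kind are commonly expressed as a base power plus rate- and transmit-power-dependent terms, weighted by the fraction of time spent in DRX sleep. The sketch below follows that generic structure only; every coefficient is a made-up placeholder, not a value from the paper's measurements.

```python
def lte_power_mw(rx_mbps=0.0, tx_mbps=0.0, tx_dbm=0.0, drx_sleep_frac=0.0,
                 p_base=1000.0, a_rx=5.0, a_tx=10.0, p_pa=0.3, p_sleep=20.0):
    """Illustrative linear LTE power model in milliwatts.
    All coefficients (p_base, a_rx, a_tx, p_pa, p_sleep) are hypothetical.
    The PA term converts the transmit power level from dBm to mW."""
    p_active = (p_base                      # baseline RF + modem power
                + a_rx * rx_mbps            # receive-rate-dependent term
                + a_tx * tx_mbps            # transmit-rate-dependent term
                + p_pa * 10 ** (tx_dbm / 10))  # PA term, scales with mW output
    # DRX: time-share between active reception and low-power sleep
    return (1 - drx_sleep_frac) * p_active + drx_sleep_frac * p_sleep
```

    A system-level simulator can call such a function per scheduling interval and integrate the result to estimate battery drain under different DRX configurations.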

  1. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  2. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
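    As a concrete instance of the linear codes the book introduces, the sketch below encodes 4 message bits into a (7,4) Hamming codeword and corrects any single bit flip by syndrome decoding. The generator and parity-check matrices are the standard systematic choice.

```python
# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I], arithmetic mod 2.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(m):
    # codeword bit j is the mod-2 dot product of m with column j of G
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def decode(r):
    # the syndrome H.r equals the column of H at the flipped position
    s = [sum(H[i][j] * r[j] for j in range(7)) % 2 for i in range(3)]
    if any(s):
        cols = [[H[i][j] for i in range(3)] for j in range(7)]
        r = r[:]
        r[cols.index(s)] ^= 1  # correct the single-bit error
    return r[:4]  # message occupies the first four (systematic) positions
```

    With minimum distance 3, the code corrects every single-bit error in a 7-bit block, which is exactly the property syndrome decoding exploits.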

  3. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  4. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  5. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
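    The syndrome-based encoding described above can be illustrated in miniature with the parity-check matrix of a (7,4) Hamming code: the encoder transmits only the 3 syndrome bits, and the decoder combines them with side information that differs from the source in at most one position. This is a toy stand-in for the paper's LDPC/sum-product machinery, not its actual construction.

```python
# Parity-check matrix of the (7,4) Hamming code, arithmetic mod 2.
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def syndrome(x):
    # "compressed" description of x: 3 syndrome bits instead of 7 source bits
    return [sum(H[i][j] * x[j] for j in range(7)) % 2 for i in range(3)]

def sw_decode(s, y):
    """Recover x from its syndrome s and side information y, assuming y
    differs from x in at most one position (Slepian-Wolf toy example)."""
    # H(y) xor H(x) = H(y xor x): the difference pattern's syndrome
    d = [a ^ b for a, b in zip(syndrome(y), s)]
    x = y[:]
    if any(d):
        cols = [[H[i][j] for i in range(3)] for j in range(7)]
        x[cols.index(d)] ^= 1  # flip the single differing bit
    return x
```

    The encoder sends 3 bits instead of 7, and the correlated side information at the decoder makes up the difference, which is the essence of the Slepian-Wolf setting.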

  6. Evolving a Dynamic Predictive Coding Mechanism for Novelty Detection

    OpenAIRE

    Haggett, Simon J.; Chu, Dominique; Marshall, Ian W.

    2007-01-01

    Novelty detection is a machine learning technique which identifies new or unknown information in data sets. We present our current work on the construction of a new novelty detector based on a dynamical version of predictive coding. We compare three evolutionary algorithms, a simple genetic algorithm, NEAT and FS-NEAT, for the task of optimising the structure of an illustrative dynamic predictive coding neural network to improve its performance over stimuli from a number of artificially gener...
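    Of the three evolutionary algorithms compared, the simple genetic algorithm is the easiest to sketch. The variant below (tournament selection, one-point crossover, bit-flip mutation) maximizes an arbitrary fitness function over fixed-length bit strings; it is illustrative only, not the authors' NEAT/FS-NEAT setup, which also evolves network structure.

```python
import random

def simple_ga(fitness, n_bits=16, pop_size=30, gens=60, p_mut=0.05, seed=1):
    """Minimal generational GA: tournament selection (size 2),
    one-point crossover, and per-bit flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():  # binary tournament: better of two random individuals
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            # bit-flip mutation with probability p_mut per bit
            nxt.append([b ^ (rng.random() < p_mut) for b in child])
        pop = nxt
    return max(pop, key=fitness)
```

    Calling `simple_ga(sum)` maximizes the number of ones (the classic OneMax benchmark); in a novelty-detection setting the fitness function would instead score a decoded network's detection performance.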

  7. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It draws a clear connection between the whys, hows, and whats, enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource/Information: What information is available, and how can it be useful? Resource/Platform: What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties and the transmission capacity of the devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  8. Discovery of Proteomic Code with mRNA Assisted Protein Folding

    Directory of Open Access Journals (Sweden)

    Jan C. Biro

    2008-12-01

    Full Text Available The 3x redundancy of the Genetic Code is usually explained as a necessity to increase the mutation-resistance of the genetic information. However, recent bioinformatical observations indicate that the redundant Genetic Code contains more biological information than previously known, additional to the 64/20 definition of amino acids. It might define the physico-chemical and structural properties of amino acids, the codon boundaries, the amino acid co-locations (interactions) in the coded proteins, and the free folding energy of mRNAs. This additional information, which seems to be necessary to determine the 3D structure of coding nucleic acids as well as the coded proteins, is known as the Proteomic Code and mRNA Assisted Protein Folding.

  9. Genetic modification and genetic determinism

    Directory of Open Access Journals (Sweden)

    Vorhaus Daniel B

    2006-06-01

    Full Text Available Abstract In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions.

  10. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography

  11. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  12. The Coding Question.

    Science.gov (United States)

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  14. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  15. An investigation into soft error detection efficiency at operating system level.

    Science.gov (United States)

    Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan

    2014-01-01

    Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation that gives rise to permanent and transient errors on microelectronic components. The occurrence rate of transient errors is significantly more than permanent errors. The transient errors, or soft errors, emerge in two formats: control flow errors (CFEs) and data errors. Valuable research results have already appeared in literature at hardware and software levels for their alleviation. However, there is the basic assumption behind these works that the operating system is reliable and the focus is on other system levels. In this paper, we investigate the effects of soft errors on the operating system components and compare their vulnerability with that of application level components. Results show that soft errors in operating system components affect both operating system and application level components. Therefore, by providing endurance to operating system level components against soft errors, both operating system and application level components gain tolerance.
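    A common software-level defence against the control flow errors (CFEs) mentioned above is signature monitoring: each basic block carries a compile-time label, and a runtime check flags any transition that is not an edge of the program's control-flow graph. The sketch below is a generic illustration of that idea, not the detection scheme evaluated in the paper.

```python
# Compile-time view of the program: basic blocks and their legal successors.
CFG = {"A": {"B", "C"}, "B": {"D"}, "C": {"D"}, "D": set()}

def run_and_check(trace, cfg):
    """Flag a control-flow error whenever the executed trace takes a
    transition that is not an edge of the control-flow graph."""
    for prev, cur in zip(trace, trace[1:]):
        if cur not in cfg[prev]:
            return f"CFE: illegal transition {prev}->{cur}"
    return "ok"
```

    A radiation-induced bit flip in the program counter typically manifests as exactly such an illegal transition, e.g. jumping from block A straight to D, which the monitor detects.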

  16. Local and System Level Considerations for Plasma-Based Techniques in Hypersonic Flight

    Science.gov (United States)

    Suchomel, Charles; Gaitonde, Datta

    2007-01-01

    The harsh environment encountered in hypersonic flight, particularly when air-breathing propulsion devices are utilized, poses daunting challenges to the successful maturation of suitable technologies. This has spurred the quest for revolutionary solutions, particularly those exploiting the fact that air under these conditions can become electrically conducting, either naturally or through artificial enhancement. Optimized development of such concepts must emphasize not only the detailed physics by which the fluid interacts with the imposed electromagnetic fields, but must also simultaneously identify the system-level integration issues and efficiencies that provide the greatest leverage. This paper presents some recent advances at both levels. At the system level, an analysis is summarized that incorporates the interdependencies occurring between weight, power and flow field performance improvements. Cruise performance comparisons highlight how one drag reduction device interacts with the vehicle to improve range. Quantified parameter interactions allow specification of system requirements and energy-consuming technologies that affect overall flight vehicle performance. Results based on the fundamental physics are presented by distilling numerous computational studies into a few guiding principles. These highlight the complex, non-intuitive relationships between the various fluid and electromagnetic fields, together with thermodynamic considerations. Generally, energy extraction is an efficient process, while the reverse is accompanied by significant dissipative heating and inefficiency. Velocity distortions can be detrimental to plasma operation, but can be exploited to tailor flows through innovative electromagnetic configurations.

  17. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work is aimed at delivering a high simulation throughput while, at the same time, guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator can attain a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between the simulated and measured timing performance is within 10%, an accuracy that in the past could only be attained by cycle-accurate models.

  18. Self-Driving Cars and Engineering Ethics: The Need for a System Level Analysis.

    Science.gov (United States)

    Borenstein, Jason; Herkert, Joseph R; Miller, Keith W

    2017-11-13

    The literature on self-driving cars and ethics continues to grow. Yet much of it focuses on ethical complexities emerging from an individual vehicle. That is an important but insufficient step towards determining how the technology will impact human lives and society more generally. What must complement ongoing discussions is a broader, system level of analysis that engages with the interactions and effects that these cars will have on one another and on the socio-technical systems in which they are embedded. To bring the conversation of self-driving cars to the system level, we make use of two traffic scenarios which highlight some of the complexities that designers, policymakers, and others should consider related to the technology. We then describe three approaches that could be used to address such complexities and their associated shortcomings. We conclude by bringing attention to the "Moral Responsibility for Computing Artifacts: The Rules", a framework that can provide insight into how to approach ethical issues related to self-driving cars.

  19. Value of information in sequential decision making: Component inspection, permanent monitoring and system-level scheduling

    International Nuclear Information System (INIS)

    Memarzadeh, Milad; Pozzi, Matteo

    2016-01-01

    We illustrate how to assess the Value of Information (VoI) in sequential decision making problems modeled by Partially Observable Markov Decision Processes (POMDPs). POMDPs provide a general framework for modeling the management of infrastructure components, including operation and maintenance, when only partial or noisy observations are available; VoI is a key concept for selecting explorative actions, with application to component inspection and monitoring. Furthermore, component-level VoI can serve as an effective heuristic for assigning priorities to system-level inspection scheduling. We introduce two alternative models for the availability of information, and derive the VoI in each of those settings: the Stochastic Allocation (SA) model assumes that observations are collected with a given probability, while the Fee-based Allocation model (FA) assumes that they are available at a given cost. After presenting these models at component-level, we investigate how they perform for system-level inspection scheduling. - Highlights: • On the Value of Information in POMDPs, for optimal exploration of systems. • A method for assessing the Value of Information of permanent monitoring. • A method for allocating inspections in systems made up by parallel POMDPs.
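    For a single two-state component, the component-level VoI that the authors build on reduces to classic preposterior analysis: compare the prior optimal expected cost with the expectation of the posterior optimal costs after an inspection (here assumed perfect). The costs and probability below are illustrative numbers, not values from the paper.

```python
def expected_cost(p_dam, c_repair, c_fail):
    # prior optimal decision: repair now, or accept the risk of failure
    return min(c_repair, p_dam * c_fail)

def voi_perfect(p_dam, c_repair, c_fail):
    """Value of a perfect inspection: prior optimal expected cost minus
    the expectation of the posterior optimal costs (preposterior analysis)."""
    # a perfect observation reveals the true state:
    # if damaged (prob p_dam), choose the cheaper of repair and failure;
    # if intact, no action is needed and no cost is incurred
    cost_with_info = p_dam * min(c_repair, c_fail) + (1 - p_dam) * 0.0
    return expected_cost(p_dam, c_repair, c_fail) - cost_with_info
```

    With p_dam = 0.2, c_repair = 1.0 and c_fail = 10.0, the prior optimum is to repair at cost 1.0, while a perfect inspection reduces the expected cost to 0.2, so the VoI is 0.8; such component-level values can then rank candidates for system-level inspection scheduling, as the paper proposes.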

  20. An Investigation into Soft Error Detection Efficiency at Operating System Level

    Directory of Open Access Journals (Sweden)

    Seyyed Amir Asghari

    2014-01-01

    Full Text Available Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation that gives rise to permanent and transient errors on microelectronic components. The occurrence rate of transient errors is significantly more than permanent errors. The transient errors, or soft errors, emerge in two formats: control flow errors (CFEs) and data errors. Valuable research results have already appeared in literature at hardware and software levels for their alleviation. However, there is the basic assumption behind these works that the operating system is reliable and the focus is on other system levels. In this paper, we investigate the effects of soft errors on the operating system components and compare their vulnerability with that of application level components. Results show that soft errors in operating system components affect both operating system and application level components. Therefore, by providing endurance to operating system level components against soft errors, both operating system and application level components gain tolerance.

  1. Virtual design and optimization studies for industrial silicon microphones applying tailored system-level modeling

    Science.gov (United States)

    Kuenzig, Thomas; Dehé, Alfons; Krumbein, Ulrich; Schrag, Gabriele

    2018-05-01

    Maxing out the technological limits in order to satisfy customers’ demands and obtain the best performance of micro-devices and -systems is a challenge for today’s manufacturers. Dedicated system simulation is key to investigating the potential of device and system concepts in order to identify the best design w.r.t. the given requirements. We present a tailored, physics-based system-level modeling approach combining lumped with distributed models that provides detailed insight into device and system operation at low computational expense. The resulting transparent, scalable (i.e. reusable) and modularly composed models explicitly contain the physical dependency on all relevant parameters, and are thus well suited for dedicated investigation and optimization of MEMS devices and systems. This is demonstrated for an industrial capacitive silicon microphone. The performance of such microphones is determined by distributed effects like viscous damping and inhomogeneous capacitance variation across the membrane, as well as by system-level phenomena like package-induced acoustic effects and the impact of the electronic circuitry for biasing and read-out. The model presented here covers all relevant figures of merit and thus enables evaluation of the optimization potential of silicon microphones towards high-fidelity applications. This work was carried out at the Technical University of Munich, Chair for Physics of Electrotechnology. Thomas Kuenzig is now with Infineon Technologies AG, Neubiberg.

  2. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points ⟨c⟩ ∈ PC with multiplicity γ(w), where w is the weight of

  3. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  4. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  5. A System-level Infrastructure for Multi-dimensional MP-SoC Design Space Co-exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Bautista, T.; Nunez, A.; Pimentel, A.D.; Thompson, M.

    2013-01-01

    In this article, we present a flexible and extensible system-level MP-SoC design space exploration (DSE) infrastructure, called NASA. This highly modular framework uses well-defined interfaces to easily integrate different system-level simulation tools as well as different combinations of search

  6. Foundational development of an advanced nuclear reactor integrated safety code

    International Nuclear Information System (INIS)

    Clarno, Kevin; Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth; Hooper, Russell Warren; Humphries, Larry LaRon

    2010-01-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  7. Foundational development of an advanced nuclear reactor integrated safety code.

    Energy Technology Data Exchange (ETDEWEB)

    Clarno, Kevin (Oak Ridge National Laboratory, Oak Ridge, TN); Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth (Ktech Corporation, Albuquerque, NM); Hooper, Russell Warren; Humphries, Larry LaRon

    2010-02-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  8. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    Full Text Available The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated, harmonized with the Medical Association of Slovenia, and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.

  9. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.
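    The unsupervised CSC data term that the paper's supervised objective builds on can be minimized with proximal-gradient (ISTA) updates on the sparse code. A minimal 1-D sketch follows; the filter, signal, and parameters are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Minimize over the sparse code z:
#   0.5*||x - d * z||^2 + lam*||z||_1   ('*' = same-size 1-D convolution)

def soft(v, t):
    """Soft-thresholding, the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def csc_objective(x, d, z, lam):
    r = np.convolve(z, d, mode="same") - x
    return 0.5 * r @ r + lam * np.abs(z).sum()

def ista_step(x, d, z, lam, step):
    """One ISTA step on z; for an odd-length filter d, the adjoint of
    convolve(., d, 'same') is correlate(., d, 'same')."""
    r = np.convolve(z, d, mode="same") - x
    grad = np.correlate(r, d, mode="same")
    return soft(z - step * grad, step * lam)

d = np.array([0.25, 0.5, 1.0, 0.5, 0.25])        # odd-length filter
z_true = np.zeros(64)
z_true[10], z_true[40] = 2.0, -1.5               # sparse ground-truth code
x = np.convolve(z_true, d, mode="same")          # observed signal

step = 1.0 / (np.abs(np.fft.fft(d, 64)) ** 2).max()   # 1/Lipschitz bound
z = np.zeros_like(x)
for _ in range(2000):
    z = ista_step(x, d, z, lam=0.05, step=step)
```

    After the iterations, z concentrates on the two true spike locations; the supervised extension in the paper additionally shapes the learned dictionary, which this sketch keeps fixed.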

  10. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated
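    The escalation and interest-during-construction calculation mentioned above can be illustrated with toy arithmetic; the rates, spending profile, and numbers below are hypothetical, not CONCEPT's actual models:

```python
# Hypothetical sketch: turning an overnight cost into an as-spent estimate
# by escalating each year's spending to its spend date and accruing
# interest during construction (IDC) to commercial operation.
overnight = 1000.0          # base-date cost, M$
years = 6                   # construction period
esc = 0.05                  # annual escalation/inflation rate (illustrative)
irate = 0.08                # annual interest rate on construction funds

spend = overnight / years   # uniform annual spending profile
total = 0.0
for y in range(years):
    escalated = spend * (1 + esc) ** (y + 0.5)             # escalate to mid-year spend date
    total += escalated * (1 + irate) ** (years - y - 0.5)  # accrue IDC to start-up
```

    The as-spent total always exceeds the overnight cost, and the split between escalation and IDC depends on the assumed spending profile.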

  11. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication-including voice-will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networksOffering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  12. Genetic Engineering

    Science.gov (United States)

    Phillips, John

    1973-01-01

    Presents a review of genetic engineering, in which the genotypes of plants and animals (including human genotypes) may be manipulated for the benefit of the human species. Discusses associated problems and solutions and provides an extensive bibliography of literature relating to genetic engineering. (JR)

  13. Genetic Romanticism

    DEFF Research Database (Denmark)

    Tupasela, Aaro

    2016-01-01

    inheritance as a way to unify populations within politically and geographically bounded areas. Thus, new genetics have contributed to the development of genetic romanticisms, whereby populations (human, plant, and animal) can be delineated and mobilized through scientific and medical practices to represent...

  14. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  15. Dealing with an Unconventional Genetic Code in Mitochondria: The Biogenesis and Pathogenic Defects of the 5-Formylcytosine Modification in Mitochondrial tRNAMet

    Directory of Open Access Journals (Sweden)

    Lindsey Van Haute

    2017-03-01

    Full Text Available Human mitochondria contain their own genome, which uses an unconventional genetic code. In addition to the standard AUG methionine codon, the single mitochondrial tRNA Methionine (mt-tRNAMet) also recognises AUA during translation initiation and elongation. Post-transcriptional modifications of tRNAs are important for structure, stability, correct folding and aminoacylation, as well as decoding. The unique 5-formylcytosine (f5C) modification at position 34 in mt-tRNAMet has long been postulated to be crucial for decoding of unconventional methionine codons and efficient mitochondrial translation. However, the enzymes responsible for the formation of mitochondrial f5C have been identified only recently. The first step of the f5C pathway consists of methylation of cytosine by NSUN3, followed by further oxidation by ABH1. Here, we review the role of f5C, the latest breakthroughs in our understanding of the biogenesis of this unique mitochondrial tRNA modification and its involvement in human disease.

  16. Dual Coding in Children.

    Science.gov (United States)

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  17. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...
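    At bit level, the relay operation at the heart of PLNC reduces to an XOR; a toy sketch (the channel and modulation details of the actual scheme are abstracted away):

```python
import numpy as np

# Two-way relay with PLNC: A and B transmit in the same slot, the relay
# decodes only the XOR s = a ^ b from the superimposed signal and
# broadcasts it; each node recovers the other's message with one more XOR.
rng = np.random.default_rng(1)
a = rng.integers(0, 2, 8)    # node A's bits
b = rng.integers(0, 2, 8)    # node B's bits

s = a ^ b                    # relay's network-coded broadcast
b_at_A = s ^ a               # A cancels its own bits to recover B's
a_at_B = s ^ b               # B cancels its own bits to recover A's

assert np.array_equal(b_at_A, b) and np.array_equal(a_at_B, a)
```

    The exchange completes in two slots (multiple access plus broadcast) instead of the four needed without network coding.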

  18. Radioactive action code

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    A new coding system, 'Hazrad', for buildings and transportation containers for alerting emergency services personnel to the presence of radioactive materials has been developed in the United Kingdom. The hazards of materials in the buildings or transport container, together with the recommended emergency action, are represented by a number of codes which are marked on the building or container and interpreted from a chart carried as a pocket-size guide. Buildings would be marked with the familiar yellow 'radioactive' trefoil, the written information 'Radioactive materials' and a list of isotopes. Under this the 'Hazrad' code would be written - three symbols to denote the relative radioactive risk (low, medium or high), the biological risk (also low, medium or high) and the third showing the type of radiation emitted, alpha, beta or gamma. The response cards indicate appropriate measures to take, eg for a high biological risk, Bio3, the wearing of a gas-tight protection suit is advised. The code and its uses are explained. (U.K.)

  19. Building Codes and Regulations.

    Science.gov (United States)

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  20. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that the authors will be a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  1. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  2. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer of Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  3. CERN Code of Conduct

    CERN Document Server

    Department, HR

    2010-01-01

    The Code is intended as a guide in helping us, as CERN contributors, to understand how to conduct ourselves, treat others and expect to be treated. It is based around the five core values of the Organization. We should all become familiar with it and try to incorporate it into our daily life at CERN.

  4. Nuclear safety code study

    Energy Technology Data Exchange (ETDEWEB)

    Hu, H.H.; Ford, D.; Le, H.; Park, S.; Cooke, K.L.; Bleakney, T.; Spanier, J.; Wilburn, N.P.; O' Reilly, B.; Carmichael, B.

    1981-01-01

    The objective is to analyze an overpower accident in an LMFBR. A simplified model of the primary coolant loop was developed in order to understand the instabilities encountered with the MELT III and SAS codes. The computer programs were translated for switching to the IBM 4331. Numerical methods were investigated for solving the neutron kinetics equations; the Adams and Gear methods were compared. (DLC)

  5. Revised C++ coding conventions

    CERN Document Server

    Callot, O

    2001-01-01

    This document replaces the note LHCb 98-049 by Pavel Binko. After a few years of practice, some simplification and clarification of the rules was needed. As many more people have now some experience in writing C++ code, their opinion was also taken into account to get a commonly agreed set of conventions

  6. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  7. Error Correcting Codes

    Indian Academy of Sciences (India)

    information and coding theory. A large scale relay computer had failed to deliver the expected results due to a hardware fault. Hamming, one of the active proponents of computer usage, was determined to find an efficient means by which computers could detect and correct their own faults. A mathematician by training.
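    The single-error correction Hamming sought can be shown concretely with the classic Hamming(7,4) code; this is the standard construction, not tied to this article's exposition:

```python
import numpy as np

def hamming74_encode(d):
    """Encode 4 data bits into a Hamming(7,4) codeword.
    1-based positions: parity at 1, 2, 4; data at 3, 5, 6, 7."""
    c = np.zeros(7, dtype=int)
    c[[2, 4, 5, 6]] = d
    c[0] = c[2] ^ c[4] ^ c[6]   # parity over positions 3, 5, 7
    c[1] = c[2] ^ c[5] ^ c[6]   # parity over positions 3, 6, 7
    c[3] = c[4] ^ c[5] ^ c[6]   # parity over positions 5, 6, 7
    return c

def hamming74_correct(r):
    """Correct any single-bit error: the syndrome is the 1-based error position."""
    s = ((r[0] ^ r[2] ^ r[4] ^ r[6])
         | (r[1] ^ r[2] ^ r[5] ^ r[6]) << 1
         | (r[3] ^ r[4] ^ r[5] ^ r[6]) << 2)
    r = r.copy()
    if s:
        r[s - 1] ^= 1
    return r

codeword = hamming74_encode(np.array([1, 0, 1, 1]))
received = codeword.copy()
received[5] ^= 1                 # inject a single bit error
assert np.array_equal(hamming74_correct(received), codeword)
```

    Flipping any one of the seven bits yields a distinct nonzero syndrome, which is exactly why the receiver can locate and flip back the faulty bit on its own.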

  8. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives ...
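    The iterative decoding that B-CSA inherits from coded slotted ALOHA can be sketched as peeling on slot sets; this is a toy model assuming collision-free singleton decoding and perfect interference cancellation, with illustrative parameters:

```python
import numpy as np

def peel(slots):
    """Iterative SIC decoding of a CSA frame: repeatedly decode any
    singleton slot and cancel that user's replicas everywhere."""
    slots = [set(s) for s in slots]
    decoded = set()
    progress = True
    while progress:
        progress = False
        for s in slots:
            if len(s) == 1:
                (u,) = s
                decoded.add(u)
                for t in slots:
                    t.discard(u)
                progress = True
    return decoded

def csa_frame(n_users, n_slots, degree, rng):
    """Each user transmits `degree` replicas of its packet in random slots."""
    slots = [set() for _ in range(n_slots)]
    for u in range(n_users):
        for s in rng.choice(n_slots, size=degree, replace=False):
            slots[s].add(u)
    return peel(slots)
```

    Decoding stalls exactly on "stopping sets", e.g. two users whose replicas occupy the same slots, which is what drives the error floor analysed for such protocols.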

  9. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  10. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  11. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  12. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    having a probability Pi of being equal to a 1. Let us assume ... equal to a 0/1 has no bearing on the probability of the. It is often ... bits (call this set S) whose individual bits add up to zero ... In the context of binary error-correcting codes, specifi-.

  13. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  14. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...
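    The defining invariance can be illustrated with a toy code (r = 2, s = 3 chosen purely for illustration):

```python
# A Z2-double cyclic code is invariant when the first r and the remaining
# s coordinates are cyclically shifted independently.
def double_shift(word, r):
    """Shift the first r and the last s coordinates cyclically (tuples)."""
    a, b = word[:r], word[r:]
    return a[-1:] + a[:-1] + b[-1:] + b[:-1]

# The code spanned by (1,1 | 0,0,0) and (0,0 | 1,1,1) is double cyclic:
code = {(0, 0, 0, 0, 0), (1, 1, 0, 0, 0), (0, 0, 1, 1, 1), (1, 1, 1, 1, 1)}
assert all(double_shift(c, 2) in code for c in code)
```

    Identifying the two halves with elements of Z2[x]/(x^r - 1) and Z2[x]/(x^s - 1), the double shift is multiplication by x on each component, which is the module view taken in the paper.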

  15. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. DNA: Polymer and molecular code

    Science.gov (United States)

    Shivashankar, G. V.

    1999-10-01

    The thesis work focuses upon two aspects of DNA, the polymer and the molecular code. Our approach was to bring single-molecule micromanipulation methods to the study of DNA. It included a home-built optical microscope combined with an atomic force microscope and an optical tweezer. This combined approach led to a novel method to graft a single DNA molecule onto a force cantilever using the optical tweezer and local heating. With this method, a force versus extension assay of double stranded DNA was realized. The resolution was about 10 picoN. To improve on this force measurement resolution, a simple light backscattering technique was developed and used to probe the DNA polymer flexibility and its fluctuations. It combined the optical tweezer to trap a DNA tethered bead and laser backscattering to detect the bead's Brownian fluctuations. With this technique the resolution was about 0.1 picoN with a millisecond access time, and the whole entropic part of the DNA force-extension curve was measured. With this experimental strategy, we measured the polymerization of the protein RecA on an isolated double stranded DNA. We observed the progressive decoration of RecA on the λ DNA molecule, which results in the extension of λ due to unwinding of the double helix. The dynamics of polymerization, the resulting change in the DNA entropic elasticity and the role of ATP hydrolysis were the main parts of the study. A simple model for RecA assembly on DNA was proposed. This work presents a first step in the study of genetic recombination. Recently we have started a study of equilibrium binding which utilizes fluorescence polarization methods to probe the polymerization of RecA on single stranded DNA. In addition to the study of material properties of DNA and DNA-RecA, we have developed experiments for which the code of the DNA is central. We studied one aspect of DNA as a molecular code, using different techniques.
In particular the programmatic use of template specificity makes
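    Entropic force-extension data of the kind described above are commonly fit with the Marko-Siggia worm-like-chain interpolation; the formula and parameter values below are the standard ones for dsDNA, not values reported in the thesis:

```python
def wlc_force_pN(x_over_L, P_nm=50.0, kBT_pN_nm=4.11):
    """Marko-Siggia interpolation for the entropic force of a worm-like chain.

    x_over_L: relative extension x/L (contour length L);
    P_nm: persistence length, ~50 nm for dsDNA;
    kBT_pN_nm: thermal energy at room temperature, ~4.11 pN*nm.
    F = (kBT/P) * [ 1/(4*(1 - x/L)^2) - 1/4 + x/L ]   in pN.
    """
    t = x_over_L
    return (kBT_pN_nm / P_nm) * (0.25 / (1 - t) ** 2 - 0.25 + t)
```

    At half extension the predicted force is about 0.1 pN, which is why the backscattering resolution quoted above suffices to map the entropic part of the curve.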

  17. Efficient Uplink Modeling for Dynamic System-Level Simulations of Cellular and Mobile Networks

    Directory of Open Access Journals (Sweden)

    Lobinger Andreas

    2010-01-01

    Full Text Available A novel theoretical framework for uplink simulations is proposed. It allows investigations which have to cover a very long (real-) time span and which at the same time require a certain level of accuracy in terms of radio resource management, quality of service, and mobility. This is of particular importance for simulations of self-organizing networks. For this purpose, conventional system-level simulators are not suitable due to slow simulation speeds far beyond real-time. Simpler, snapshot-based tools lack the aforementioned accuracy. The runtime improvements are achieved by deriving abstract theoretical models for the MAC layer behavior. The focus in this work is Long Term Evolution (LTE), and the most important uplink effects such as fluctuating interference, power control, power limitation, adaptive transmission bandwidth, and control channel limitations are considered. Limitations of the abstract models are discussed as well. Exemplary results are given at the end to demonstrate the capability of the derived framework.
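    Among the listed uplink effects, open-loop fractional power control is compact enough to sketch. The rule below is a simplified 3GPP-style form with illustrative parameter values, omitting closed-loop corrections and MCS offsets; it is an assumption, not the abstract model derived in the paper:

```python
import math

def ul_tx_power_dBm(n_prb, pathloss_dB, P0=-80.0, alpha=0.8, Pmax=23.0):
    """Simplified LTE open-loop fractional uplink power control:
    P = min(Pmax, P0 + 10*log10(M) + alpha*PL)  [dBm],
    with M allocated resource blocks and PL the coupling loss in dB.
    alpha < 1 only partially compensates path loss (fractional control)."""
    return min(Pmax, P0 + 10 * math.log10(n_prb) + alpha * pathloss_dB)
```

    The min() with Pmax is exactly the power-limitation effect mentioned in the abstract: cell-edge users with large path loss saturate at the maximum transmit power.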

  18. Fused Silica Final Optics for Inertial Fusion Energy: Radiation Studies and System-Level Analysis

    International Nuclear Information System (INIS)

    Latkowski, Jeffery F.; Kubota, Alison; Caturla, Maria J.; Dixit, Sham N.; Speth, Joel A.; Payne, Stephen A.

    2003-01-01

    The survivability of the final optic, which must sit in the line of sight of high-energy neutrons and gamma rays, is a key issue for any laser-driven inertial fusion energy (IFE) concept. Previous work has concentrated on the use of reflective optics. Here, we introduce and analyze the use of a transmissive final optic for the IFE application. Our experimental work has been conducted at a range of doses and dose rates, including those comparable to the conditions at the IFE final optic. The experimental work, in conjunction with detailed analysis, suggests that a thin, fused silica Fresnel lens may be an attractive option when used at a wavelength of 351 nm. Our measurements and molecular dynamics simulations provide convincing evidence that the radiation damage, which leads to optical absorption, not only saturates but that a 'radiation annealing' effect is observed. A system-level description is provided, including Fresnel lens and phase plate designs

  19. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  20. Unravelling evolutionary strategies of yeast for improving galactose utilization through integrated systems level analysis

    DEFF Research Database (Denmark)

    Hong, Kuk-Ki; Vongsangnak, Wanwipa; Vemuri, Goutham N

    2011-01-01

    Identification of the underlying molecular mechanisms for a derived phenotype by adaptive evolution is difficult. Here, we performed a systems-level inquiry into the metabolic changes occurring in the yeast Saccharomyces cerevisiae as a result of its adaptive evolution to increase its specific...... showed changes in ergosterol biosynthesis. Mutations were identified in proteins involved in the global carbon sensing Ras/PKA pathway, which is known to regulate the reserve carbohydrates metabolism. We evaluated one of the identified mutations, RAS2(Tyr112), and this mutation resulted in an increased...... design in bioengineering of improved strains and, that through systems biology, it is possible to identify mutations in evolved strain that can serve as unforeseen metabolic engineering targets for improving microbial strains for production of biofuels and chemicals....

  1. System-level Reliability Assessment of Power Stage in Fuel Cell Application

    DEFF Research Database (Denmark)

    Zhou, Dao; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    reliability. In a case study of a 5 kW fuel cell power stage, the parameter variations of the lifetime model prove that the exponential factor of the junction temperature fluctuation is the most sensitive parameter. Besides, if a 5-out-of-6 redundancy is used, it is concluded both the B10 and the B1 system......High efficient and less pollutant fuel cell stacks are emerging and strong candidates of the power solution used for mobile base stations. In the application of the backup power, the availability and reliability hold the highest priority. This paper considers the reliability metrics from...... the component-level to the system-level for the power stage used in a fuel cell application. It starts with an estimation of the annual accumulated damage for the key power electronic components according to the real mission profile of the fuel cell system. Then, considering the parameter variations in both...
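    The 5-out-of-6 redundancy conclusion rests on standard k-out-of-n reliability arithmetic, sketched here for independent, identical units with an illustrative unit reliability (the numbers are not from the paper):

```python
from math import comb

def k_out_of_n_reliability(k, n, r):
    """P(at least k of n independent units survive), unit reliability r."""
    return sum(comb(n, i) * r**i * (1 - r) ** (n - i) for i in range(k, n + 1))

# 5-out-of-6 redundancy versus a non-redundant 5-unit series chain at r = 0.9:
redundant = k_out_of_n_reliability(5, 6, 0.9)   # one spare unit tolerated
series = 0.9 ** 5                               # all 5 units must survive
```

    A single spare unit lifts system reliability well above the series configuration, which is the system-level effect the B10/B1 lifetime comparison in the paper quantifies against the real mission profile.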

  2. System-Level Model for OFDM WiMAX Transceiver in Radiation Environment

    International Nuclear Information System (INIS)

    Abdel Alim, O.; Elboghdadly, N.; Ashour, M.M.; Elaskary, A.M.

    2008-01-01

    WiMAX (Worldwide Interoperability for Microwave Access), an evolving standard for point-to-multipoint wireless networking, serves 'last mile' connections, replacing optical fiber technology networks without the need for additional infrastructure in crowded areas. Optical fiber technology is seriously considered for communication and monitoring applications in space and around nuclear reactors. Space and nuclear environments are characterized, in particular, by the presence of ionizing radiation fields. Therefore the influence of radiation on such networks needs to be investigated. This paper has the objective of building a system-level model for a WiMAX OFDM (Orthogonal Frequency Division Multiplexing) based transceiver, modeling irradiation noise as an external effect added to the Additive White Gaussian Noise (AWGN). The results are then analyzed and discussed based on a qualitative performance evaluation using BER calculations for a radiation environment.
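    As a baseline for such BER calculations, the theoretical QPSK bit error rate over a pure AWGN channel (before any irradiation noise is added) has a standard closed form, not specific to this paper:

```python
import math

def qpsk_ber_awgn(ebn0_db):
    """Theoretical QPSK bit error rate over AWGN:
    Pb = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0)), Eb/N0 given in dB."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))
```

    A system-level study of radiation effects would then compare simulated BER curves against this waterfall baseline to quantify the degradation caused by the extra noise term.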

  3. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. We also use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.
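    The constant-weight property that characterizes simplex codes can be checked numerically for a small case; this verifies the classical generator-matrix construction, complementary to the idempotent characterization proved in the paper:

```python
import itertools
import numpy as np

k = 3
n = 2 ** k - 1                               # simplex code length 7
# Generator matrix: columns are all nonzero vectors of F_2^k.
G = np.array([[(c >> i) & 1 for c in range(1, 2 ** k)] for i in range(k)])

codewords = {tuple((np.array(m) @ G) % 2)
             for m in itertools.product([0, 1], repeat=k)}
weights = {int(sum(c)) for c in codewords if any(c)}

assert len(codewords) == 2 ** k              # 2^k distinct codewords
assert weights == {2 ** (k - 1)}             # every nonzero word has weight 4
```

    Every nonzero codeword has weight exactly 2^(k-1), so all pairwise distances are equal, which is the geometric reason for the name "simplex".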

  4. Systems-level mechanisms of action of Panax ginseng: a network pharmacological approach.

    Science.gov (United States)

    Park, Sa-Yoon; Park, Ji-Hun; Kim, Hyo-Su; Lee, Choong-Yeol; Lee, Hae-Jeung; Kang, Ki Sung; Kim, Chang-Eop

    2018-01-01

    Panax ginseng has been used since ancient times based on traditional Asian medicine theory and clinical experience, and is currently one of the most popular herbs in the world. To date, most of the studies concerning P. ginseng have focused on specific mechanisms of action of individual constituents. However, in spite of many studies on the molecular mechanisms of P. ginseng, it still remains unclear how multiple active ingredients of P. ginseng interact with multiple targets simultaneously, giving multidimensional effects on various conditions and diseases. In order to decipher the systems-level mechanism of the multiple ingredients of P. ginseng, a novel approach is needed beyond conventional reductive analysis. We aim to review the systems-level mechanism of P. ginseng by adopting a novel analytical framework, network pharmacology. Here, we constructed a compound-target network of P. ginseng using experimentally validated and machine learning-based prediction results. The targets of the network were analyzed in terms of related biological processes, pathways, and diseases. The majority of targets were found to be related with primary metabolic process, signal transduction, nitrogen compound metabolic process, blood circulation, immune system process, cell-cell signaling, biosynthetic process, and neurological system process. In pathway enrichment analysis of targets, mainly the terms related with neural activity showed significant enrichment and formed a cluster. Finally, relative degree analysis for the target-disease associations of P. ginseng revealed several categories of related diseases, including respiratory, psychiatric, and cardiovascular diseases.

  5. Goal-directed behaviour and instrumental devaluation: a neural system-level computational model

    Directory of Open Access Journals (Sweden)

    Francesco Mannella

    2016-10-01

    Full Text Available Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviours guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on, and shows the computational soundness of, the hypothesis that the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or 'action-outcomes', e.g. foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex, acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) the three basal ganglia-cortical loops, selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways, supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and integrates the results of different devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behaviour.
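
    The core computational idea, that a manipulandum recalls its outcome and the outcome's value depends on the animal's current internal state, can be sketched independently of the neural detail (the associations and state values below are invented for illustration, not the model's actual parameters):

```python
# Hypothetical manipulanda -> outcome associations (cf. the
# basolateral-amygdala/insular-cortex sub-system in the model).
outcome_of = {"lever_left": "food_A", "lever_right": "food_B"}

def outcome_value(outcome, satiated_on):
    """Current value of an outcome given internal state: an outcome
    the animal is selectively satiated on is devalued."""
    return 0.0 if outcome in satiated_on else 1.0

def choose_action(satiated_on):
    """Goal-directed selection: pick the action whose recalled outcome
    currently has the highest value."""
    return max(outcome_of, key=lambda a: outcome_value(outcome_of[a], satiated_on))

# After selective satiation on food_A, behaviour shifts to the other lever.
print(choose_action(satiated_on={"food_A"}))  # lever_right
```

    Lesion experiments correspond, in this caricature, to removing the association lookup or the state-dependent valuation.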

  6. A system-level cost-of-energy wind farm layout optimization with landowner modeling

    International Nuclear Information System (INIS)

    Chen, Le; MacDonald, Erin

    2014-01-01

    Highlights: • We model the role of landowners in determining the success of wind projects. • A cost-of-energy (COE) model with realistic landowner remittances is developed. • These models are included in a system-level wind farm layout optimization. • Basic verification indicates the optimal COE is in-line with real-world data. • Land plots crucial to a project’s success can be identified with the approach. - Abstract: This work applies an enhanced levelized wind farm cost model, including landowner remittance fees, to determine optimal turbine placements under three landowner participation scenarios and two land-plot shapes. Instead of assuming a continuous piece of land is available for the wind farm construction, as in most layout optimizations, the problem formulation represents landowner participation scenarios as a binary string variable, along with the number of turbines. The cost parameters and model are a combination of models from the National Renewable Energy Laboratory (NREL), Lawrence Berkeley National Laboratory, and Windustry. The system-level cost-of-energy (COE) optimization model is also tested under two land-plot shapes: equally-sized square land plots and unequal rectangle land plots. The optimal COE results are compared to actual COE data and found to be realistic. The results show that landowner remittances account for approximately 10% of farm operating costs across all cases. Irregular land-plot shapes are easily handled by the model. We find that larger land plots do not necessarily receive higher remittance fees. The model can help site developers identify the most crucial land plots for project success and the optimal positions of turbines, with realistic estimates of costs and profitability.
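
    The binary-string representation of landowner participation can be sketched as follows (the energy yields, remittance fees, and toy COE formula are invented placeholders, not the NREL/LBNL/Windustry parameters used in the paper):

```python
from itertools import product

# Hypothetical per-plot data: expected annual energy (MWh) if turbines are
# placed there, and the landowner remittance fee ($/yr) if the plot participates.
plots = [
    {"energy": 900.0, "remit": 9000.0},
    {"energy": 700.0, "remit": 4000.0},
    {"energy": 400.0, "remit": 8000.0},
]
FIXED_COST = 50000.0  # toy annual fixed cost for the whole farm

def coe(participation):
    """Toy cost-of-energy for one binary participation scenario."""
    energy = sum(p["energy"] for p, b in zip(plots, participation) if b)
    if energy == 0.0:
        return float("inf")
    cost = FIXED_COST + sum(p["remit"] for p, b in zip(plots, participation) if b)
    return cost / energy

# Exhaustive search over all binary participation strings
# (feasible for a few plots; the paper optimizes layouts as well).
best = min(product([0, 1], repeat=len(plots)), key=coe)
print(best, round(coe(best), 2))
```

    Comparing COE across participation strings is also how one identifies the land plots most crucial to project success: drop a plot from the best string and see how much COE degrades.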

  7. System-level hazard analysis using the sequence-tree method

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih Chunkuan; Yih Swu; Chen, M.-H.

    2008-01-01

    A system-level PHA using the sequence-tree method is presented to perform safety-related digital I and C system SSA. The conventional PHA involves brainstorming among experts on various portions of the system to identify hazards through discussions. However, since the conventional PHA is not a systematic technique, the analysis results depend strongly on the experts' subjective opinions. The quality of analysis cannot be appropriately controlled. Therefore, this study presents a system-level sequence-tree-based PHA, which can clarify the relationship among the major digital I and C systems. This sequence-tree-based technique has two major phases. The first phase adopts a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as RPS. The second phase adopts a sequence tree to recognize the I and C systems involved in the event, the working of the safety-related systems, and how the backup systems can be activated to mitigate the consequence if the primary safety systems fail. The defense-in-depth echelons, namely the Control echelon, Reactor trip echelon, ESFAS echelon and Monitoring and indicator echelon, are arranged to build the sequence-tree structure. All the related I and C systems, including the digital systems and the analog back-up systems, are allocated in their specific echelons. This system-centric sequence-tree analysis not only systematically identifies preliminary hazards, but also vulnerabilities in a nuclear power plant. Hence, an effective simplified D3 evaluation can also be conducted.
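
    The echelon-ordered sequence-tree idea can be sketched as a walk over the defense-in-depth echelons, recording for each one whether the primary (digital) system or its analog backup acts (the system names below are generic placeholders, not the plant-specific allocation used in the study):

```python
# Defense-in-depth echelons in the order used to build the sequence tree,
# each with a primary digital system and a hypothetical analog/manual backup.
echelons = [
    ("Control",               "digital_control",  "analog_control"),
    ("Reactor trip",          "digital_RPS",      "manual_trip"),
    ("ESFAS",                 "digital_ESFAS",    "manual_ESF"),
    ("Monitoring/indication", "digital_displays", "analog_indicators"),
]

def sequence(failed):
    """Walk the echelons and record which system mitigates the event at
    each level: the primary if it works, otherwise the backup."""
    path = []
    for name, primary, backup in echelons:
        acting = backup if primary in failed else primary
        path.append((name, acting))
    return path

# Example branch of the tree: the digital reactor protection system fails.
for step in sequence(failed={"digital_RPS"}):
    print(step)
```

    Enumerating such branches over all credible failure combinations is what makes the hazard identification systematic rather than brainstorming-driven.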

  8. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...
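
    The feedback-driven rate adaptation can be illustrated with a deliberately simple stand-in for BCH: a repetition code with majority voting, where the receiver uses the feedback channel to request more redundancy until decoding succeeds. This is only a toy of the adapt-via-feedback loop, not the BCH construction or the distributed-source-coding setting of the paper; the truth comparison below stands in for a real integrity check such as a CRC.

```python
import random

random.seed(1)

def bsc(bits, p):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def send_with_feedback(msg, p, max_rounds=8):
    """Incremental redundancy with a feedback channel: keep adding one
    more noisy copy of every bit until the majority-vote decode passes
    the check (comparison with the true message stands in for a CRC)."""
    votes = [0] * len(msg)   # running sum of received copies per bit
    sent = 0
    for r in range(1, max_rounds + 1):
        rx = bsc(msg, p)                 # one more copy of every bit
        sent += len(msg)
        votes = [v + b for v, b in zip(votes, rx)]
        decoded = [int(v * 2 > r) for v in votes]  # majority over r copies
        if decoded == msg:               # feedback: "got it, stop"
            return decoded, sent
    return decoded, sent

msg = [random.randint(0, 1) for _ in range(32)]
decoded, sent = send_with_feedback(msg, p=0.2)
print(decoded == msg, sent)
```

    The point of rate adaptation is visible in `sent`: clean channels stop after one round, noisy ones pay for extra rounds only when needed.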

  9. A novel neutron energy spectrum unfolding code using particle swarm optimization

    International Nuclear Information System (INIS)

    Shahabinejad, H.; Sohrabpour, M.

    2017-01-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. The Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code had previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO has been shown to be nearly two times faster than the TGASU code. - Highlights: • Introducing a novel method for neutron spectrum unfolding. • Implementation of a particle swarm optimization code for neutron unfolding. • Comparing results of the PSO code with those of the recently published TGASU code. • Matching results of the PSO code with those of the TGASU code. • Greater convergence rate of the implemented PSO code than the TGASU code.
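
    The unfolding task, recovering a spectrum f from a measured pulse-height distribution m = R f given the response matrix R, can be sketched with a bare-bones PSO minimizing the squared residual (the 3x3 matrix and swarm parameters below are toys, not those of the SDPSO code):

```python
import random

random.seed(0)

# Toy 3x3 response matrix R and a measurement m = R @ f_true.
R = [[0.8, 0.2, 0.0],
     [0.1, 0.7, 0.2],
     [0.0, 0.1, 0.9]]
f_true = [2.0, 1.0, 3.0]
m = [sum(R[i][j] * f_true[j] for j in range(3)) for i in range(3)]

def cost(f):
    """Squared residual between R @ f and the measurement m."""
    return sum((sum(R[i][j] * f[j] for j in range(3)) - m[i]) ** 2
               for i in range(3))

# Bare-bones particle swarm over non-negative 3-vectors.
n, w, c1, c2 = 30, 0.7, 1.5, 1.5
pos = [[random.uniform(0.0, 5.0) for _ in range(3)] for _ in range(n)]
vel = [[0.0] * 3 for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=cost)
for _ in range(200):
    for k in range(n):
        for j in range(3):
            vel[k][j] = (w * vel[k][j]
                         + c1 * random.random() * (pbest[k][j] - pos[k][j])
                         + c2 * random.random() * (gbest[j] - pos[k][j]))
            pos[k][j] = max(0.0, pos[k][j] + vel[k][j])  # keep spectrum non-negative
        if cost(pos[k]) < cost(pbest[k]):
            pbest[k] = pos[k][:]
    gbest = min(pbest, key=cost)
print([round(x, 2) for x in gbest])
```

    A real unfolding code works with far larger response matrices, detector noise, and regularization, but the inner loop is the same swarm update.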

  10. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, how to determine the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy work. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.
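
    For reference, the bound these codes saturate can be stated as follows (standard form of the entanglement-assisted quantum Singleton bound; the parameter convention $[[n,k,d;c]]$ denotes $n$ physical qubits, $k$ logical qubits, minimum distance $d$, and $c$ pre-shared entangled pairs):

```latex
% Entanglement-assisted quantum Singleton bound for an [[n,k,d;c]] code,
% valid in the range d <= (n+2)/2; codes meeting it with equality are
% entanglement-assisted MDS codes.
n + c - k \;\ge\; 2\,(d - 1), \qquad d \le \frac{n+2}{2}.
```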

  11. Evolutionary genetics

    National Research Council Canada - National Science Library

    Maynard Smith, John

    1989-01-01

    .... It differs from other textbooks of population genetics in applying the basic theory to topics, such as social behaviour, molecular evolution, reiterated DNA, and sex, which are the main subjects...

  12. Genetic Discrimination

    Science.gov (United States)

    ... Care Genomic Medicine Working Group New Horizons and Research Patient Management Policy and Ethics Issues Quick Links for Patient Care Education All About the Human Genome Project Fact Sheets Genetic Education Resources for ...

  13. Arthropod Genetics.

    Science.gov (United States)

    Zumwalde, Sharon

    2000-01-01

    Introduces an activity on arthropod genetics that involves phenotype and genotype identification of the creature and the construction process. Includes a list of required materials and directions to build a model arthropod. (YDS)

  14. A systems level approach reveals new gene regulatory modules in the developing ear

    OpenAIRE

    Chen, Jingchen; Tambalo, Monica; Barembaum, Meyer; Ranganathan, Ramya; Simões-Costa, Marcos; Bronner, Marianne E.; Streit, Andrea

    2017-01-01

    The inner ear is a complex vertebrate sense organ, yet it arises from a simple epithelium, the otic placode. Specification towards otic fate requires diverse signals and transcriptional inputs that act sequentially and/or in parallel. Using the chick embryo, we uncover novel genes in the gene regulatory network underlying otic commitment and reveal dynamic changes in gene expression. Functional analysis of selected transcription factors reveals the genetic hierarchy underlying the transition ...

  15. Desktop Genetics

    OpenAIRE

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-01-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learni...

  16. System-level analysis of genes and functions affecting survival during nutrient starvation in Saccharomyces cerevisiae.

    Science.gov (United States)

    Gresham, David; Boer, Viktor M; Caudy, Amy; Ziv, Naomi; Brandt, Nathan J; Storey, John D; Botstein, David

    2011-01-01

    An essential property of all cells is the ability to exit from active cell division and persist in a quiescent state. For single-celled microbes this primarily occurs in response to nutrient deprivation. We studied the genetic requirements for survival of Saccharomyces cerevisiae when starved for either of two nutrients: phosphate or leucine. We measured the survival of nearly all nonessential haploid null yeast mutants in mixed populations using a quantitative sequencing method that estimates the abundance of each mutant on the basis of frequency of unique molecular barcodes. Starvation for phosphate results in a population half-life of 337 hr whereas starvation for leucine results in a half-life of 27.7 hr. To measure survival of individual mutants in each population we developed a statistical framework that accounts for the multiple sources of experimental variation. From the identities of the genes in which mutations strongly affect survival, we identify genetic evidence for several cellular processes affecting survival during nutrient starvation, including autophagy, chromatin remodeling, mRNA processing, and cytoskeleton function. In addition, we found evidence that mitochondrial and peroxisome function is required for survival. Our experimental and analytical methods represent an efficient and quantitative approach to characterizing genetic functions and networks with unprecedented resolution and identified genotype-by-environment interactions that have important implications for interpretation of studies of aging and quiescence in yeast.
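
    The half-life figures quoted above come from fitting exponential decay to barcode-abundance time courses; a minimal version of that estimate from two timepoints (with made-up read counts chosen to land near the phosphate-starvation figure) is:

```python
import math

# Hypothetical barcode read counts for a mutant pool at two starvation timepoints.
t0, n0 = 0.0, 10000.0      # hours, reads
t1, n1 = 100.0, 8140.0     # hours, reads

# Exponential survival n(t) = n0 * exp(-lam * t) gives
# lam = ln(n0/n1) / (t1 - t0) and half-life ln(2) / lam.
lam = math.log(n0 / n1) / (t1 - t0)
half_life = math.log(2.0) / lam
print(round(half_life, 1))  # ~337 hr for this choice of counts
```

    The study fits many timepoints per mutant and models the experimental variation statistically; the two-point formula only shows where the half-life numbers come from.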

  17. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(M N log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
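
    The quoted reduction from O(M^3 N) to O(M N log N) can be made concrete by comparing the two operation counts for a representative problem size (illustrative arithmetic only; constants are ignored, as in any big-O comparison):

```python
import math

def naive_ops(M, N):
    """O(M^3 N): direct per-sample solve in the conventional approach."""
    return M ** 3 * N

def fft_ops(M, N):
    """O(M N log N): FFT-based frequency-domain solve in the ADMM framework."""
    return M * N * math.log2(N)

# Example: a dictionary of M = 64 filters on N = 256*256 signal samples.
M, N = 64, 256 * 256
print(round(naive_ops(M, N) / fft_ops(M, N)))  # speedup factor = M^2 / log2(N)
```

    For these sizes the asymptotic speedup factor is 64^2 / 16 = 256, which is why the frequency-domain solver dominates for realistic dictionary sizes.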

  18. Coded Network Function Virtualization

    DEFF Research Database (Denmark)

    Al-Shuwaili, A.; Simone, O.; Kliewer, J.

    2016-01-01

    Network function virtualization (NFV) prescribes the instantiation of network functions on general-purpose network devices, such as servers and switches. While yielding a more flexible and cost-effective network architecture, NFV is potentially limited by the fact that commercial off-the-shelf hardware is less reliable than the dedicated network elements used in conventional cellular deployments. The typical solution for this problem is to duplicate network functions across geographically distributed hardware in order to ensure diversity. In contrast, this letter proposes to leverage channel coding in order to enhance the robustness of NFV to hardware failure. The proposed approach targets the network function of uplink channel decoding, and builds on the algebraic structure of the encoded data frames in order to perform in-network coding on the signals to be processed at different servers...
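
    The letter's idea, adding a coded instance of the network function so that processing survives a server failure, can be sketched with simple XOR parity across data frames (a generic erasure-coding sketch, not the uplink-decoding construction itself):

```python
def xor_frames(a, b):
    """Bytewise XOR of two equal-length frames."""
    return bytes(x ^ y for x, y in zip(a, b))

# k data frames processed on k servers, plus one parity frame on a spare
# server; any single lost frame can be rebuilt from the survivors.
frames = [b"frame-A!", b"frame-B!", b"frame-C!"]
parity = frames[0]
for f in frames[1:]:
    parity = xor_frames(parity, f)

# Server holding frame B fails: reconstruct it from parity and the others.
survivors = [frames[0], frames[2], parity]
rebuilt = survivors[0]
for f in survivors[1:]:
    rebuilt = xor_frames(rebuilt, f)
print(rebuilt)  # b'frame-B!'
```

    Full duplication would need one spare per server; a single parity instance buys one-failure tolerance at a fraction of that cost, which is the trade-off the letter exploits.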

  19. The NIMROD Code

    Science.gov (United States)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  20. Expression profile of genes coding for carotenoid biosynthetic ...

    Indian Academy of Sciences (India)

    Expression profile of genes coding for carotenoid biosynthetic pathway during ripening and their association with accumulation of lycopene in tomato fruits. Shuchi Smita, Ravi Rajwanshi, Sangram Keshari Lenka, Amit Katiyar, Viswanathan Chinnusamy and Kailash Chander Bansal. J. Genet. 92, 363–368.

  1. Computer code FIT

    International Nuclear Information System (INIS)

    Rohmann, D.; Koehler, T.

    1987-02-01

    This is a description of the computer code FIT, written in FORTRAN-77 for a PDP 11/34. FIT is an interactive program to deduce position, width and intensity of lines of X-ray spectra (max. length of 4K channels). The lines (max. 30 lines per fit) may have Gauss- or Voigt-profile, as well as exponential tails. Spectrum and fit can be displayed on a Tektronix terminal. (orig.) [de
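
    FIT's basic task, deducing position, width, and intensity of a spectral line, can be approximated for a single clean peak by the moment method (a sketch only; FIT itself fits Gauss/Voigt profiles with exponential tails by least squares):

```python
# Toy spectrum: a single roughly Gaussian line on a channel axis.
counts = [0, 1, 4, 12, 25, 30, 25, 12, 4, 1, 0]

intensity = sum(counts)                                             # line area
position = sum(ch * c for ch, c in enumerate(counts)) / intensity   # centroid
variance = sum((ch - position) ** 2 * c
               for ch, c in enumerate(counts)) / intensity
width_fwhm = 2.3548 * variance ** 0.5   # FWHM = 2*sqrt(2 ln 2) * sigma

print(round(position, 2), round(width_fwhm, 2), intensity)
```

    Moment estimates like these are commonly used to seed an iterative profile fit with good starting values for position, width, and amplitude.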

  2. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
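
    The generative model under study, sparse latents drawn from a learned discrete prior and combined linearly into observations, can be sketched as a sampler (the value set, prior probabilities, and dictionary below are invented for illustration; the paper learns the prior from data via truncated EM):

```python
import random

random.seed(3)

values = [0, 1, 2]             # finite set of latent values (0 => inactive)
prior = [0.8, 0.15, 0.05]      # prior probabilities; mass on 0 makes latents sparse
W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # dictionary: 3 latents, 2-dim data

def sample_datapoint():
    """Draw discrete latents s_h ~ prior, emit x = sum_h s_h * W_h
    (observation noise omitted for clarity)."""
    s = [random.choices(values, weights=prior)[0] for _ in range(len(W))]
    x = [sum(s[h] * W[h][d] for h in range(len(W))) for d in range(2)]
    return s, x

s, x = sample_datapoint()
print(s, x)
```

    Because the prior is a free table over a finite value set, learning it amounts to estimating these probabilities, with no parametric form assumed, which is the flexibility the abstract emphasizes.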

  3. Code of Practice

    International Nuclear Information System (INIS)

    Doyle, Colin; Hone, Christopher; Nowlan, N.V.

    1984-05-01

    This Code of Practice introduces accepted safety procedures associated with the use of alpha, beta, gamma and X-radiation in secondary schools (pupils aged 12 to 18) in Ireland, and summarises good practice and procedures as they apply to radiation protection. Typical dose rates at various distances from sealed sources are quoted, and simplified equations are used to demonstrate dose and shielding calculations. The regulatory aspects of radiation protection are outlined, and references to statutory documents are given

  4. Tokamak simulation code manual

    International Nuclear Information System (INIS)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won

    1995-01-01

    The method to use TSC (Tokamak Simulation Code), developed by the Princeton Plasma Physics Laboratory, is illustrated. In the KT-2 tokamak, time-dependent simulation of axisymmetric toroidal plasma and vertical stability have to be taken into account in the design phase using TSC. In this report the physical modelling of TSC is described, and examples of application in JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs

  5. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  6. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, transparency, neutrality, impartiality, effectiveness, accountability, and legality. The normative context of public administration, as expressed in codes, seems to ignore the New Public Management and Reinventing Government reform movements....

  7. Orthopedics coding and funding.

    Science.gov (United States)

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic. Copyright © 2014. Published by Elsevier Masson SAS.

  8. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive, and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable-performance, and cache oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform agnostic modern code-development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  9. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  10. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  11. Open Genetic Code : On open source in the life sciences

    NARCIS (Netherlands)

    Deibel, E.

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life

  12. The genetic code – Thawing the 'frozen accident'

    Indian Academy of Sciences (India)

    Madhu

    2006-10-04

    Oct 4, 2006 ... The components foremost involved in this process are the tRNA molecules whose ... that tRNA species with dual identity (capable of being charged by two ... However, the theory has good support, as the CUG codon in ...

  13. Contrasting genetic influence of PON 1 coding gene polymorphisms ...

    African Journals Online (AJOL)

    Nadia Youssef Sadek Morcos

    2015-03-31

    Mar 31, 2015 ... toxicity primarily by inhibiting the enzyme acetylcholinesterase (AChE) [1]. ... 3. Methods. DNA was extracted from whole blood using a QIAamp Blood .... activity. It is not clear whether this is because of a decreased stability of ...

  14. RNA-DNA sequence differences spell genetic code ambiguities

    DEFF Research Database (Denmark)

    Bentin, Thomas; Nielsen, Michael L

    2013-01-01

    A recent paper in Science by Li et al. 2011(1) reports widespread sequence differences in the human transcriptome between RNAs and their encoding genes termed RNA-DNA differences (RDDs). The findings could add a new layer of complexity to gene expression but the study has been criticized. ...

  15. Origins of gene, genetic code, protein and life

    Indian Academy of Sciences (India)

    We have further presented the [GADV]-protein world hypothesis of the origin of life as well as a hypothesis of protein production, suggesting that proteins were originally produced by random peptide formation of amino acids restricted in specific amino acid compositions termed as GNC-, SNS- and GC-NSF(a)-0th order ...

  16. Origins of gene, genetic code, protein and life: comprehensive view ...

    Indian Academy of Sciences (India)

    Unknown

    production, suggesting that proteins were originally produced by random peptide formation of amino acids restricted in specific amino acid compositions .... using random numbers by a computer, to confirm whether main chains of ...... world on the origin of life by the pseudo-replication of. [GADV]-proteins in the absence of ...

  17. Coevolution mechanisms that adapt viruses to genetic code ...

    Indian Academy of Sciences (India)

    Recent work on virus × host inter- ... of long-term interdependent symbiotic relationship between them. ... Evolution in species of living organisms occurs based on the .... their parents (Francino and Ochman 1999; Lynn et al. 2002; ... dently some dozens of times. ... in the families of certain viruses, bacteria, fungi and inverte-.

  18. Cracking the Genetic Code | NIH MedlinePlus the Magazine

    Science.gov (United States)

    ... of interpretation, which is going to get gradually better over time, so that if somebody makes a discovery that happens to be relevant to you, you learn about it. For the complete transcript and video interview, visit http://www.pbs.org/newshour/ . Click ...

  19. Trends in genetic patent applications: The commercialization of academic intellectual property

    NARCIS (Netherlands)

    Kers, J.G.; van Burg, J.C.; Stoop, T.; Cornel, M.C.

    2014-01-01

    We studied trends in genetic patent applications in order to identify the trends in the commercialization of research findings in genetics. To define genetic patent applications, the European version (ECLA) of the International Patent Classification (IPC) codes was used. Genetic patent applications

  20. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through an exercise in plant application. The education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can also be utilized as a base technology for GEN IV reactor applications

  1. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves the burst-erasure protection capability by applying the convolution property to the tTN code, and reduces computational complexity by eliminating the multi-level structure. The simulation results show that the cTN code can provide better packet-loss protection performance with lower computational complexity than the tTN code.
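Tornado-style codes rest on sparse XOR parity equations decoded by "peeling". The sketch below is a deliberately tiny illustration of that erasure-coding principle, not the paper's cTN construction; the packet contents and parity equations are invented for the example.

```python
# Toy XOR erasure code with peeling decoding (illustrative only; the parity
# structure here is a hypothetical example, not the tTN or cTN design).

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

data = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity_eqs = [(0, 1), (1, 2, 3)]   # each parity packet XORs these data packets

def encode(data):
    parities = []
    for eq in parity_eqs:
        p = data[eq[0]]
        for i in eq[1:]:
            p = xor(p, data[i])
        parities.append(p)
    return parities

def peel_decode(received, parities):
    """received: dict index -> surviving data packet. Repeatedly resolve any
    parity equation that is missing exactly one member."""
    recovered = dict(received)
    progress = True
    while progress:
        progress = False
        for eq, p in zip(parity_eqs, parities):
            missing = [i for i in eq if i not in recovered]
            if len(missing) == 1:
                val = p
                for i in eq:
                    if i in recovered:
                        val = xor(val, recovered[i])
                recovered[missing[0]] = val
                progress = True
    return recovered

parities = encode(data)
received = {0: data[0], 2: data[2], 3: data[3]}   # packet 1 erased in transit
recovered = peel_decode(received, parities)
assert recovered[1] == b"pkt1"
```

A convolutional variant, as the abstract describes, spreads such parity relations over a sliding window of packets so that a burst of erasures touches fewer equations at once.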

  2. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

Full Text Available Syndrome coding using linear codes is a technique that improves the parameters of steganographic algorithms. Random linear codes give great flexibility in choosing the parameters of the linear code and, at the same time, allow easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented, using a random linear [8, 2] code as its base. The proposed algorithm was implemented and its parameters were evaluated in practice on test images. Keywords: steganography, random linear codes, RLC, LSB
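Syndrome (matrix) embedding, the technique the abstract applies with a random [8, 2] code, hides the n−k syndrome bits of the cover's LSB vector and flips a minimum-weight pattern to reach the desired syndrome. The sketch below follows that idea; the seed, the systematic parity-check matrix, and the helper names are illustrative assumptions, not the paper's actual matrix or implementation.

```python
# Sketch of syndrome embedding with an [8, 2] binary linear code: 8 cover
# LSBs carry n-k = 6 message bits. H's random part and the seed are
# hypothetical; a systematic form [I | R] keeps H full rank, so every
# syndrome is reachable.
import itertools

import numpy as np

n, r = 8, 6
rng = np.random.default_rng(7)
H = np.hstack([np.eye(r, dtype=int), rng.integers(0, 2, size=(r, n - r))])

def embed(cover_lsb, msg):
    """Flip a minimum-weight pattern e so that H @ (cover + e) = msg (mod 2)."""
    target = (msg - H @ cover_lsb) % 2
    for w in range(n + 1):                       # search by Hamming weight
        for idx in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(idx)] = 1
            if np.array_equal(H @ e % 2, target):
                return (cover_lsb + e) % 2

def extract(stego_lsb):
    return H @ stego_lsb % 2

cover = rng.integers(0, 2, size=n)
msg = np.array([1, 0, 1, 1, 0, 0])
stego = embed(cover, msg)
assert np.array_equal(extract(stego), msg)
```

Plain LSB replacement would overwrite all 8 bits to carry 8 message bits; syndrome coding trades a little payload for far fewer changed pixels, which is exactly the parameter improvement the abstract refers to.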

  3. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and in-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of PWR, BWR, CANDU and VVER reactors. It also provides an overview of the ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis, along with a test description, is provided for each test.

  4. A mathematical model of metabolism and regulation provides a systems-level view of how Escherichia coli responds to oxygen

    NARCIS (Netherlands)

    Ederer, M.; Steinsiek, S.; Stagge, S.; Rolfe, M.D.; ter Beek, A.; Knies, D.; Teixeira De Mattos, M.J.; Sauter, T.; Green, J.; Poole, R.K.; Bettenbrock, K.; Sawodny, O.

    2014-01-01

    The efficient redesign of bacteria for biotechnological purposes, such as biofuel production, waste disposal or specific biocatalytic functions, requires a quantitative systems-level understanding of energy supply, carbon, and redox metabolism. The measurement of transcript levels, metabolite

  5. J. Genet. classic 101

    Indian Academy of Sciences (India)

    Journal of Genetics, Vol. 85, No. 2, August 2006. 101. Page 2. J. Genet. classic. 102. Journal of Genetics, Vol. 85, No. 2, August 2006. Page 3. J. Genet. classic. Journal of Genetics, Vol. 85, No. 2, August 2006. 103. Page 4. J. Genet. classic. 104. Journal of Genetics, Vol. 85, No. 2, August 2006. Page 5. J. Genet. classic.

  6. J. Genet. classic 37

    Indian Academy of Sciences (India)

    Unknown

    Journal of Genetics, Vol. 84, No. 1, April 2005. 37. Page 2. J. Genet. classic. Journal of Genetics, Vol. 84, No. 1, April 2005. 38. Page 3. J. Genet. classic. Journal of Genetics, Vol. 84, No. 1, April 2005. 39. Page 4. J. Genet. classic. Journal of Genetics, Vol. 84, No. 1, April 2005. 40. Page 5. J. Genet. classic. Journal of ...

  7. Bistability in self-activating genes regulated by non-coding RNAs

    International Nuclear Information System (INIS)

    Miro-Bueno, Jesus

    2015-01-01

Non-coding RNA molecules are able to regulate gene expression and play an essential role in cells. Bistability, in turn, is an important behaviour of genetic networks. Here, we propose and study an ODE model in order to show how non-coding RNA can produce bistability in a simple way. The model comprises a single gene with positive feedback that is repressed by non-coding RNA molecules. We show how the values of all the reaction rates involved in the model are able to control the transitions between the high and low states. This model may help clarify the role of non-coding RNA molecules in genetic networks, and the results may also be of interest in synthetic biology for developing new genetic memories and biomolecular devices based on non-coding RNAs
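The bistability described above can be reproduced with a much-reduced toy model: one ODE for a self-activating gene with a Hill-type positive feedback term. This sketch deliberately omits an explicit ncRNA species, and all parameter values are invented for illustration; it only shows how such a system settles into two distinct stable states depending on the initial condition.

```python
# Toy bistable self-activating gene: dx/dt = basal + beta*x^2/(K^2+x^2) - delta*x.
# Parameters are invented; this is not the paper's model, just the generic
# positive-feedback mechanism it builds on.

def simulate(x0, t_end=50.0, dt=0.001):
    """Forward-Euler integration from initial level x0."""
    basal, beta, K, delta = 0.05, 4.0, 1.0, 1.0
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (basal + beta * x**2 / (K**2 + x**2) - delta * x)
    return x

low = simulate(0.1)    # starts below the unstable threshold -> low state
high = simulate(3.0)   # starts above the threshold -> high state
assert low < 0.5 < high   # two distinct stable steady states
```

In the abstract's full model, the ncRNA repression would shift this threshold, which is how reaction rates control the transitions between the high and low states.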

  8. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.
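The benefit of interleaving can be seen without any Reed-Solomon machinery: a depth-D block interleaver spreads a burst of up to D channel errors across D different codewords, leaving each word with at most one error. The sketch below is a generic textbook interleaver, not the Bleichenbacher et al. decoder; the codeword contents are invented.

```python
# Depth-3 block interleaver: transmit column by column so a burst of up to
# 3 consecutive symbol errors hits 3 different codewords.

def interleave(words):
    # words: list of equal-length codewords (rows); emit column by column.
    return [w[i] for i in range(len(words[0])) for w in words]

def deinterleave(stream, depth, length):
    return [[stream[i * depth + d] for i in range(length)] for d in range(depth)]

depth, length = 3, 7
words = [[w] * length for w in range(depth)]   # three dummy codewords
stream = interleave(words)

for i in range(8, 11):        # a burst corrupting 3 consecutive symbols
    stream[i] = 9

received = deinterleave(stream, depth, length)
for orig, recv in zip(words, received):
    errors = sum(a != b for a, b in zip(orig, recv))
    assert errors <= 1        # at most one error lands in each codeword
```

Decoding the interleaved words jointly, rather than one at a time as above, is what lets the cited algorithm push the correctable error count close to N-K.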

  9. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis for the operating PWRs as well as the PWRs under construction in Korea. TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. The semi-modular configuration used in developing the TASS code enables the user to easily implement new models. TASS code has been programmed in FORTRAN77, which makes it easy to install and port across different computer environments. The TASS code can be utilized for steady state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems and components, operator actions, and the transients caused by these malfunctions can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  10. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path for tackling the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored previously.

  11. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
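A receptive field code of the kind the abstract analyzes can be built in a few lines: each neuron fires for stimuli inside its interval, and each stimulus maps to a binary codeword. The centers and radius below are invented for illustration; the point is the abstract's observation that Hamming distance between RF codewords tracks the distance between the represented stimuli.

```python
# Toy 1D receptive field (RF) code: codeword bit i = 1 iff the stimulus
# falls inside neuron i's interval. Centers and radius are hypothetical.

def rf_codeword(stimulus, centers, radius):
    return [1 if abs(stimulus - c) < radius else 0 for c in centers]

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

centers = [i / 10 for i in range(11)]   # 11 neurons tiling [0, 1]
radius = 0.25

near = hamming(rf_codeword(0.32, centers, radius),
               rf_codeword(0.38, centers, radius))
far = hamming(rf_codeword(0.32, centers, radius),
              rf_codeword(0.90, centers, radius))
assert near < far   # nearby stimuli yield nearby codewords
```

Because overlapping receptive fields force nearby stimuli onto nearby codewords, the minimum distance of such a code is small, which is the structural reason the abstract finds their raw error-correcting capability poor.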

  12. Convolutional coding techniques for data protection

    Science.gov (United States)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
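The fundamentals the report covers can be made concrete with the standard textbook rate-1/2, constraint-length-3 encoder with generators (7, 5) in octal. This example is offered only as an illustration of convolutional encoding; it is not taken from the report itself.

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal:
# each input bit produces two output bits from the bit plus a 2-bit history.

def conv_encode(bits):
    s1 = s2 = 0                  # two-stage shift register, initially zero
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 7 = 111 (binary)
        out.append(b ^ s2)       # generator 5 = 101 (binary)
        s1, s2 = b, s1           # shift the register
    return out

assert conv_encode([1, 0, 1, 1]) == [1, 1, 1, 0, 0, 0, 0, 1]
```

The memory in the shift register is what distinguishes convolutional codes from block codes: each output pair depends on the current bit and the two before it, which a Viterbi decoder later exploits.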

  13. System-level tools and reconfigurable computing for next-generation HWIL systems

    Science.gov (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct the final system. The paper will present work on the integration of system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 × 1024 resolution at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.

  14. A Systems-Level Analysis Reveals Circadian Regulation of Splicing in Colorectal Cancer.

    Science.gov (United States)

    El-Athman, Rukeia; Fuhr, Luise; Relógio, Angela

    2018-06-20

Accumulating evidence points to a significant role of the circadian clock in the regulation of splicing in various organisms, including mammals. Both dysregulated circadian rhythms and aberrant pre-mRNA splicing are frequently implicated in human disease, in particular in cancer. To investigate the role of the circadian clock in the regulation of splicing in a cancer progression context at the systems level, we conducted a genome-wide analysis and compared the rhythmic transcriptional profiles of colon carcinoma cell lines SW480 and SW620, derived from primary and metastatic sites of the same patient, respectively. We identified spliceosome components and splicing factors with cell-specific circadian expression patterns including SRSF1, HNRNPLL, ESRP1, and RBM8A, as well as altered alternative splicing events and circadian alternative splicing patterns of output genes (e.g., VEGFA, NCAM1, FGFR2, CD44) in our cellular model. Our data reveals a remarkable interplay between the circadian clock and pre-mRNA splicing with putative consequences in tumor progression and metastasis. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    Science.gov (United States)

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Tinnitus: pathology of synaptic plasticity at the cellular and system levels

    Directory of Open Access Journals (Sweden)

    Matthieu J Guitton

    2012-03-01

Full Text Available Despite being more and more common, and having a high impact on the quality of life of sufferers, tinnitus does not yet have a cure. This has been mostly the result of limited knowledge of the biological mechanisms underlying this adverse pathology. However, the last decade has witnessed tremendous progress in our understanding of the pathophysiology of tinnitus. Animal models have demonstrated that tinnitus is a pathology of neural plasticity, and has two main components: a molecular, peripheral component related to the initiation phase of tinnitus; and a system-level, central component related to the long-term maintenance of tinnitus. Using the most recent experimental data and the molecular/system dichotomy as a framework, we describe here the biological basis of tinnitus. We then discuss these mechanisms from an evolutionary perspective, highlighting similarities with memory. Finally, we consider how these discoveries can translate into therapies, and we suggest operative strategies to design new and effective combined therapeutic solutions using both pharmacological (local and systemic) and behavioral tools (e.g., using tele-medicine and virtual reality settings).

  17. System-Level Testing of the Advanced Stirling Radioisotope Generator Engineering Hardware

    Science.gov (United States)

    Chan, Jack; Wiser, Jack; Brown, Greg; Florin, Dominic; Oriti, Salvatore M.

    2014-01-01

To support future NASA deep space missions, a radioisotope power system utilizing Stirling power conversion technology was under development. This development effort was performed under the joint sponsorship of the Department of Energy and NASA, until its termination at the end of 2013 due to budget constraints. The higher conversion efficiency of the Stirling cycle compared with that of the Radioisotope Thermoelectric Generators (RTGs) used in previous missions (Viking, Pioneer, Voyager, Galileo, Ulysses, Cassini, Pluto New Horizons and Mars Science Laboratory) offers the advantage of a four-fold reduction in Pu-238 fuel, thereby extending its limited domestic supply. As part of closeout activities, system-level testing of flight-like Advanced Stirling Convertors (ASCs) with a flight-like ASC Controller Unit (ACU) was performed in February 2014. This hardware is the most representative of the flight design tested to date. The test fully demonstrates the following ACU and system functionality: system startup; ASC control and operation at nominal and worst-case operating conditions; power rectification; DC output power management throughout nominal and out-of-range host voltage levels; ACU fault management; and system command/telemetry via the MIL-STD 1553 bus. This testing shows the viability of such a system for future deep space missions and bolsters confidence in the maturity of the flight design.

  18. System Level Analysis of a Water PCM HX Integrated into Orion's Thermal Control System

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Seth, Rubik; Ungar, Eugene

    2015-01-01

In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development, an Orion system-level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system in a 100 km Lunar orbit. The study verified the thermal model using a wax PCM and analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS), 2) use of 30/70 PGW versus 50/50 PGW, and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that for the assumed operating and boundary conditions utilizing a water PCM HX on Orion is not a viable option for any case. Additionally, it was found that the radiator area would have to be increased by at least 40% in order to support a viable water-based PCM HX.

  19. System Level Analysis of a Water PCM HX Integrated Into Orion's Thermal Control System Abstract

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Ungar, Eugene; Sheth, Rubik

    2015-01-01

In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development, an Orion system-level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system in a 100 km Lunar orbit. The study analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS), 2) use of 30/70 PGW versus 50/50 PGW, and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that for the assumed operating and boundary conditions utilizing a water PCM HX on Orion is not a viable option. Additionally, it was found that the radiator area would have to be increased by over 20% in order to have a viable water-based PCM HX.

  20. Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review

    Directory of Open Access Journals (Sweden)

    Jessie-Lee D. McIsaac

    2016-02-01

    Full Text Available Health promoting schools (HPS is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research to support the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best for whom, in what circumstance to create healthier schools and students.

  1. Calibration and Evaluation of Fixed and Mobile Relay-Based System Level Simulator

    Directory of Open Access Journals (Sweden)

    Shahid Mumtaz

    2010-01-01

Full Text Available Future wireless communication systems are expected to provide more stable and higher data rate transmissions throughout OFDMA networks, but the mobile stations (MSs) at the cell boundary experience poor spectral efficiency due to the path loss from the transmitting antenna and interference from adjacent cells. Therefore, satisfying the QoS (Quality of Service) requirements of each MS at the cell boundary has been an important issue. To resolve this spectral efficiency problem at the cell boundary, deploying relay stations has been actively considered. As multihop/relay involves complex interactions between the routing and medium access control decisions, the extent to which analytical expressions can be used to explore its benefits is limited. Consequently, simulations tend to be the preferred way of assessing the performance of relays. In this paper, we evaluate the performance of relay-assisted OFDMA networks by means of a system-level simulator (SLS). We consistently observed that the throughput is increased and the outage is decreased in the relay-assisted OFDMA network, which translates into range extension without any capacity penalty, for the realistic range of values of the propagation and other system parameters investigated.

  2. On-Site Renewable Energy and Green Buildings: A System-Level Analysis.

    Science.gov (United States)

    Al-Ghamdi, Sami G; Bilec, Melissa M

    2016-05-03

Adopting a green building rating system (GBRS) that strongly considers use of renewable energy can have important environmental consequences, particularly in developing countries. In this paper, we studied on-site renewable energy and GBRSs at the system level to explore potential benefits and challenges. While we have focused on GBRSs, the findings can offer additional insight for renewable incentives across sectors. An energy model was built for 25 sites to compute the potential solar and wind power production on-site, given the available building footprint and regional climate. A life-cycle approach and cost analysis were then completed to analyze the environmental and economic impacts. Environmental impacts of renewable energy varied dramatically between sites; in some cases, the environmental benefits were limited despite the significant economic burden of those on-site renewable systems, and vice versa. Our recommendation for GBRSs, and broader policies and regulations, is to require buildings with higher environmental impacts to achieve higher levels of energy performance and on-site renewable energy utilization, instead of fixed percentages.

  3. Metabolic Compartmentation – A System Level Property of Muscle Cells

    Directory of Open Access Journals (Sweden)

    Theo Wallimann

    2008-05-01

Full Text Available Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for the living cells are discussed. It is shown that the formal theoretical analysis of diffusion of metabolites based on Fick's equation and using fixed diffusion coefficients for diluted homogenous aqueous solutions, but applied for biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions, which are contradictory to most biological observations. However, if the same theoretical methods are used for analysis of actual experimental data, the apparent diffusion constants obtained are orders of magnitude lower than those in diluted aqueous solutions. Thus, it can be concluded that local restrictions of diffusion of metabolites in a cell are system-level properties caused by complex structural organization of the cells, macromolecular crowding, cytoskeletal networks and organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, their channeling between enzymes and in modular organization of cellular metabolic networks. The perspectives of further studies of these complex intracellular interactions in the framework of Systems Biology are discussed.

  4. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

Full Text Available The complexity of today's embedded electronic systems as well as their demanding performance and reliability requirements are such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system-level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  5. Heightened systemic levels of neutrophil and eosinophil granular proteins in pulmonary tuberculosis and reversal following treatment.

    Science.gov (United States)

    Moideen, Kadar; Kumar, Nathella Pavan; Nair, Dina; Banurekha, Vaithilingam V; Bethunaickan, Ramalingam; Babu, Subash

    2018-04-09

    Granulocytes are activated during tuberculosis (TB) infection and act as immune effector cells, and granulocyte responses are implicated in TB pathogenesis. Plasma levels of neutrophil and eosinophil granular proteins provide an indirect measure of degranulation. In this study, we wanted to examine the levels of neutrophil and eosinophil granular proteins in individuals with pulmonary tuberculosis (PTB) and to compare them with the levels in latent TB (LTB) individuals. Hence, we measured the plasma levels of myeloperoxidase (MPO), neutrophil elastase, and proteinase-3; and of major basic protein (MBP), eosinophil-derived neurotoxin (EDN), eosinophil cationic protein (ECP) and eosinophil peroxidase (EPX) in these individuals. Finally, we also measured the levels of all of these parameters in PTB individuals following anti-tuberculosis treatment (ATT). Our data reveal that PTB individuals are characterized by significantly higher plasma levels of MPO, elastase and proteinase-3, as well as MBP and EDN, in comparison to LTB individuals. Our data also reveal that ATT resulted in reversal of all of these changes, indicating an association with TB disease. Finally, our data show that the systemic levels of MPO and proteinase-3 can significantly discriminate PTB from LTB individuals. Thus, our data suggest that neutrophil and eosinophil granular proteins could play a potential role in the innate immune response and, therefore, in the pathogenesis of pulmonary TB. Copyright © 2018 American Society for Microbiology.

  6. Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review

    Science.gov (United States)

    McIsaac, Jessie-Lee D.; Hernandez, Kimberley J.; Kirk, Sara F.L.; Curran, Janet A.

    2016-01-01

    Health promoting schools (HPS) is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research that supports the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer-reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best, for whom and in what circumstances, to create healthier schools and students. PMID:26861376

  7. Annotating pathogenic non-coding variants in genic regions.

    Science.gov (United States)

    Gelfman, Sahar; Wang, Quanli; McSweeney, K Melodi; Ren, Zhong; La Carpia, Francesca; Halvorsen, Matt; Schoch, Kelly; Ratzon, Fanni; Heinzen, Erin L; Boland, Michael J; Petrovski, Slavé; Goldstein, David B

    2017-08-09

    Identifying the underlying causes of disease requires accurate interpretation of genetic variants. Current methods ineffectively capture pathogenic non-coding variants in genic regions, so synonymous and intronic variants are often overlooked when searching for disease risk. Here we present the Transcript-inferred Pathogenicity (TraP) score, which uses sequence context alterations to reliably identify non-coding variation that causes disease. High TraP scores single out extremely rare variants with lower minor allele frequencies than missense variants. TraP accurately distinguishes known pathogenic and benign variants in synonymous (AUC = 0.88) and intronic (AUC = 0.83) public datasets, dismissing benign variants with exceptionally high specificity. TraP analysis of 843 exomes from epilepsy family trios identifies synonymous variants in known epilepsy genes, thus pinpointing risk factors of disease from non-coding sequence data. TraP outperforms leading methods in identifying non-coding variants that are pathogenic and is therefore a valuable tool for use in gene discovery and the interpretation of personal genomes. While non-coding synonymous and intronic variants are often not under strong selective constraint, they can be pathogenic by affecting splicing or transcription. Here, the authors develop a score that uses sequence context alterations to predict the pathogenicity of synonymous and non-coding genetic variants, and provide a web server of pre-computed scores.
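
    For readers unfamiliar with the reported AUC figures (0.88 synonymous, 0.83 intronic): the ROC AUC is the probability that a randomly chosen pathogenic variant receives a higher score than a randomly chosen benign one. A minimal sketch with invented toy scores (not real TraP outputs):

```python
# ROC AUC via the Mann-Whitney statistic: the fraction of
# pathogenic/benign pairs in which the pathogenic variant outscores the
# benign one (ties count one half).  Scores are toy values.

def roc_auc(pathogenic_scores, benign_scores):
    wins = 0.0
    for p in pathogenic_scores:
        for b in benign_scores:
            if p > b:
                wins += 1.0
            elif p == b:
                wins += 0.5
    return wins / (len(pathogenic_scores) * len(benign_scores))

auc = roc_auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.3, 0.2, 0.1])
print(auc)  # → 0.9375 (15 of 16 pairs ranked correctly)
```

An AUC of 0.5 would mean the score carries no ranking information; values such as 0.88 mean most pathogenic/benign pairs are ordered correctly.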

  8. High Energy Transport Code HETC

    International Nuclear Information System (INIS)

    Gabriel, T.A.

    1985-09-01

    The physics models contained in the High Energy Transport Code (HETC), in particular the collision models, are discussed. An application using HETC as part of the CALOR code system is also given. 19 refs., 5 figs., 3 tabs

  9. Code stroke in Asturias.

    Science.gov (United States)

    Benavente, L; Villanueva, M J; Vega, P; Casado, I; Vidal, J A; Castaño, B; Amorín, M; de la Vega, V; Santos, H; Trigo, A; Gómez, M B; Larrosa, D; Temprano, T; González, M; Murias, E; Calleja, S

    2016-04-01

    Intravenous thrombolysis with alteplase is an effective treatment for ischaemic stroke when applied during the first 4.5 hours, but less than 15% of patients have access to this technique. Mechanical thrombectomy is more frequently able to recanalise proximal occlusions in large vessels, but the infrastructure it requires makes it even less available. We describe the implementation of code stroke in Asturias, as well as the process of adapting various existing resources for urgent stroke care in the region. By considering these resources, and the demographic and geographic circumstances of our region, we examine ways of reorganising the code stroke protocol that would optimise treatment times and provide the most appropriate treatment for each patient. We organised the 8 health districts in Asturias so as to permit referral of candidates for reperfusion therapies to either of the 2 hospitals with 24-hour stroke units and on-call neurologists able to provide IV fibrinolysis. Hospitals were assigned according to proximity and stroke severity; the most severe cases were immediately referred to the hospital with on-call interventional neurology care. Patient triage was provided by pre-hospital emergency services according to the NIHSS score. Modifications to code stroke in Asturias have allowed us to apply reperfusion therapies with good results, while emphasising equitable care and managing the severity-time ratio to offer the best and safest treatment for each patient as soon as possible. Copyright © 2015 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.
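
    The severity/proximity routing described above can be sketched as follows. The NIHSS cut-off and hospital names are hypothetical, invented for illustration; they are not the actual Asturias protocol parameters.

```python
# Hypothetical sketch of NIHSS-based pre-hospital triage: mild candidates
# go to the nearest stroke-unit hospital for IV fibrinolysis; severe cases
# go straight to the centre with on-call interventional neurology.

SEVERE_NIHSS = 10  # assumed threshold for suspected large-vessel occlusion

def triage(nihss, nearest_ivt_hospital, thrombectomy_hospital):
    """Destination chosen by pre-hospital emergency services."""
    if nihss >= SEVERE_NIHSS:
        # Severity dominates proximity: refer immediately to the hospital
        # with thrombectomy capability.
        return thrombectomy_hospital
    # Milder cases: nearest hospital with a 24-hour stroke unit, for
    # IV fibrinolysis within the 4.5-hour window.
    return nearest_ivt_hospital

print(triage(16, "hospital_A", "hospital_B"))  # → hospital_B
print(triage(4, "hospital_A", "hospital_B"))   # → hospital_A
```

The design choice mirrors the text: the severity-time trade-off justifies bypassing the nearest hospital only when the deficit suggests a proximal large-vessel occlusion.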

  10. Decoding Xing-Ling codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2002-01-01

    This paper describes an efficient decoding method for a recent construction of good linear codes as well as an extension to the construction. Furthermore, asymptotic properties and list decoding of the codes are discussed.

  11. WWER reactor physics code applications

    International Nuclear Information System (INIS)

    Gado, J.; Kereszturi, A.; Gacs, A.; Telbisz, M.

    1994-01-01

    The coupled steady-state reactor physics and thermohydraulics code system KARATE has been developed and applied for WWER-1000 and WWER-440 operational calculations. The 3-D coupled kinetics code KIKO3D has been developed and validated for WWER-440 accident analysis applications. The coupled kinetics code SMARTA, developed by VTT Helsinki, has been applied for WWER-440 accident analysis. The paper gives a summary of the experience in code development and application. (authors). 10 refs., 2 tabs., 5 figs

  12. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use, with very little effort.
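
    The "error-prone constructs" point can be made concrete with a minimal custom check built on Python's ast module (an illustrative toy, not one of the linters discussed in the talk): a mechanical pass flags bare `except:` clauses so reviewers never have to.

```python
# Minimal custom lint check: walk the syntax tree and report the line
# numbers of bare `except:` handlers, a classic error-prone construct
# (they silently swallow every exception, including KeyboardInterrupt).

import ast

def find_bare_excepts(source):
    """Return 1-based line numbers of bare `except:` handlers."""
    tree = ast.parse(source)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

sample = """\
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(sample))  # → [3]
```

Real linters such as flake8 or pylint bundle hundreds of checks like this one, which is exactly why wiring them into your editor offloads the style-vigilance burden.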

  13. The CORSYS neutronics code system

    International Nuclear Information System (INIS)

    Caner, M.; Krumbein, A.D.; Saphier, D.; Shapira, M.

    1994-01-01

    The purpose of this work is to assemble a code package for LWR core physics including coupled neutronics, burnup and thermal hydraulics. The CORSYS system is built around the cell code WIMS (for group microscopic cross section calculations) and the 3-dimensional diffusion code CITATION (for burnup and fuel management). We are implementing such a system on an IBM RS-6000 workstation. The code was tested with a simplified model of the Zion Unit 2 PWR. (authors). 6 refs., 8 figs., 1 tab

  14. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M. Jr.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment
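
    Part of what makes bar codes reliable enough for accountability data is built-in redundancy that readers verify on every scan. As one concrete example, here is the Code 128 (subset B) check digit calculation, a weighted sum modulo 103; this is a generic sketch of the symbology, not tied to any particular safeguards system.

```python
# Code 128 subset B check digit: start value plus position-weighted
# symbol values, modulo 103.  Subset B maps printable ASCII ch to the
# symbol value ord(ch) - 32; the Start B symbol has value 104.

START_B = 104

def code128_checksum(data):
    """Check symbol value for a Code 128 subset B message."""
    total = START_B
    for position, ch in enumerate(data, start=1):
        if not 32 <= ord(ch) <= 127:
            raise ValueError("subset B encodes printable ASCII only")
        total += position * (ord(ch) - 32)
    return total % 103

print(code128_checksum("AB"))  # → 102
```

A reader recomputes this value from the bars it decodes and rejects the scan if it disagrees with the printed check symbol, which is why mis-reads are rare even on worn labels.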

  15. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially-available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  16. Quick response codes in Orthodontics

    Directory of Open Access Journals (Sweden)

    Moidin Shakil

    2015-01-01

    Full Text Available Quick response (QR) codes are two-dimensional barcodes that encode a large amount of information. QR codes in Orthodontics are an innovative approach in which patient details, radiographic interpretation, and treatment plan can be encoded. Implementing QR codes in Orthodontics will save time, reduce paperwork, and minimize manual effort in the storage and retrieval of patient information during subsequent stages of treatment.

  17. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...
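
    The parity checks at the heart of LDPC(A) decoding can be illustrated on a deliberately tiny code (a [7,4] Hamming code rather than an LDPCA-sized sparse matrix): a received word x is consistent with the code iff H·x = 0 (mod 2), and for this particular H the nonzero syndrome directly names a single flipped bit position.

```python
# Syndrome check H·x = 0 (mod 2) on a toy [7,4] Hamming code.  Real
# LDPC(A) codes use much larger, sparser H and soft (probabilistic)
# message passing, but the parity-check principle is the same.

H = [
    [1, 0, 1, 0, 1, 0, 1],   # checks bit positions 1,3,5,7 (1-based)
    [0, 1, 1, 0, 0, 1, 1],   # checks bit positions 2,3,6,7
    [0, 0, 0, 1, 1, 1, 1],   # checks bit positions 4,5,6,7
]

def syndrome(word):
    """Return H·word mod 2, packed so the value names the error position."""
    bits = [sum(h * x for h, x in zip(row, word)) % 2 for row in H]
    return bits[0] + 2 * bits[1] + 4 * bits[2]

codeword = [1, 0, 1, 1, 0, 1, 0]   # a valid Hamming(7,4) codeword
assert syndrome(codeword) == 0     # all parity checks satisfied

received = codeword[:]
received[4] ^= 1                   # channel flips bit 5 (1-based)
print(syndrome(received))          # → 5: the syndrome locates the flip
```

An LDPC decoder generalizes this idea: instead of a lookup, it iteratively passes soft reliability information between check and variable nodes until all parity checks are satisfied.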

  18. Cinder begin creative coding

    CERN Document Server

    Rijnieks, Krisjanis

    2013-01-01

    Presented in an easy-to-follow, tutorial-style format, this book will lead you step-by-step through the multi-faceted uses of Cinder. "Cinder: Begin Creative Coding" is for people who already have experience in programming. It can serve as a transition from a previous background in Processing, Java in general, JavaScript, openFrameworks, C++ in general, or ActionScript to the framework covered in this book, namely Cinder. If you like quick and easy-to-follow tutorials that will let you see progress in less than an hour - this book is for you. If you are searching for a book that will explain al

  19. UNSPEC: revisited (semaphore code)

    International Nuclear Information System (INIS)

    Neifert, R.D.

    1981-01-01

    The UNSPEC code is used to solve the problem of unfolding an observed x-ray spectrum given the response matrix of the measuring system and the measured signal values. UNSPEC uses an iterative technique to solve the unfold problem. Due to experimental errors in the measured signal values and/or computer round-off errors, discontinuities and oscillatory behavior may occur in the iterated spectrum. These can be suppressed by smoothing the results after each iteration. Input/output options and control cards are explained; sample input and output are provided
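
    The iterate-then-smooth loop described above can be sketched as follows. The update rule here is a generic Van Cittert-style step (the actual UNSPEC update rule and smoother may differ), with a 3-point smoothing pass after each iteration to damp the oscillations that measurement noise and round-off introduce, at some cost in recovered resolution; R and m are invented toy values.

```python
# Iterative spectrum unfolding sketch: repeat  s <- smooth(s + (m - R*s))
# where R is the detector response matrix and m the measured signal.

def matvec(R, s):
    return [sum(r * x for r, x in zip(row, s)) for row in R]

def smooth(s):
    """3-point moving average; endpoints are kept unchanged."""
    n = len(s)
    return [s[i] if i in (0, n - 1)
            else (s[i - 1] + s[i] + s[i + 1]) / 3.0
            for i in range(n)]

def unfold(R, m, iterations=50):
    """Unfold measured signal m given response matrix R."""
    s = m[:]                                   # initial guess: the data
    for _ in range(iterations):
        residual = [mi - ri for mi, ri in zip(m, matvec(R, s))]
        s = smooth([si + d for si, d in zip(s, residual)])
    return s

# A mildly blurring 3-channel response; smoothing stabilises the iterates.
R = [[0.8, 0.2, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.2, 0.8]]
m = [0.6, 1.0, 0.6]
print([round(x, 3) for x in unfold(R, m)])
```

Without the smoothing step, the iteration amplifies any component of the measurement that the response matrix maps weakly, which is exactly the discontinuity and oscillation problem the abstract describes.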

  20. The FLIC conversion codes

    International Nuclear Information System (INIS)

    Basher, J.C.

    1965-05-01

    This report describes the FORTRAN programmes, FLIC 1 and FLIC 2. These programmes convert programmes coded in one dialect of FORTRAN to another dialect of the same language. FLIC 1 is a general pattern recognition and replacement programme whereas FLIC 2 contains extensions directed towards the conversion of FORTRAN II and S2 programmes to EGTRAN 1 - the dialect now in use on the Winfrith KDF9. FII or S2 statements are replaced where possible by their E1 equivalents; other statements which may need changing are flagged. (author)
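
    FLIC 1's pattern-recognition-and-replacement pass can be sketched in modern terms (the rules below are invented examples, not FLIC's actual conversion tables): statements with a known equivalent in the target dialect are rewritten in place, while statements that may need changing but have no direct equivalent are flagged for hand conversion.

```python
# Dialect-conversion sketch: apply pattern/replacement rules to each
# source statement; flag statements that need manual attention.

import re

RULES = [
    # (pattern, replacement): statements with a direct equivalent.
    # The statement names are hypothetical dialect examples.
    (re.compile(r"^(\s*)ACCEPT\b"), r"\1READ"),
    (re.compile(r"^(\s*)TYPE\b"), r"\1PRINT"),
]
# No direct equivalent: flag for hand conversion instead of rewriting.
NEEDS_REVIEW = re.compile(r"^\s*ASSIGN\b")

def convert(line):
    """Return (converted_line, flagged) for one source statement."""
    for pattern, replacement in RULES:
        if pattern.match(line):
            return pattern.sub(replacement, line, count=1), False
    return line, bool(NEEDS_REVIEW.match(line))

print(convert("      TYPE 100, X"))     # → ('      PRINT 100, X', False)
print(convert("      ASSIGN 10 TO K"))  # → ('      ASSIGN 10 TO K', True)
```

Anchoring each pattern at the start of the statement (after optional whitespace) mirrors the column-sensitive layout of fixed-form FORTRAN, where the statement keyword follows the label field.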