WorldWideScience

Sample records for system-level genetic codes

  1. Symmetries in Genetic Systems and the Concept of Geno-Logical Coding

    Directory of Open Access Journals (Sweden)

    Sergey V. Petoukhov

    2016-12-01

    Full Text Available The genetic code of amino acid sequences in proteins does not allow understanding and modeling of inherited processes such as inborn coordinated motions of living bodies, innate principles of sensory information processing, quasi-holographic properties, etc. To be able to model these phenomena, the concept of geno-logical coding, which is connected with logical functions and Boolean algebra, is put forward. The article describes basic pieces of evidence in favor of the existence of the geno-logical code, which exists in parallel with the known genetic code of amino acid sequences but which serves for transferring inherited processes along chains of generations. These pieces of evidence have been obtained through the analysis of symmetries in structures of molecular-genetic systems. The analysis has revealed a close connection of the genetic system with dyadic groups of binary numbers and with other mathematical objects related to dyadic groups: Walsh functions (which are algebraic characters of dyadic groups), bit-reversal permutations, logical holography, etc. These results provide a new approach for mathematical modeling of genetic structures, which uses known mathematical formalisms from technological fields of noise-immunity coding of information, binary analysis, logical holography, and digital devices of artificial intelligence. Some opportunities for the development of algebraic-logical biology are opened.
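
    The dyadic-group connection above can be made concrete with a few lines of code. The sketch below (Python, assuming numpy is available) binarises the RNA alphabet with two standard attributes (purine/pyrimidine and strong/weak pairing), maps a codon to an element of the dyadic group {0,1}^3, builds a Sylvester-type Walsh-Hadamard matrix by Kronecker products, and lists a bit-reversal permutation. The attribute choices, 0/1 assignments and function names are illustrative assumptions, not Petoukhov's exact numbering.

```python
# Illustrative sketch only: one conventional binarisation of the RNA alphabet
# together with a Sylvester-type Walsh-Hadamard matrix built by Kronecker
# products and a bit-reversal permutation of triplet indices.
import numpy as np

PURINE = {"A": 1, "G": 1, "C": 0, "U": 0}   # purine vs pyrimidine
STRONG = {"G": 1, "C": 1, "A": 0, "U": 0}   # 3 vs 2 hydrogen bonds

def codon_to_dyadic(codon, attribute):
    """Map a triplet to an element of the dyadic group {0,1}^3 (as an int)."""
    bits = [attribute[b] for b in codon]
    return bits[0] * 4 + bits[1] * 2 + bits[2]

def walsh_hadamard(n):
    """Sylvester construction: rows are Walsh functions in Hadamard order."""
    h = np.array([[1]])
    core = np.array([[1, 1], [1, -1]])
    for _ in range(n):
        h = np.kron(h, core)
    return h

def bit_reversal_permutation(n_bits):
    """Permutation that reverses the binary digits of each index."""
    size = 1 << n_bits
    return [int(format(i, f"0{n_bits}b")[::-1], 2) for i in range(size)]

if __name__ == "__main__":
    print("GAU ->", codon_to_dyadic("GAU", PURINE), "(purine/pyrimidine sub-alphabet)")
    print(walsh_hadamard(3))                  # 8x8 matrix indexed by triplets
    print(bit_reversal_permutation(3))        # [0, 4, 2, 6, 1, 5, 3, 7]
```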

  2. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By
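
    A minimal sketch of the error-robustness measure described in this abstract: the mean squared change of a hydrophobicity value over all single-base substitutions, compared against random codes obtained by shuffling the amino acids assigned to the synonymous codon blocks. The Kyte-Doolittle hydropathy scale, the exclusion of stop codons and the block-shuffling null model are common choices assumed here, not necessarily those of the paper.

```python
# Sketch of a "mean square" error-robustness measure for a genetic code,
# using the Kyte-Doolittle hydropathy scale as a proxy for amino acid
# character.  Stop codons are excluded from the sum, and random alternative
# codes are built by shuffling which amino acid each codon block encodes.
import random

BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG")
CODE = {a + b + c: AMINO[16 * i + 4 * j + k]
        for i, a in enumerate(BASES)
        for j, b in enumerate(BASES)
        for k, c in enumerate(BASES)}
HYDRO = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
         "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
         "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
         "Y": -1.3, "V": 4.2}

def mean_square_error(code):
    """Mean squared hydropathy change over all single-base substitutions."""
    diffs = []
    for codon, aa in code.items():
        if aa == "*":
            continue
        for pos in range(3):
            for base in BASES:
                if base == codon[pos]:
                    continue
                mutant = code[codon[:pos] + base + codon[pos + 1:]]
                if mutant == "*":
                    continue
                diffs.append((HYDRO[aa] - HYDRO[mutant]) ** 2)
    return sum(diffs) / len(diffs)

def random_block_code(code):
    """Shuffle which amino acid each synonymous codon block encodes."""
    aas = sorted(set(code.values()) - {"*"})
    perm = dict(zip(aas, random.sample(aas, len(aas))))
    return {c: (aa if aa == "*" else perm[aa]) for c, aa in code.items()}

if __name__ == "__main__":
    natural = mean_square_error(CODE)
    samples = [mean_square_error(random_block_code(CODE)) for _ in range(1000)]
    better = sum(s < natural for s in samples)
    print(f"standard code MS = {natural:.2f}; "
          f"{better} of 1000 random block codes do better")
```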

  3. Codon size reduction as the origin of the triplet genetic code.

    Directory of Open Access Journals (Sweden)

    Pavel V Baranov

    Full Text Available The genetic code appears to be optimized in its robustness to missense errors and frameshift errors. In addition, the genetic code is near-optimal in terms of its ability to carry information in addition to the sequences of encoded proteins. As evolution has no foresight, optimality of the modern genetic code suggests that it evolved from less optimal code variants. The length of codons in the genetic code is also optimal, as three is the minimal nucleotide combination that can encode the twenty standard amino acids. The apparent impossibility of transitions between codon sizes in a discontinuous manner during evolution has resulted in an unbending view that the genetic code was always triplet. Yet, recent experimental evidence on quadruplet decoding, as well as the discovery of organisms with ambiguous and dual decoding, suggest that the possibility of the evolution of triplet decoding from living systems with non-triplet decoding merits reconsideration and further exploration. To explore this possibility we designed a mathematical model of the evolution of primitive digital coding systems which can decode nucleotide sequences into protein sequences. These coding systems can evolve their nucleotide sequences via genetic events of Darwinian evolution, such as point-mutations. The replication rates of such coding systems depend on the accuracy of the generated protein sequences. Computer simulations based on our model show that decoding systems with codons of length greater than three spontaneously evolve into predominantly triplet decoding systems. Our findings suggest a plausible scenario for the evolution of the triplet genetic code in a continuous manner. This scenario suggests an explanation of how protein synthesis could be accomplished by means of long RNA-RNA interactions prior to the emergence of the complex decoding machinery, such as the ribosome, that is required for stabilization and discrimination of otherwise weak triplet codon

  4. Genetic coding and gene expression - new Quadruplet genetic coding model

    Science.gov (United States)

    Shankar Singh, Rama

    2012-07-01

    The successful completion of the Human Genome Project has opened the door not only to developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may make the 21st century a century of the biological sciences as well. Based on the central dogma of biology, genetic codons, in conjunction with tRNA, play a key role in translating the RNA bases into the sequence of amino acids that yields a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons involving three bases of RNA, transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new quadruplet genetic coding model and its subsequent potential applications, including relevance to the origin of life, will be presented.

  5. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

    Full Text Available The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  6. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  7. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
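
    The quasispecies computation at the heart of CMC models can be illustrated with a toy example: the equilibrium genotype distribution is the leading eigenvector of the mutation-selection matrix W = Q·diag(f). The sketch below is not CMCpy's API; the 2-site binary genotype space, per-site error rate and fitness values are purely illustrative, and a plain power iteration stands in for the solvers the package implements.

```python
# Minimal quasispecies sketch: equilibrium genotype frequencies and mean
# fitness from the leading eigenpair of W = Q diag(f).
import numpy as np

def mutation_matrix(n_sites, mu):
    """Q[i, j] = probability that genotype j replicates into genotype i."""
    size = 2 ** n_sites
    q = np.empty((size, size))
    for i in range(size):
        for j in range(size):
            d = bin(i ^ j).count("1")            # Hamming distance
            q[i, j] = (mu ** d) * ((1 - mu) ** (n_sites - d))
    return q

def leading_eigenpair(w, iters=1000):
    """Power iteration for the dominant eigenvalue/eigenvector of w."""
    x = np.ones(w.shape[0]) / w.shape[0]
    lam = 0.0
    for _ in range(iters):
        y = w @ x
        lam = y.sum()        # valid normalisation: entries stay non-negative
        x = y / lam
    return lam, x

if __name__ == "__main__":
    fitness = np.array([2.0, 1.0, 1.0, 0.5])     # master sequence is fittest
    w = mutation_matrix(2, mu=0.05) @ np.diag(fitness)
    mean_fitness, freqs = leading_eigenpair(w)
    print("mean fitness:", round(mean_fitness, 4))
    print("equilibrium frequencies:", np.round(freqs, 4))
```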

  8. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  9. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
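
    One of the code properties such a toolkit tests, comma-freeness, is simple enough to sketch directly: a set of codons is comma-free if no member appears in a shifted reading frame of any two concatenated members. The snippet below is an independent toy check, not GCAT's implementation; circularity needs a more involved test that the toolkit also provides.

```python
# Toy comma-freeness test for a set of trinucleotide codons: no codon of the
# set may appear in a shifted frame of any concatenation of two set members.
from itertools import product

def is_comma_free(codons):
    codon_set = set(codons)
    for c1, c2 in product(codon_set, repeat=2):
        pair = c1 + c2
        if pair[1:4] in codon_set or pair[2:5] in codon_set:
            return False
    return True

if __name__ == "__main__":
    # Tiny hand-picked sets; real analyses use much larger codon sets.
    print(is_comma_free({"GCU", "GAC", "GGU"}))   # True
    print(is_comma_free({"AAA", "AAU"}))          # False: AAA+AAA contains AAA shifted
```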

  10. Arbitrariness is not enough: towards a functional approach to the genetic code.

    Science.gov (United States)

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently, it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than that of the chains of nucleic bases. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only one possible, and we propose a more appropriate model from a semiotic point of view.

  11. Genetic coding and united-hypercomplex systems in the models of algebraic biology.

    Science.gov (United States)

    Petoukhov, Sergey V

    2017-08-01

    Structured alphabets of DNA and RNA, in their matrix form of representation, are connected with Walsh functions and a new type of system of multidimensional numbers. This type generalizes systems of complex numbers and hypercomplex numbers, which serve as the basis of mathematical natural sciences and many technologies. The new systems of multi-dimensional numbers have interesting mathematical properties and are called, in the general case, "systems of united-hypercomplex numbers" (or briefly "U-hypercomplex numbers"). They can be widely used in models of multi-parametrical systems in the field of algebraic biology, artificial life, devices of biologically inspired artificial intelligence, etc. In particular, an application of U-hypercomplex numbers reveals hidden properties of genetic alphabets under cyclic permutations in their doublets and triplets. Special attention is devoted to the author's hypothesis about multi-linguistics in DNA sequences in relation to an ensemble of U-numerical sub-alphabets. Genetic multi-linguistics is considered an important factor in providing the noise-immunity properties of multi-channel genetic coding. Our results attest to the conformity of the algebraic properties of the U-numerical systems with phenomenological properties of the DNA alphabets and with the complementary device of the double DNA helix. It seems that, in the modeling field of algebraic biology, the genetic-informational organization of living bodies can be considered as a set of united-hypercomplex numbers, in some association with the famous slogan of Pythagoras "the numbers rule the world". Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A multiobjective approach to the genetic code adaptability problem.

    Science.gov (United States)

    de Oliveira, Lariza Laura; de Oliveira, Paulo S L; Tinós, Renato

    2015-02-19

    The organization of the canonical code has intrigued researchers since it was first described. If we consider all codes mapping the 64 codons onto 20 amino acids and one stop codon, there are more than 1.51×10^84 possible genetic codes. The main question related to the organization of the genetic code is why exactly the canonical code was selected among this huge number of possible genetic codes. Many researchers argue that the organization of the canonical code is a product of natural selection and that the code's robustness against mutations would support this hypothesis. In order to investigate the natural selection hypothesis, some researchers employ optimization algorithms to identify regions of the genetic code space where the best codes, according to a given evaluation function, can be found (engineering approach). The optimization process uses only one objective to evaluate the codes, generally based on the robustness for an amino acid property. Only one objective is also employed in the statistical approach for the comparison of the canonical code with random codes. We propose a multiobjective approach where two or more objectives are considered simultaneously to evaluate the genetic codes. In order to test our hypothesis that the multiobjective approach is useful for the analysis of the genetic code adaptability, we implemented a multiobjective optimization algorithm where two objectives are simultaneously optimized. Using as objectives the robustness against mutation with the amino acid properties polar requirement (objective 1) and robustness with respect to hydropathy index or molecular volume (objective 2), we found solutions closer to the canonical genetic code in terms of robustness, when compared with the results using only one objective reported by other authors. Using more objectives, more optimal solutions are obtained and, as a consequence, more information can be used to investigate the adaptability of the genetic code. The multiobjective approach
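
    The multiobjective comparison rests on Pareto dominance: a candidate code is kept only if no other candidate is at least as good on every robustness objective and strictly better on one. The sketch below shows that filter with synthetic objective values standing in for the two robustness scores (e.g., polar requirement and hydropathy); the numbers and code names are placeholders, not results from the paper.

```python
# Minimal Pareto-dominance filter for comparing genetic codes under two
# robustness objectives (both minimised).  Objective values are placeholders.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

if __name__ == "__main__":
    candidates = {
        "canonical":  (5.2, 31.0),     # placeholder (objective 1, objective 2)
        "random_1":   (9.8, 40.5),
        "random_2":   (4.9, 55.0),
        "engineered": (3.1, 28.0),
    }
    front = pareto_front(list(candidates.values()))
    for name, objs in candidates.items():
        print(name, objs, "non-dominated" if objs in front else "dominated")
```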

  13. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments.

    Science.gov (United States)

    Santos, José; Monteagudo, Angel

    2011-02-21

    As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid for another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Here we used a genetic algorithm to search for better adapted hypothetical codes and as a method to gauge the difficulty in finding such alternative codes, allowing us to clearly situate the canonical code in the fitness landscape. This novel proposal of the use of evolutionary computing provides a new perspective in the open debate between the use of the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of the codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Simulated evolution clearly reveals that the canonical genetic code is far from optimal regarding its optimization. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the fact that the best possible codes show the patterns of the

  14. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments

    Directory of Open Access Journals (Sweden)

    Monteagudo Ángel

    2011-02-01

    Full Text Available Abstract Background As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid for another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Results Here we used a genetic algorithm to search for better adapted hypothetical codes and as a method to gauge the difficulty in finding such alternative codes, allowing us to clearly situate the canonical code in the fitness landscape. This novel proposal of the use of evolutionary computing provides a new perspective in the open debate between the use of the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of the codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Conclusions Simulated evolution clearly reveals that the canonical genetic code is far from optimal regarding its optimization. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the

  15. Next-generation mammalian genetics toward organism-level systems biology.

    Science.gov (United States)

    Susaki, Etsuo A; Ukai, Hideki; Ueda, Hiroki R

    2017-01-01

    Organism-level systems biology in mammals aims to identify, analyze, control, and design molecular and cellular networks executing various biological functions in mammals. In particular, system-level identification and analysis of molecular and cellular networks can be accelerated by next-generation mammalian genetics. Mammalian genetics without crossing, where all production and phenotyping studies of genome-edited animals are completed within a single generation, drastically reduces the time, space, and effort of conducting the systems research. Next-generation mammalian genetics is based on recent technological advancements in genome editing and developmental engineering. The process begins with the introduction of double-strand breaks into genomic DNA by using site-specific endonucleases, which results in highly efficient genome editing in mammalian zygotes or embryonic stem cells. By using nuclease-mediated genome editing in zygotes, or ~100% embryonic stem cell-derived mouse technology, whole-body knock-out and knock-in mice can be produced within a single generation. These emerging technologies allow us to produce multiple knock-out or knock-in strains in a high-throughput manner. In this review, we discuss the basic concepts and related technologies as well as current challenges and future opportunities for next-generation mammalian genetics in organism-level systems biology.

  16. Psacoin level 1A intercomparison probabilistic system assessment code (PSAC) user group

    International Nuclear Information System (INIS)

    Nies, A.; Laurens, J.M.; Galson, D.A.; Webster, S.

    1990-01-01

    This report describes an international code intercomparison exercise conducted by the NEA Probabilistic System Assessment Code (PSAC) User Group. The PSACOIN Level 1A exercise is the third of a series designed to contribute to the verification of probabilistic codes that may be used in assessing the safety of radioactive waste disposal systems or concepts. Level 1A is based on a more realistic system model than that used in the two previous exercises, and involves deep geological disposal concepts with a relatively complex structure of the repository vault. The report compares results and draws conclusions with regard to the use of different modelling approaches and the possible importance to safety of various processes within and around a deep geological repository. In particular, the relative significance of model uncertainty and data variability is discussed

  17. The coevolution of genes and genetic codes: Crick's frozen accident revisited.

    Science.gov (United States)

    Sella, Guy; Ardell, David H

    2006-09-01

    The standard genetic code is the nearly universal system for the translation of genes into proteins. The code exhibits two salient structural characteristics: it possesses a distinct organization that makes it extremely robust to errors in replication and translation, and it is highly redundant. The origin of these properties has intrigued researchers since the code was first discovered. One suggestion, which is the subject of this review, is that the code's organization is the outcome of the coevolution of genes and genetic codes. In 1968, Francis Crick explored the possible implications of coevolution at different stages of code evolution. Although he argues that coevolution was likely to influence the evolution of the code, he concludes that it falls short of explaining the organization of the code we see today. The recent application of mathematical modeling to study the effects of errors on the course of coevolution, suggests a different conclusion. It shows that coevolution readily generates genetic codes that are highly redundant and similar in their error-correcting organization to the standard code. We review this recent work and suggest that further affirmation of the role of coevolution can be attained by investigating the extent to which the outcome of coevolution is robust to other influences that were present during the evolution of the code.

  18. Computation of the Genetic Code

    Science.gov (United States)

    Kozlov, Nicolay N.; Kozlova, Olga N.

    2018-03-01

    One of the problems in the development of a mathematical theory of the genetic code (a summary is presented in [1], a detailed account in [2]) is the problem of the calculation of the genetic code. Similar problems are unknown elsewhere and could be posed only in the 21st century. This work is devoted to one approach to solving this problem. For the first time, a detailed description is provided of the method for calculating the genetic code, the idea of which was first published earlier [3]; the choice of one of the most important sets for the calculation was based on the article [4]. Such a set of amino acids corresponds to a complete set of representations of the plurality of overlapping triplet genes belonging to the same DNA strand. A separate issue was the initial point that triggers an iterative search through all codes consistent with the initial data. Mathematical analysis has shown that the said set contains some ambiguities, which were found thanks to our proposed compressed representation of the set. As a result, the developed method of calculation was limited to two main stages of research, where at the first stage only part of the area was used in the calculations. The proposed approach will significantly reduce the amount of computation at each step in this complex discrete structure.

  19. Evolutionary implications of genetic code deviations

    International Nuclear Information System (INIS)

    Chela Flores, J.

    1986-07-01

    By extending the standard genetic code into a temperature dependent regime, we propose a train of molecular events leading to alternative coding. The first few examples of these deviations have already been reported in some ciliated protozoans and Gram positive bacteria. A possible range of further alternative coding, still within the context of universality, is pointed out. (author)

  20. Flexibility of the genetic code with respect to DNA structure

    DEFF Research Database (Denmark)

    Baisnée, P. F.; Baldi, Pierre; Brunak, Søren

    2001-01-01

    Motivation. The primary function of DNA is to carry genetic information through the genetic code. DNA, however, contains a variety of other signals related, for instance, to reading frame, codon bias, pairwise codon bias, splice sites and transcription regulation, nucleosome positioning, and DNA structure. Here we study the relationship between the genetic code and DNA structure and address two questions. First, to which degree does the degeneracy of the genetic code and the acceptable amino acid substitution patterns allow for the superimposition of DNA structural signals to protein-coding sequences? Second, is the origin or evolution of the genetic code likely to have been constrained by DNA structure? Results. We develop an index for code flexibility with respect to DNA structure. Using five different di- or tri-nucleotide models of sequence-dependent DNA structure, we show

  1. Mathematical fundamentals for the noise immunity of the genetic code.

    Science.gov (United States)

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are also some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular bases - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows the genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error detection and error correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different quantities of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of
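
    The degeneracy mentioned above is easy to tabulate from the standard translation table: the sketch below counts how many codons encode each amino acid and groups amino acids into the familiar 1-, 2-, 3-, 4- and 6-fold classes. It uses the NCBI standard code (translation table 1) and is only a small illustration of the structure the chapter analyses.

```python
# Count the degeneracy classes of the standard genetic code.
from collections import Counter, defaultdict

BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG")
CODE = {a + b + c: AMINO[16 * i + 4 * j + k]
        for i, a in enumerate(BASES)
        for j, b in enumerate(BASES)
        for k, c in enumerate(BASES)}

codons_per_aa = Counter(aa for aa in CODE.values() if aa != "*")
classes = defaultdict(list)
for aa, n in codons_per_aa.items():
    classes[n].append(aa)

for degeneracy in sorted(classes):
    print(degeneracy, "codons:", "".join(sorted(classes[degeneracy])))
    # expected: 1 -> MW, 2 -> CDEFHKNQY, 3 -> I, 4 -> AGPTV, 6 -> LRS
```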

  2. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  3. The evolution of the mitochondrial genetic code in arthropods revisited.

    Science.gov (United States)

    Abascal, Federico; Posada, David; Zardoya, Rafael

    2012-04-01

    A variant of the invertebrate mitochondrial genetic code was previously identified in arthropods (Abascal et al. 2006a, PLoS Biol 4:e127) in which, instead of translating the AGG codon as serine, as in other invertebrates, some arthropods translate AGG as lysine. Here, we revisit the evolution of the genetic code in arthropods taking into account that (1) the number of arthropod mitochondrial genomes sequenced has triplicated since the original findings were published; (2) the phylogeny of arthropods has been recently resolved with confidence for many groups; and (3) sophisticated probabilistic methods can be applied to analyze the evolution of the genetic code in arthropod mitochondria. According to our analyses, evolutionary shifts in the genetic code have been more common than previously inferred, with many taxonomic groups displaying two alternative codes. Ancestral character-state reconstruction using probabilistic methods confirmed that the arthropod ancestor most likely translated AGG as lysine. Point mutations at tRNA-Lys and tRNA-Ser correlated with the meaning of the AGG codon. In addition, we identified three variables (GC content, number of AGG codons, and taxonomic information) that best explain the use of each of the two alternative genetic codes.

  4. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

    Full Text Available BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (Ser-tRNA(CAG)). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.

  5. National Society of Genetic Counselors Code of Ethics.

    Science.gov (United States)

    2018-02-01

    This document is the revised Code of Ethics of the National Society of Genetic Counselors (NSGC) that was adopted in April 2017 after majority vote of the full membership of the NSGC. The explication of the revisions is published in this volume of the Journal of Genetic Counseling. This is the fourth revision to the Code of Ethics since its original adoption in 1992.

  6. Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?

    Science.gov (United States)

    Kubyshkin, Vladimir; Budisa, Nediljko

    2017-08-01

    The main goal of synthetic biology (SB) is the creation of biodiversity applicable for biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcomes the limitation of the canonical amino acid repertoire of 20 (+2) prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e. orthogonality which prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from the recent proposal of Hartman and Smith about the establishment of the genetic code in the RNA world, the authors here map possible biotechnological invasion points for engineering of bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. How American Nurses Association Code of Ethics informs genetic/genomic nursing.

    Science.gov (United States)

    Tluczek, Audrey; Twal, Marie E; Beamer, Laura Curr; Burton, Candace W; Darmofal, Leslie; Kracun, Mary; Zanni, Karen L; Turner, Martha

    2018-01-01

    Members of the Ethics and Public Policy Committee of the International Society of Nurses in Genetics prepared this article to assist nurses in interpreting the American Nurses Association (2015) Code of Ethics for Nurses with Interpretive Statements (Code) within the context of genetics/genomics. The Code explicates the nursing profession's norms and responsibilities in managing ethical issues. The nearly ubiquitous application of genetic/genomic technologies in healthcare poses unique ethical challenges for nursing. Therefore, authors conducted literature searches that drew from various professional resources to elucidate implications of the code in genetic/genomic nursing practice, education, research, and public policy. We contend that the revised Code coupled with the application of genomic technologies to healthcare creates moral obligations for nurses to continually refresh their knowledge and capacities to translate genetic/genomic research into evidence-based practice, assure the ethical conduct of scientific inquiry, and continually develop or revise national/international guidelines that protect the rights of individuals and populations within the context of genetics/genomics. Thus, nurses have an ethical responsibility to remain knowledgeable about advances in genetics/genomics and incorporate emergent evidence into their work.

  8. The genetic code as a periodic table: algebraic aspects.

    Science.gov (United States)

    Bashford, J D; Jarvis, P D

    2000-01-01

    The systematics of indices of physico-chemical properties of codons and amino acids across the genetic code are examined. Using a simple numerical labelling scheme for nucleic acid bases, A=(-1,0), C=(0,-1), G=(0,1), U=(1,0), data can be fitted as low order polynomials of the six coordinates in the 64-dimensional codon weight space. The work confirms and extends the recent studies by Siemion et al. (1995. BioSystems 36, 231-238) of the conformational parameters. Fundamental patterns in the data such as codon periodicities, and related harmonics and reflection symmetries, are here associated with the structure of the set of basis monomials chosen for fitting. Results are plotted using the Siemion one-step mutation ring scheme, and variants thereof. The connections between the present work, and recent studies of the genetic code structure using dynamical symmetry algebras, are pointed out.
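
    The codon weight space described above can be reproduced directly from the base labels given in the abstract: every codon becomes a point with six coordinates, and a property index can then be fitted by low-order monomials with least squares. In the sketch below the fitted property (Kyte-Doolittle hydropathy of the encoded amino acid) and the particular monomial set are illustrative assumptions, not the conformational parameters studied by Siemion et al.

```python
# Codon weight-space sketch: base labels A=(-1,0), C=(0,-1), G=(0,1), U=(1,0)
# give each codon six coordinates; a property of the encoded amino acid is
# then fitted by constant, linear and squared monomials with least squares.
import numpy as np

LABEL = {"A": (-1, 0), "C": (0, -1), "G": (0, 1), "U": (1, 0)}
AMINO = ("FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG")
CODE = {a + b + c: AMINO[16 * i + 4 * j + k]
        for i, a in enumerate("UCAG")
        for j, b in enumerate("UCAG")
        for k, c in enumerate("UCAG")}
HYDRO = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
         "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
         "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
         "Y": -1.3, "V": 4.2}

codons = [c for c, aa in CODE.items() if aa != "*"]
X = np.array([sum((LABEL[b] for b in c), ()) for c in codons], dtype=float)
y = np.array([HYDRO[CODE[c]] for c in codons])

# design matrix: constant term, the six coordinates, and their squares
design = np.hstack([np.ones((len(codons), 1)), X, X ** 2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
pred = design @ coef
print("fitted coefficients:", np.round(coef, 3))
print("rms fit error:", round(float(np.sqrt(np.mean((pred - y) ** 2))), 3))
```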

  9. The Impact of Diagnostic Code Misclassification on Optimizing the Experimental Design of Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Steven J. Schrodi

    2017-01-01

    Full Text Available Diagnostic codes within electronic health record systems can vary widely in accuracy. It has been noted that the number of instances of a particular diagnostic code monotonically increases with the accuracy of disease phenotype classification. As a growing number of health system databases become linked with genomic data, it is critically important to understand the effect of this misclassification on the power of genetic association studies. Here, I investigate the impact of this diagnostic code misclassification on the power of genetic association studies with the aim to better inform experimental designs using health informatics data. The trade-off between (i) reduced misclassification rates from utilizing additional instances of a diagnostic code per individual and (ii) the resulting smaller sample size is explored, and general rules are presented to improve experimental designs.
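
    The trade-off can be illustrated with a rough power calculation: treating a fraction of labelled cases as misclassified controls pulls the observed case allele frequency toward the control frequency and attenuates power. The sketch below uses a simple normal approximation to a two-proportion allelic test at a genome-wide threshold; the frequencies, sample sizes and the test itself are illustrative assumptions rather than the paper's model.

```python
# Rough sketch of power loss from case misclassification in an allelic
# case-control test, via a normal approximation to the two-proportion z-test.
from math import sqrt, erf

def phi(z):                       # standard normal CDF
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def power(p_case, p_ctrl, n_case, n_ctrl, mis=0.0, z_alpha=5.45):
    """z_alpha = 5.45 roughly matches a two-sided genome-wide alpha of 5e-8."""
    p_obs = (1.0 - mis) * p_case + mis * p_ctrl        # attenuated frequency
    se = sqrt(p_obs * (1 - p_obs) / (2 * n_case) +
              p_ctrl * (1 - p_ctrl) / (2 * n_ctrl))    # 2 alleles per person
    return phi(abs(p_obs - p_ctrl) / se - z_alpha)

if __name__ == "__main__":
    for mis in (0.0, 0.1, 0.2, 0.4):
        print(f"misclassification {mis:.0%}: "
              f"power = {power(0.30, 0.25, 4000, 4000, mis):.3f}")
```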

  10. On coding genotypes for genetic markers with multiple alleles in genetic association study of quantitative traits

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2011-09-01

    Full Text Available Abstract Background In genetic association studies of quantitative traits using F∞ models, how to code the marker genotypes and interpret the model parameters appropriately is important for constructing hypothesis tests and making statistical inferences. Currently, the coding of marker genotypes in building F∞ models has mainly focused on the biallelic case. A thorough work on the coding of marker genotypes and interpretation of model parameters for F∞ models is needed, especially for genetic markers with multiple alleles. Results In this study, we formulate F∞ genetic models under various regression model frameworks and introduce three genotype coding schemes for genetic markers with multiple alleles. Starting from an allele-based modeling strategy, we first describe a regression framework to model the expected genotypic values at given markers. Then, as an extension from the biallelic case, we introduce three coding schemes for constructing fully parameterized one-locus F∞ models and discuss the relationships between the model parameters and the expected genotypic values. Next, under a simplified modeling framework for the expected genotypic values, we consider several reduced one-locus F∞ models from the three coding schemes with regard to the estimability and interpretation of their model parameters. Finally, we explore some extensions of the one-locus F∞ models to two loci. Several fully parameterized as well as reduced two-locus F∞ models are addressed. Conclusions The genotype coding schemes provide different ways to construct F∞ models for association testing of multi-allele genetic markers with quantitative traits. Which coding scheme should be applied depends on how conveniently it can provide the statistical inferences on the parameters of research interest. Based on these F∞ models, the standard regression model fitting tools can be used to estimate and test for various genetic effects through statistical contrasts with the
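
    One way to make genotype coding for a multi-allelic marker concrete is to build the design columns explicitly, as in the sketch below: additive columns counting copies of each non-reference allele plus dominance indicators for each heterozygous pairing. This is a generic coding for illustration only; the three schemes developed in the paper for F∞ models are not reproduced here.

```python
# Toy genotype-coding sketch for a marker with several alleles: additive
# allele counts plus dominance (heterozygosity) indicator columns.
from itertools import combinations

def design_row(genotype, alleles, reference):
    """genotype is a pair such as ('A2', 'A3'); returns one design-matrix row."""
    additive = [genotype.count(a) for a in alleles if a != reference]
    dominance = [1 if set(genotype) == {a, b} else 0
                 for a, b in combinations(alleles, 2)]
    return additive + dominance

if __name__ == "__main__":
    alleles = ["A1", "A2", "A3"]          # reference allele is A1
    for g in [("A1", "A1"), ("A1", "A2"), ("A2", "A2"), ("A2", "A3")]:
        print(g, design_row(g, alleles, reference="A1"))
```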

  11. A Real-Coded Genetic Algorithm with System Reduction and Restoration for Rapid and Reliable Power Flow Solution of Power Systems

    Directory of Open Access Journals (Sweden)

    Hassan Abdullah Kubba

    2015-05-01

    Full Text Available The paper presents a highly accurate power flow solution, reducing the possibility of ending at local minima, by using a Real-Coded Genetic Algorithm (RCGA) with system reduction and restoration. The proposed method (RCGA) is modified to reduce the total computing time by reducing the system in size to that of the generator buses, which, for any realistic system, will be smaller in number, while the load buses are eliminated. The power flow problem is then solved for the generator buses only by the real-coded GA to calculate the voltage phase angles, while the voltage magnitudes are specified, which results in reduced computation time for the solution. The system is then restored by calculating the voltages of the load buses in terms of the calculated voltages of the generator buses, after a derivation of equations for calculating the voltages of the load busbars. The proposed method was demonstrated on the IEEE 14-bus test system and the practical 362-busbar IRAQI NATIONAL GRID (ING). The proposed method has reliable convergence, a highly accurate solution and less computing time for on-line applications. The method can conveniently be applied for on-line analysis and planning studies of large power systems.

  12. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.
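
    The basic quantities being correlated, codon usage and GC content, are straightforward to compute from a coding sequence read in frame, as in the sketch below. The sequence is a made-up fragment; the study itself aggregates such counts over 917 prokaryotic genomes, and its algebraic representation of the code is not reproduced here.

```python
# Small sketch: codon usage frequencies and GC content of a coding sequence.
from collections import Counter

def codon_usage(cds):
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    cds = "ATGGCCGCGAAATTTGGCCGCTAA"      # toy open reading frame
    print("GC content:", round(gc_content(cds), 3))
    for codon, freq in sorted(codon_usage(cds).items()):
        print(codon, round(freq, 3))
```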

  13. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  14. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    Science.gov (United States)

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

    Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA code type I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or via the latter (type II), respectively. Biologically, the Extended RNA code type I, consists of all codons of the type RNY plus codons obtained by considering the RNA code but in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II, comprises all codons of the type RNY plus codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integer numbers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or the stop signals is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of the existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.
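
    The codon sets named in this abstract can be enumerated directly: the 16 primeval RNY codons and the two extended supersets obtained either from the shifted reading frames (NYR, YRN) or from first- and third-base transversions (YNY, RNR). The sketch below only lists the sets; the algebraic operations and the Genetic Hotel construction themselves are not reproduced.

```python
# Enumerate the RNY codons and the two "Extended RNA code" supersets.
from itertools import product

R, Y, N = "AG", "CU", "ACGU"          # purines, pyrimidines, any base

def pattern(first, second, third):
    return {"".join(c) for c in product(first, second, third)}

RNY = pattern(R, N, Y)
extended_I = RNY | pattern(N, Y, R) | pattern(Y, R, N)    # frame-shift codons
extended_II = RNY | pattern(Y, N, Y) | pattern(R, N, R)   # transversion codons

print(len(RNY), len(extended_I), len(extended_II))        # 16 48 48
print(sorted(RNY)[:8], "...")
```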

  15. Analysis of genetic code ambiguity arising from nematode-specific misacylated tRNAs.

    Directory of Open Access Journals (Sweden)

    Kiyofumi Hamashima

    Full Text Available The faithful translation of the genetic code requires the highly accurate aminoacylation of transfer RNAs (tRNAs. However, it has been shown that nematode-specific V-arm-containing tRNAs (nev-tRNAs are misacylated with leucine in vitro in a manner that transgresses the genetic code. nev-tRNA(Gly (CCC and nev-tRNA(Ile (UAU, which are the major nev-tRNA isotypes, could theoretically decode the glycine (GGG codon and isoleucine (AUA codon as leucine, causing GGG and AUA codon ambiguity in nematode cells. To test this hypothesis, we investigated the functionality of nev-tRNAs and their impact on the proteome of Caenorhabditis elegans. Analysis of the nucleotide sequences in the 3' end regions of the nev-tRNAs showed that they had matured correctly, with the addition of CCA, which is a crucial posttranscriptional modification required for tRNA aminoacylation. The nuclear export of nev-tRNAs was confirmed with an analysis of their subcellular localization. These results show that nev-tRNAs are processed to their mature forms like common tRNAs and are available for translation. However, a whole-cell proteome analysis found no detectable level of nev-tRNA-induced mistranslation in C. elegans cells, suggesting that the genetic code is not ambiguous, at least under normal growth conditions. Our findings indicate that the translational fidelity of the nematode genetic code is strictly maintained, contrary to our expectations, although deviant tRNAs with misacylation properties are highly conserved in the nematode genome.

  16. Interdependence, Reflexivity, Fidelity, Impedance Matching, and the Evolution of Genetic Coding

    Science.gov (United States)

    Carter, Charles W; Wills, Peter R

    2018-01-01

    Abstract Genetic coding is generally thought to have required ribozymes whose functions were taken over by polypeptide aminoacyl-tRNA synthetases (aaRS). Two discoveries about aaRS and their interactions with tRNA substrates now furnish a unifying rationale for the opposite conclusion: that the key processes of the Central Dogma of molecular biology emerged simultaneously and naturally from simple origins in a peptide•RNA partnership, eliminating the epistemological utility of a prior RNA world. First, the two aaRS classes likely arose from opposite strands of the same ancestral gene, implying a simple genetic alphabet. The resulting inversion symmetries in aaRS structural biology would have stabilized the initial and subsequent differentiation of coding specificities, rapidly promoting diversity in the proteome. Second, amino acid physical chemistry maps onto tRNA identity elements, establishing reflexive, nanoenvironmental sensing in protein aaRS. Bootstrapping of increasingly detailed coding is thus intrinsic to polypeptide aaRS, but impossible in an RNA world. These notions underline the following concepts that contradict gradual replacement of ribozymal aaRS by polypeptide aaRS: 1) aaRS enzymes must be interdependent; 2) reflexivity intrinsic to polypeptide aaRS production dynamics promotes bootstrapping; 3) takeover of RNA-catalyzed aminoacylation by enzymes will necessarily degrade specificity; and 4) the Central Dogma’s emergence is most probable when replication and translation error rates remain comparable. These characteristics are necessary and sufficient for the essentially de novo emergence of a coupled gene–replicase–translatase system of genetic coding that would have continuously preserved the functional meaning of genetically encoded protein genes whose phylogenetic relationships match those observed today. PMID:29077934

  17. On decoding of multi-level MPSK modulation codes

    Science.gov (United States)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch metric and path metric, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that the soft-decision MSD reduces the decoding complexity drastically and is suboptimum. The hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.
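
    The soft-decision branch metric used by such a Viterbi decoder can be sketched in a few lines: each 3-bit label maps to a point of the 8-PSK constellation and the metric is the squared Euclidean distance from the received sample to each candidate point. The natural binary labelling below is an assumption for illustration (set-partitioning labellings are typical in practice), and the non-uniform float-to-integer quantisation discussed in the paper is omitted.

```python
# 8-PSK soft-decision branch metrics: squared Euclidean distance from a
# noisy received sample to each constellation point.
import cmath
import random

POINTS = {format(k, "03b"): cmath.exp(2j * cmath.pi * k / 8) for k in range(8)}

def branch_metrics(received):
    """Squared Euclidean distance from the received sample to each symbol."""
    return {bits: abs(received - pt) ** 2 for bits, pt in POINTS.items()}

if __name__ == "__main__":
    sent = POINTS["011"]
    noisy = sent + complex(random.gauss(0, 0.2), random.gauss(0, 0.2))
    metrics = branch_metrics(noisy)
    best = min(metrics, key=metrics.get)
    print("hard decision:", best, " metric:", round(metrics[best], 4))
```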

  18. Critical roles for a genetic code alteration in the evolution of the genus Candida.

    Science.gov (United States)

    Silva, Raquel M; Paredes, João A; Moura, Gabriela R; Manadas, Bruno; Lima-Costa, Tatiana; Rocha, Rita; Miranda, Isabel; Gomes, Ana C; Koerkamp, Marian J G; Perrot, Michel; Holstege, Frank C P; Boucherie, Hélian; Santos, Manuel A S

    2007-10-31

    During the last 30 years, several alterations to the standard genetic code have been discovered in various bacterial and eukaryotic species. Sense and nonsense codons have been reassigned or reprogrammed to expand the genetic code to selenocysteine and pyrrolysine. These discoveries highlight unexpected flexibility in the genetic code, but do not elucidate how the organisms survived the proteome chaos generated by codon identity redefinition. In order to shed new light on this question, we have reconstructed a Candida genetic code alteration in Saccharomyces cerevisiae and used a combination of DNA microarrays, proteomics and genetics approaches to evaluate its impact on gene expression, adaptation and sexual reproduction. This genetic manipulation blocked mating, locked yeast in a diploid state, remodelled gene expression and created stress cross-protection that generated adaptive advantages under environmental challenging conditions. This study highlights unanticipated roles for codon identity redefinition during the evolution of the genus Candida, and strongly suggests that genetic code alterations create genetic barriers that speed up speciation.

  19. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
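
    To illustrate the kind of multi-SRQ fitness function described above, here is a minimal sketch; the SRQ values, weights and normalization scales are invented placeholders, and the RELAP5 simulation call itself is omitted:

      import numpy as np

      def fitness(sim_srqs, exp_srqs, weights, scales):
          """Weighted, normalized discrepancy between simulated and measured
          system response quantities (SRQs); lower is better. The weights and
          scales shown here are illustrative placeholders."""
          terms = [w * ((s - e) / sc) ** 2
                   for s, e, w, sc in zip(sim_srqs, exp_srqs, weights, scales)]
          return sum(terms)

      # Hypothetical SRQs: maximum flow rate [kg/s] and oscillation period [s]
      exp = [1.20, 7.5]
      sim = [1.05, 8.1]
      print(fitness(sim, exp, weights=[1.0, 1.0], scales=[1.20, 7.5]))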

  20. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  1. A probabilistic assessment code system for derivation of clearance levels of radioactive materials. PASCLR user's manual

    International Nuclear Information System (INIS)

    Takahashi, Tomoyuki; Takeda, Seiji; Kimura, Hideo

    2001-01-01

    It is indicated that some types of radioactive material arising from the development and utilization of nuclear energy do not need to be subject to regulatory control because they can give rise only to trivial radiation hazards. The process of removing such materials from regulatory control is called 'clearance', and the corresponding radionuclide concentration levels are called 'clearance levels'. In the Nuclear Safety Commission's discussion, a deterministic approach was applied to derive the clearance levels, which are the radionuclide concentrations in a cleared material equivalent to an individual dose criterion. Realistic parameter values were basically selected; where realistic values could not be defined, reasonably conservative values were selected. Additionally, stochastic approaches were used to validate the results obtained by the deterministic calculations. We have developed the computer code system PASCLR (Probabilistic Assessment code System for derivation of Clearance Levels of Radioactive materials), which uses the Monte Carlo technique to carry out the stochastic calculations. This report describes the structure of the PASCLR code and provides user information for its execution. (author)
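
    As a loose illustration of how a Monte Carlo derivation of a clearance level can work (the distribution, dose coefficient and dose criterion below are made-up placeholders, not PASCLR inputs or results):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical stochastic parameter: annual dose per unit concentration
      # (microSv/y per Bq/g), sampled from an assumed lognormal distribution.
      dose_per_bq_per_g = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)

      dose_criterion = 10.0  # microSv/y, a commonly used trivial-dose criterion

      # Clearance level (Bq/g) implied by each sampled parameter set
      clearance_levels = dose_criterion / dose_per_bq_per_g

      # A conservative choice: the 5th percentile of the derived clearance levels
      print(np.percentile(clearance_levels, 5))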

  2. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    International Nuclear Information System (INIS)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for evaluation of land disposal sites

  3. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    Genetic Algorithms (GAs) form a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (AI). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw-tooth GA and differential evolution. This research article presents a real-coded GA for predicting enrollments of the University of Alabama, whose enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict the enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with the work of other well-known authors and found satisfactory, indicating that real-coded GAs are fast and accurate.
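
    A minimal sketch of letting a real-coded GA tune fuzzy interval boundaries follows; the enrollment-like data, the crude error measure standing in for the forecasting error, and all GA settings are illustrative assumptions rather than the article's actual setup:

      import numpy as np

      rng = np.random.default_rng(1)
      # A few enrollment-like observations (illustrative values only)
      data = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919], float)
      LOW, HIGH, K = data.min() - 500, data.max() + 500, 6   # universe of discourse, 6 intervals

      def decode(chrom):                       # chromosome: K-1 interior cut points
          return np.concatenate(([LOW], np.sort(chrom), [HIGH]))

      def error(chrom):                        # crude stand-in for the forecasting error:
          bounds = decode(chrom)               # distance of each point to its interval midpoint
          mids = (bounds[:-1] + bounds[1:]) / 2
          idx = np.clip(np.searchsorted(bounds, data) - 1, 0, K - 1)
          return np.mean((data - mids[idx]) ** 2)

      pop = rng.uniform(LOW, HIGH, size=(40, K - 1))
      for _ in range(200):
          fit = np.array([error(c) for c in pop])
          parents = pop[np.argsort(fit)][:20]                     # truncation selection
          kids = []
          for _ in range(20):
              a, b = parents[rng.integers(20, size=2)]
              w = rng.random()
              child = w * a + (1 - w) * b                          # arithmetic crossover
              child += rng.normal(0, (HIGH - LOW) * 0.01, K - 1)   # Gaussian mutation
              kids.append(np.clip(child, LOW, HIGH))
          pop = np.vstack([parents, kids])

      best = pop[np.argmin([error(c) for c in pop])]
      print(np.round(decode(best)))            # optimized interval boundaries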

  4. The Genetic Code: Yesterday, Today and Tomorrow

    Indian Academy of Sciences (India)

    The Genetic Code: Yesterday, Today and Tomorrow. Jiqiang Ling and Dieter Söll. General Article, Resonance – Journal of Science Education, Volume 17, Issue 12, December 2012, pp. 1136-1142.

  5. System Level Evaluation of Innovative Coded MIMO-OFDM Systems for Broadcasting Digital TV

    Directory of Open Access Journals (Sweden)

    Y. Nasser

    2008-01-01

    Single-frequency networks (SFNs) for broadcasting digital TV are a topic of theoretical and practical interest for future broadcasting systems. Although progress has been made in their characterization, there are still considerable gaps in their deployment with MIMO techniques. The contribution of this paper is multifold. First, we investigate the possibility of applying a space-time (ST) encoder between the antennas of two sites in an SFN. Then, we introduce a 3D space-time-space block code for future terrestrial digital TV in an SFN architecture. The proposed 3D code is based on a double-layer structure designed for intercell and intracell space-time-coded transmissions. Finally, we propose to adapt a technique called effective exponential signal-to-noise ratio (SNR) mapping (EESM) to predict the bit error rate (BER) at the output of the channel decoder in MIMO systems. The EESM technique as well as the simulation results are used to doubly check the efficiency of our 3D code. This efficiency is obtained for equal and unequal received powers, whatever the location of the receiver, by adequately combining ST codes. The 3D code is thus a very promising candidate for an SFN architecture with MIMO transmission.
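
    The standard EESM compression is compact enough to show directly; the per-subcarrier SNR values and the calibration parameter beta below are placeholders, not results from the paper:

      import numpy as np

      def eesm(snr_linear, beta):
          """Effective exponential SNR mapping: compress the per-subcarrier SNRs
          of one coded block into a single effective SNR, which can then be
          looked up on an AWGN BER curve for the same modulation/coding scheme."""
          return -beta * np.log(np.mean(np.exp(-snr_linear / beta)))

      # Per-subcarrier SNRs (dB) for one OFDM code block; beta is calibrated per
      # modulation/coding scheme (2.3 is just a placeholder value).
      snrs_db = np.array([3.0, 7.5, 12.0, 1.0, 9.0])
      snr_eff = eesm(10 ** (snrs_db / 10), beta=2.3)
      print(10 * np.log10(snr_eff), "dB effective")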

  6. The "Wow! signal" of the terrestrial genetic code

    Science.gov (United States)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, the genetic code is stronger in noise immunity. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such a scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10^-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of

  7. Mapping the Plasticity of the E. coli Genetic Code with Orthogonal Pair Directed Sense Codon Reassignment.

    Science.gov (United States)

    Schmitt, Margaret A; Biddle, Wil; Fisk, John Domenic

    2018-04-18

    The relative quantitative importance of the factors that determine the fidelity of translation is largely unknown, which makes predicting the extent to which the degeneracy of the genetic code can be broken challenging. Our strategy of using orthogonal tRNA/aminoacyl tRNA synthetase pairs to precisely direct the incorporation of a single amino acid in response to individual sense and nonsense codons provides a suite of related data with which to examine the plasticity of the code. Each directed sense codon reassignment measurement is an in vivo competition experiment between the introduced orthogonal translation machinery and the natural machinery in E. coli. This report discusses 20 new, related genetic codes, in which a targeted E. coli wobble codon is reassigned to tyrosine utilizing the orthogonal tyrosine tRNA/aminoacyl tRNA synthetase pair from Methanocaldococcus jannaschii. One at a time, reassignment of each targeted sense codon to tyrosine is quantified in cells by measuring the fluorescence of GFP variants in which the essential tyrosine residue is encoded by a non-tyrosine codon. Significantly, every wobble codon analyzed may be partially reassigned with efficiencies ranging from 0.8% to 41%. The accumulation of the suite of data enables a qualitative dissection of the relative importance of the factors affecting the fidelity of translation. While some correlation was observed between sense codon reassignment and either competing endogenous tRNA abundance or changes in aminoacylation efficiency of the altered orthogonal system, no single factor appears to predominately drive translational fidelity. Evaluation of relative cellular fitness in each of the 20 quantitatively-characterized proteome-wide tyrosine substitution systems suggests that at a systems level, E. coli is robust to missense mutations.

  8. Representation mutations from standard genetic codes

    Science.gov (United States)

    Aisah, I.; Suyudi, M.; Carnia, E.; Suhendi; Supriatna, A. K.

    2018-03-01

    Graphs are widely used in everyday life, especially to describe and model problems concretely and clearly. Graphs are also used to help solve various kinds of problems that are difficult to solve by calculation. In biology, a graph can be used to describe the process of protein synthesis from DNA. Proteins play an important role for DNA (deoxyribonucleic acid) and RNA (ribonucleic acid) and are composed of amino acids. In this study, amino acids are related to genetics, especially the genetic code. The genetic code is also known as the triplet or codon code, a three-letter arrangement of the DNA nitrogen bases. The bases are adenine (A), thymine (T), guanine (G) and cytosine (C); in RNA, thymine (T) is replaced with uracil (U). The set of all nitrogen bases in RNA is denoted by N = {C, U, A, G}. Codons operate during protein synthesis inside the cell and also encode the stop signal that marks the end of the protein synthesis process. This paper examines the process of protein synthesis through mathematical studies and presents it in three-dimensional space as a graph. The study begins by analysing the set of all codons, denoted NNN, in order to obtain geometric representations. At this stage the set of all nitrogen bases N is matched with Z_2 x Z_2: C = (0,0), U = (0,1), A = (1,0), G = (1,1), where the entries are residue classes mod 2. Through this matching, algebraic structures such as groups, the Klein four-group and quotient groups are obtained. With the help of GeoGebra software, the set of all codons NNN can be presented in three-dimensional space as a multicube and also represented as a graph, so that the relationships between codons can easily be seen.
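
    A short sketch of the matching of bases with Z_2 x Z_2 and one possible placement of the 64 codons in three-dimensional space; the particular embedding (reading each base's bit pair as an integer 0..3) is an assumption and may differ from the multicube used in the paper:

      from itertools import product

      # Matching of RNA bases with Z2 x Z2, as described in the abstract
      BASE = {"C": (0, 0), "U": (0, 1), "A": (1, 0), "G": (1, 1)}

      def codon_point(codon):
          """Place a codon in 3-space by reading each base's bit pair as 0..3."""
          return tuple(2 * b0 + b1 for b0, b1 in (BASE[x] for x in codon))

      codons = ["".join(t) for t in product("CUAG", repeat=3)]
      points = {c: codon_point(c) for c in codons}
      print(len(points), points["AUG"])        # 64 codons, e.g. AUG -> (2, 1, 3)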

  9. Use of fluorescent proteins and color-coded imaging to visualize cancer cells with different genetic properties.

    Science.gov (United States)

    Hoffman, Robert M

    2016-03-01

    Fluorescent proteins are very bright and available in spectrally distinct colors, enabling the imaging of color-coded cancer cells growing in vivo and therefore the distinction of cancer cells with different genetic properties. Non-invasive and intravital imaging of cancer cells with fluorescent proteins allows the visualization of distinct genetic variants of cancer cells down to the cellular level in vivo. Cancer cells with increased or decreased ability to metastasize can be distinguished in vivo. Gene exchange in vivo, which enables low-metastatic cancer cells to convert to highly metastatic ones, can be imaged with color coding in vivo. Cancer stem-like and non-stem cells can be distinguished in vivo by color-coded imaging. These properties also demonstrate the vast superiority of imaging cancer cells in vivo with fluorescent proteins over photon counting of luciferase-labeled cancer cells.

  10. Efficient Dual Domain Decoding of Linear Block Codes Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed Azouaoui

    2012-01-01

    A computationally efficient algorithm for decoding block codes is developed using a genetic algorithm (GA). The proposed algorithm uses the dual code, in contrast to the existing genetic decoders in the literature that use the code itself; hence, this new approach reduces the complexity of decoding high-rate codes. We simulated our algorithm over various transmission channels. The performance of this algorithm is investigated and compared with competing decoding algorithms, including those of Maini and Shakeel. The results show that the proposed algorithm gives large gains over the Chase-2 decoding algorithm and reaches the performance of OSD-3 for some quadratic residue (QR) codes. Further, we define a new crossover operator that exploits domain-specific information and compare it with uniform and two-point crossover. The complexity of this algorithm is also discussed and compared with other algorithms.

  11. Inclusion of the fitness sharing technique in an evolutionary algorithm to analyze the fitness landscape of the genetic code adaptability.

    Science.gov (United States)

    Santos, José; Monteagudo, Ángel

    2017-03-27

    The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to have superior robustness compared to random codes, but it is not clearly determined how it evolved towards its current form. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes: the lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code lies in an area corresponding to a deep local minimum to be easily determined, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the nature of the fitness landscape considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not a localized deep minimum of the huge fitness landscape.
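
    For readers unfamiliar with fitness sharing, a generic sketch follows; the Manhattan distance, sharing radius and toy population are illustrative and differ from the code-space representation used in the study:

      import numpy as np

      def shared_fitness(raw_fitness, population, sigma_share=3.0, alpha=1.0):
          """Fitness sharing: each individual's raw fitness is divided by its
          niche count, so crowded regions of the landscape are penalized and
          the population spreads over several optima."""
          pop = np.asarray(population, dtype=float)
          d = np.abs(pop[:, None, :] - pop[None, :, :]).sum(axis=2)   # pairwise distances
          sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** alpha, 0.0)
          niche_count = sh.sum(axis=1)                                 # includes self (d=0 -> sh=1)
          return np.asarray(raw_fitness) / niche_count

      # Toy example: three clustered individuals share their fitness, the distant one keeps it
      pop = [[0, 0], [0, 1], [1, 0], [10, 10]]
      print(shared_fitness([4.0, 4.0, 4.0, 4.0], pop, sigma_share=3.0))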

  12. HOW TO REPRESENT THE GENETIC CODE?

    Directory of Open Access Journals (Sweden)

    N.S. Santos-Magalhães

    2004-05-01

    The advent of molecular genetics comprises a true revolution of far-reaching consequences for humankind, which evolved into a specialized branch of modern-day Biochemistry. The analysis of specific genomic information is gaining wide-ranging interest because of its significance to the early diagnosis of disease and the discovery of modern drugs. In order to take advantage of a wide assortment of signal processing (SP) algorithms, the primary step of modern genomic SP involves converting symbolic DNA sequences into complex-valued signals. How to represent the genetic code? Despite being extensively known, the DNA mapping into proteins is one of the relevant discoveries of genetics. The genetic code (GC) is revisited in this work, addressing other descriptions of it that can be worthy for genomic SP. Three original representations are discussed. The inner-to-outer map builds on the unbalanced role of the nucleotides of a codon. A two-dimensional Gray genetic representation is offered as a structured map that can help in interpreting DNA spectrograms or scalograms; these are among the powerful visual tools for genome analysis, which depend on the choice of the genetic mapping. Finally, the world-chart for the GC is investigated. Evoking the cyclic structure of the genetic mapping, it can be folded by joining the left-right borders and the top-bottom frontiers. As a result, the GC can be drawn on the surface of a sphere resembling a world map. Eight parallels of latitude are required (four in each hemisphere) as well as four meridians of longitude associated with four corresponding anti-meridians. The tropic circles are at 11.25°, 33.75°, 56.25°, and 78.5° (North and South). Starting from an arbitrary Greenwich meridian, the meridians of longitude can be plotted at 22.5°, 67.5°, 112.5°, and 157.5° (East and West). Each triplet is assigned to a single point on the surface, which we have named the Nirenberg-Kohama Earth. Despite being valuable, usual representations for the GC can be

  13. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    A method in which the real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed, to overcome the defect that the gradient descent method makes the algorithm easily fall into locally optimal values during learning. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the speed of calculation is reduced by the coding and decoding processes. RQGA is therefore introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm is effective and rapidly converges to a solution that satisfies the constraint conditions.

  14. A probabilistic assessment code system for derivation of clearance levels of radioactive materials. PASCLR user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Tomoyuki [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-01-01

    It is indicated that some types of radioactive material arising from the development and utilization of nuclear energy do not need to be subject to regulatory control because they can give rise only to trivial radiation hazards. The process of removing such materials from regulatory control is called 'clearance', and the corresponding radionuclide concentration levels are called 'clearance levels'. In the Nuclear Safety Commission's discussion, a deterministic approach was applied to derive the clearance levels, which are the radionuclide concentrations in a cleared material equivalent to an individual dose criterion. Realistic parameter values were basically selected; where realistic values could not be defined, reasonably conservative values were selected. Additionally, stochastic approaches were used to validate the results obtained by the deterministic calculations. We have developed the computer code system PASCLR (Probabilistic Assessment code System for derivation of Clearance Levels of Radioactive materials), which uses the Monte Carlo technique to carry out the stochastic calculations. This report describes the structure of the PASCLR code and provides user information for its execution. (author)

  15. Programming peptidomimetic syntheses by translating genetic codes designed de novo.

    Science.gov (United States)

    Forster, Anthony C; Tan, Zhongping; Nalam, Madhavi N L; Lin, Hening; Qu, Hui; Cornish, Virginia W; Blacklow, Stephen C

    2003-05-27

    Although the universal genetic code exhibits only minor variations in nature, Francis Crick proposed in 1955 that "the adaptor hypothesis allows one to construct, in theory, codes of bewildering variety." The existing code has been expanded to enable incorporation of a variety of unnatural amino acids at one or two nonadjacent sites within a protein by using nonsense or frameshift suppressor aminoacyl-tRNAs (aa-tRNAs) as adaptors. However, the suppressor strategy is inherently limited by compatibility with only a small subset of codons, by the ways such codons can be combined, and by variation in the efficiency of incorporation. Here, by preventing competing reactions with aa-tRNA synthetases, aa-tRNAs, and release factors during translation and by using nonsuppressor aa-tRNA substrates, we realize a potentially generalizable approach for template-encoded polymer synthesis that unmasks the substantially broader versatility of the core translation apparatus as a catalyst. We show that several adjacent, arbitrarily chosen sense codons can be completely reassigned to various unnatural amino acids according to de novo genetic codes by translating mRNAs into specific peptide analog polymers (peptidomimetics). Unnatural aa-tRNA substrates do not uniformly function as well as natural substrates, revealing important recognition elements for the translation apparatus. Genetic programming of peptidomimetic synthesis should facilitate mechanistic studies of translation and may ultimately enable the directed evolution of small molecules with desirable catalytic or pharmacological properties.

  16. National Society of Genetic Counselors Code of Ethics: Explication of 2017 Revisions.

    Science.gov (United States)

    Senter, Leigha; Bennett, Robin L; Madeo, Anne C; Noblin, Sarah; Ormond, Kelly E; Schneider, Kami Wolfe; Swan, Kelli; Virani, Alice

    2018-02-01

    The Code of Ethics (COE) of the National Society of Genetic Counselors (NSGC) was adopted in 1992 and was later revised and adopted in 2006. In 2016, the NSGC Code of Ethics Review Task Force (COERTF) was convened to review the COE. The COERTF reviewed ethical codes written by other professional organizations and suggested changes that would better reflect the current and evolving nature of the genetic counseling profession. The COERTF received input from the society's legal counsel, Board of Directors, and members-at-large. A revised COE was proposed to the membership and approved and adopted in April 2017. The revisions and rationale for each are presented.

  17. The "periodic table" of the genetic code: A new way to look at the code and the decoding process.

    Science.gov (United States)

    Komar, Anton A

    2016-01-01

    Henri Grosjean and Eric Westhof recently presented an information-rich, alternative view of the genetic code, which takes into account current knowledge of the decoding process, including the complex nature of interactions between mRNA, tRNA and rRNA that take place during protein synthesis on the ribosome, and it also better reflects the evolution of the code. The new asymmetrical circular genetic code has a number of advantages over the traditional codon table and the previous circular diagrams (with a symmetrical/clockwise arrangement of the U, C, A, G bases). Most importantly, all sequence co-variances can be visualized and explained based on the internal logic of the thermodynamics of codon-anticodon interactions.

  18. Origins of gene, genetic code, protein and life

    Indian Academy of Sciences (India)

    Unknown

    have concluded that newly-born genes are products of nonstop frames (NSF) ... research to determine tertiary structures of proteins such ... the present earth, is favourable for new genes to arise, if ..... NGG) in the universal genetic code table, cannot satisfy ..... which has been proposed to explain the development of life on.

  19. Advanced thermionic reactor systems design code

    International Nuclear Information System (INIS)

    Lewis, B.R.; Pawlowski, R.A.; Greek, K.J.; Klein, A.C.

    1991-01-01

    An overall systems design code is under development to model an advanced in-core thermionic nuclear reactor system for space applications at power levels of 10 to 50 kWe. The design code is written in an object-oriented programming environment that allows the use of a series of design modules, each of which is responsible for the determination of specific system parameters. The code modules include a neutronics and core criticality module, a core thermal hydraulics module, a thermionic fuel element performance module, a radiation shielding module, a module for waste heat transfer and rejection, and modules for power conditioning and control. The neutronics and core criticality module determines critical core size, core lifetime, and shutdown margins using the criticality calculation capability of the Monte Carlo Neutron and Photon Transport Code System (MCNP). The remaining modules utilize results of the MCNP analysis along with FORTRAN programming to predict the overall system performance

  20. A Simple Test of Class-Level Genetic Association Can Reveal Novel Cardiometabolic Trait Loci.

    Directory of Open Access Journals (Sweden)

    Jing Qian

    Characterizing the genetic determinants of complex diseases can be further augmented by incorporating knowledge of underlying structure or classifications of the genome, such as newly developed mappings of protein-coding genes, epigenetic marks, enhancer elements and non-coding RNAs. We apply a simple class-level testing framework, termed Genetic Class Association Testing (GenCAT), to identify protein-coding gene association with 14 cardiometabolic (CMD) related traits across 6 publicly available genome-wide association (GWA) meta-analysis data resources. GenCAT uses SNP-level meta-analysis test statistics across all SNPs within a class of elements, as well as the size of the class and its unique correlation structure, to determine if the class is statistically meaningful. The novelty of findings is evaluated through investigation of regional signals. A subset of findings is validated using recently updated, larger meta-analysis resources. A simulation study is presented to characterize overall performance with respect to power, control of family-wise error and computational efficiency. All analysis is performed using the GenCAT package, R version 3.2.1. We demonstrate that class-level testing complements the common first-stage minP approach that involves individual SNP-level testing followed by post-hoc ascribing of statistically significant SNPs to genes and loci. GenCAT suggests 54 protein-coding genes at 41 distinct loci for the 13 CMD traits investigated in the discovery analysis that are beyond the discoveries of minP alone. An additional application to biological pathways demonstrates flexibility in defining genetic classes. We conclude that it would be prudent to include class-level testing as standard practice in GWA analysis. GenCAT, for example, can be used as a simple, complementary and efficient strategy for class-level testing that leverages existing data resources, requires only summary level data in the form of test statistics, and
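
    As a generic illustration of class-level testing (this is not GenCAT's actual statistic), SNP-level z-statistics within a class can be combined through a quadratic form that accounts for the inter-SNP correlation structure; the toy correlation matrix and z-values below are invented:

      import numpy as np
      from scipy.stats import chi2

      def class_level_test(z, R):
          """Combine the SNP-level z-statistics of one class (e.g. one
          protein-coding gene) using the quadratic form z' R^{-1} z, which is
          chi-square with len(z) degrees of freedom under the null when R is
          the inter-SNP correlation (LD) matrix."""
          z = np.asarray(z, dtype=float)
          stat = z @ np.linalg.solve(R, z)
          return stat, chi2.sf(stat, df=len(z))

      # Toy class with three correlated SNPs
      R = np.array([[1.0, 0.4, 0.1], [0.4, 1.0, 0.3], [0.1, 0.3, 1.0]])
      print(class_level_test([2.1, 1.8, 0.4], R))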

  1. Quantum control using genetic algorithms in quantum communication: superdense coding

    International Nuclear Information System (INIS)

    Domínguez-Serna, Francisco; Rojas, Fernando

    2015-01-01

    We present a physical example of how quantum control with genetic algorithms is applied to implement the quantum superdense coding protocol. We studied a model consisting of two quantum dots with an electron with spin, including spin-orbit interaction. The electron and the spin become hybridized with the site, acquiring two degrees of freedom, spin and charge. The system has tunneling and site energies as time-dependent control parameters that are optimized by means of genetic algorithms to prepare a hybrid Bell-like state used as a transmission channel. This state is transformed to obtain any of the four Bell basis states, as required by the superdense protocol to transmit two bits of classical information. The control protocol is equivalent to implementing one of the quantum gates in the charge subsystem. Fidelities larger than 99.5% are achieved for the hybrid entangled state preparation and the superdense operations. (paper)
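
    The textbook superdense coding protocol that the optimized gates have to reproduce can be simulated in a few lines; this sketch ignores the quantum-dot physics and the genetic-algorithm optimization entirely:

      import numpy as np

      # Two-qubit computational basis: |00>, |01>, |10>, |11>
      I = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.array([[1, 0], [0, -1]])

      bell = np.array([1, 0, 0, 1]) / np.sqrt(2)          # (|00> + |11>)/sqrt(2)

      def encode(bits):
          """Sender applies I, X, Z or ZX to her half of the Bell pair to encode two classical bits."""
          op = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}[bits]
          return np.kron(op, I) @ bell

      # The four encoded states form the orthogonal Bell basis, so a Bell measurement recovers the bits.
      for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
          state = encode(bits)
          overlaps = [abs(np.dot(encode(other), state)) for other in [(0, 0), (0, 1), (1, 0), (1, 1)]]
          print(bits, np.round(overlaps, 3))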

  2. A symbiotic liaison between the genetic and epigenetic code

    Directory of Open Access Journals (Sweden)

    Holger eHeyn

    2014-05-01

    With rapid advances in sequencing technologies, we are undergoing a paradigm shift from hypothesis- to data-driven research. Genome-wide profiling efforts have given informative insights into biological processes; however, considering the wealth of variation, the major challenge remains their meaningful interpretation. In particular, sequence variation in non-coding contexts is often challenging to interpret. Here, data integration approaches for the identification of functional genetic variability represent a likely solution. For example, functional linkage analysis integrating genotype and expression data has determined regulatory quantitative trait loci (QTL) and proposed causal relationships. In addition to gene expression, epigenetic regulation, and specifically DNA methylation, was established as a highly valuable surrogate mark for functional variance of the genetic code. Epigenetic modification has served as a powerful mediator trait to elucidate mechanisms forming phenotypes in health and disease. In particular, integrative studies of genetic and DNA methylation data have already guided interpretation strategies for risk genotypes, but have also proved their value for physiological traits, such as natural human variation and aging. This Perspective seeks to illustrate the power of data integration in the genomic era, exemplified by DNA methylation quantitative trait loci (meQTLs). However, the model is further extendable to virtually all traceable molecular traits.

  3. Enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding for four-level holographic data storage systems

    Science.gov (United States)

    Kong, Gyuyeol; Choi, Sooyong

    2017-09-01

    An enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding is proposed for four-level holographic data storage systems. While the previous four-ary modulation codes focus on preventing maximum two-dimensional intersymbol interference patterns, the proposed four-ary modulation code aims at maximizing the coding gain for better bit error rate performance. To achieve significant coding gains from the four-ary modulation codes, we design a new 2/3 four-ary modulation code that enlarges the free distance on the trellis, found through extensive simulation. The free distance of the proposed four-ary modulation code is extended from 1.21 to 2.04 compared with that of the conventional four-ary modulation code. The simulation results show that the proposed four-ary modulation code achieves gains of more than 1 dB over the conventional four-ary modulation code.

  4. Time-Delay System Identification Using Genetic Algorithm

    DEFF Research Database (Denmark)

    Yang, Zhenyu; Seested, Glen Thane

    2013-01-01

    problem through an identification approach using the real coded Genetic Algorithm (GA). The desired FOPDT/SOPDT model is directly identified based on the measured system's input and output data. In order to evaluate the quality and performance of this GA-based approach, the proposed method is compared...

  5. The in-core fuel management code system for VVER reactors

    International Nuclear Information System (INIS)

    Cada, R.; Krysl, V.; Mikolas, P.; Sustek, J.; Svarny, J.

    2004-01-01

    The structure and methodology of a fuel management system for NPP VVER 1000 (NPP Temelin) and VVER 440 (NPP Dukovany) are described. The system is under development at SKODA JS a.s. and is followed by practical applications. The general objectives of the system are maximization of end-of-cycle reactivity, minimization of the fresh fuel inventory through minimization of the feed enrichment, and minimization of the burnable poison (BP) inventory. There are also safety-related constraints, in which minimization of power peaking plays a dominant role. The general structure of the system consists of preparation of input data for macrocode calculations, and algorithms (codes) for optimization of fuel loading, calculation of fuel enrichment and BP assignment. At present, core loading can be calculated (optimized) by a Tabu search algorithm (code ATHENA), a genetic algorithm (code Gen1) and a hybrid algorithm, a simplex procedure with application of the Tabu search algorithm on binary shuffling (code OPAL_B). The enrichment search is realized by application of the simplex algorithm (OPAL_B code), and BP assignment by the module BPASS and the simplex algorithm in the OPAL_B code. Calculations of real core loadings are presented and a comparison of the different optimization methods is provided. (author)

  6. The Search for Symmetries in the Genetic Code:

    Science.gov (United States)

    Antoneli, Fernando; Forger, Michael; Hornos, José Eduardo M.

    We give a full classification of the possible schemes for obtaining the distribution of multiplets observed in the standard genetic code by symmetry breaking in the context of finite groups, based on an extended notion of partial symmetry breaking that incorporates the intuitive idea of "freezing" first proposed by Francis Crick, which is given a precise mathematical meaning.

  7. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.

  8. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  9. Frozen Accident Pushing 50: Stereochemistry, Expansion, and Chance in the Evolution of the Genetic Code.

    Science.gov (United States)

    Koonin, Eugene V

    2017-05-23

    Nearly 50 years ago, Francis Crick propounded the frozen accident scenario for the evolution of the genetic code along with the hypothesis that the early translation system consisted primarily of RNA. Under the frozen accident perspective, the code is universal among modern life forms because any change in codon assignment would be highly deleterious. The frozen accident can be considered the default theory of code evolution because it does not imply any specific interactions between amino acids and the cognate codons or anticodons, or any particular properties of the code. The subsequent 49 years of code studies have elucidated notable features of the standard code, such as high robustness to errors, but failed to develop a compelling explanation for codon assignments. In particular, stereochemical affinity between amino acids and the cognate codons or anticodons does not seem to account for the origin and evolution of the code. Here, I expand Crick's hypothesis on RNA-only translation system by presenting evidence that this early translation already attained high fidelity that allowed protein evolution. I outline an experimentally testable scenario for the evolution of the code that combines a distinct version of the stereochemical hypothesis, in which amino acids are recognized via unique sites in the tertiary structure of proto-tRNAs, rather than by anticodons, expansion of the code via proto-tRNA duplication, and the frozen accident.

  10. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  11. A bar coding system for environmental projects

    International Nuclear Information System (INIS)

    Barber, R.B.; Hunt, B.J.; Burgess, G.M.

    1988-01-01

    This paper presents BeCode systems, a bar coding system which provides both nuclear and commercial clients with a data capture and custody management program that is accurate, timely, and beneficial to all levels of project operations. Using bar code identifiers is an essentially paperless and error-free method which provides more efficient delivery of data through its menu card-driven structure, which speeds collection of essential data for uploading to a compatible device. The effects of this sequence include real-time information for operator analysis, management review, audits, planning, scheduling, and cost control

  12. The Graph, Geometry and Symmetries of the Genetic Code with Hamming Metric

    Directory of Open Access Journals (Sweden)

    Reijer Lenstra

    2015-07-01

    The similarity patterns of the genetic code result from similar codons encoding similar messages. We develop a new mathematical model to analyze these patterns. The physicochemical characteristics of amino acids objectively quantify their differences and similarities; the Hamming metric does the same for the 64 codons of the codon set. (Hamming distances equal the number of different codon positions: AAA and AAC are at 1-distance; codons are maximally at 3-distance.) The CodonPolytope, a 9-dimensional geometric object, is spanned by 64 vertices that represent the codons, and the Euclidean distances between these vertices correspond one-to-one with intercodon Hamming distances. The CodonGraph represents the vertices and edges of the polytope; each edge equals a Hamming 1-distance. The mirror reflection symmetry group of the polytope is isomorphic to the largest permutation symmetry group of the codon set that preserves Hamming distances. These groups contain 82,944 symmetries. Many polytope symmetries coincide with the degeneracy and similarity patterns of the genetic code. These code symmetries are strongly related to the face structure of the polytope, with smaller faces displaying stronger code symmetries. Splitting the polytope stepwise into smaller faces models an early evolution of the code that generates this hierarchy of code symmetries. The canonical code represents a class of 41,472 codes with equivalent symmetries; a single class among an astronomical number of symmetry classes comprising all possible codes.
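
    The CodonGraph construction is easy to reproduce; a short sketch that enumerates the 64 codons and their Hamming 1-distance edges:

      from itertools import product

      codons = ["".join(t) for t in product("UCAG", repeat=3)]   # all 64 codons

      def hamming(a, b):
          """Number of positions at which two codons differ."""
          return sum(x != y for x, y in zip(a, b))

      # Edges of the CodonGraph: pairs of codons at Hamming distance 1
      edges = [(a, b) for i, a in enumerate(codons) for b in codons[i + 1:] if hamming(a, b) == 1]
      print(len(codons), "vertices,", len(edges), "edges")   # 64 vertices, 288 edges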

  13. Amino acid fermentation at the origin of the genetic code.

    Science.gov (United States)

    de Vladar, Harold P

    2012-02-10

    There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acids pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments

  14. Multi-level trellis coded modulation and multi-stage decoding

    Science.gov (United States)

    Costello, Daniel J., Jr.; Wu, Jiantian; Lin, Shu

    1990-01-01

    Several constructions for multi-level trellis codes are presented and many codes with better performance than previously known codes are found. These codes provide a flexible trade-off between coding gain, decoding complexity, and decoding delay. New multi-level trellis coded modulation schemes using generalized set partitioning methods are developed for Quadrature Amplitude Modulation (QAM) and Phase Shift Keying (PSK) signal sets. New rotationally invariant multi-level trellis codes which can be combined with differential encoding to resolve phase ambiguity are presented.

  15. Junk DNA and the long non-coding RNA twist in cancer genetics

    NARCIS (Netherlands)

    H. Ling (Hui); K. Vincent; M. Pichler; R. Fodde (Riccardo); I. Berindan-Neagoe (Ioana); F.J. Slack (Frank); G.A. Calin (George)

    2015-01-01

    textabstractThe central dogma of molecular biology states that the flow of genetic information moves from DNA to RNA to protein. However, in the last decade this dogma has been challenged by new findings on non-coding RNAs (ncRNAs) such as microRNAs (miRNAs). More recently, long non-coding RNAs

  16. Development of a BWR loading pattern design system based on modified genetic algorithms and knowledge

    International Nuclear Information System (INIS)

    Martin-del-Campo, Cecilia; Francois, Juan Luis; Avendano, Linda; Gonzalez, Mario

    2004-01-01

    An optimization system based on Genetic Algorithms (GAs), in combination with expert knowledge coded in heuristic rules, was developed for the design of optimized boiling water reactor (BWR) fuel loading patterns. The system was coded in a computer program named Loading Pattern Optimization System based on Genetic Algorithms, in which the optimization code uses GAs to select candidate solutions and the core simulator code CM-PRESTO to evaluate them. A multi-objective function was built to maximize the cycle energy length while satisfying power and reactivity constraints used as BWR design parameters. Heuristic rules were applied to satisfy standard fuel management recommendations such as the Control Cell Core and Low Leakage loading strategies, and octant symmetry. To test the system performance, an optimized cycle was designed and compared against an actual operating cycle of Laguna Verde Nuclear Power Plant, Unit I

  17. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results are an application of previous work on the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, in this work it is found that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it could also have been for a primigenious doublet code. The integer partitions associated with patterns of synonymous codons are also studied, and it is shown, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
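
    One reading of the set-partition argument can be checked numerically: for a uniformly random partition of the 60 amino-acid-coding codons (64 codons minus four stop codons), the most probable number of blocks is given by the mode of the Stirling numbers of the second kind S(60, k). Whether this matches the paper's exact randomization scheme is an assumption:

      def stirling2_row(n):
          """S(n, k) for k = 0..n via the recurrence S(n,k) = k*S(n-1,k) + S(n-1,k-1)."""
          row = [1]                      # S(0,0) = 1
          for m in range(1, n + 1):
              new = [0] * (m + 1)
              for k in range(1, m + 1):
                  new[k] = k * row[k] if k < len(row) else 0
                  new[k] += row[k - 1]
              row = new
          return row

      n = 64 - 4                         # 60 amino-acid-coding codons with four stop codons
      S = stirling2_row(n)
      k_star = max(range(n + 1), key=lambda k: S[k])
      print(k_star)                      # most probable block count; close to the 20 standard amino acids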

  18. [Direct genetic manipulation and criminal code in Venezuela: absolute criminal law void?].

    Science.gov (United States)

    Cermeño Zambrano, Fernando G De J

    2002-01-01

    The judicial regulation of genetic biotechnology applied to the human genome is currently of great relevance in Venezuela due to the drafting of an innovative bioethical law in the country's parliament. This article highlights the constitutional provisions of Venezuela's 1999 Constitution regarding this subject, as they establish the framework within which this matter will be legally regulated. The approach this article takes towards genetic biotechnology applied to the human genome takes into account Venezuelan penal law and highlights the violent genetic manipulations that have criminal relevance. Genetic biotechnology applied to the human genome gains further relevance as a consequence of the reform of the Venezuelan Penal Code discussed by the country's National Assembly. Therefore, a concise study of the country's penal code is made in this article to better understand which legally protected interests are covered by Venezuelan penal legislation. This last step enables us to identify the penal tools Venezuela can count on to confront direct genetic manipulations. We equally indicate the existing punitive loophole, which should be closed by the penal legislator. In conclusion, this essay concerns criminal policy with regard to direct genetic manipulations of the human genome that have not been criminalized in Venezuelan law, thus revealing a genetic biotechnology paradise.

  19. Decoding the non-coding genome: elucidating genetic risk outside the coding genome.

    Science.gov (United States)

    Barr, C L; Misener, V L

    2016-01-01

    Current evidence emerging from genome-wide association studies indicates that the genetic underpinnings of complex traits are likely attributable to genetic variation that changes gene expression, rather than (or in combination with) variation that changes protein-coding sequences. This is particularly compelling with respect to psychiatric disorders, as genetic changes in regulatory regions may result in differential transcriptional responses to developmental cues and environmental/psychosocial stressors. Until recently, however, the link between transcriptional regulation and psychiatric genetic risk has been understudied. Multiple obstacles have contributed to the paucity of research in this area, including challenges in identifying the positions of remote (distal from the promoter) regulatory elements (e.g. enhancers) and their target genes and the underrepresentation of neural cell types and brain tissues in epigenome projects - the availability of high-quality brain tissues for epigenetic and transcriptome profiling, particularly for the adolescent and developing brain, has been limited. Further challenges have arisen in the prediction and testing of the functional impact of DNA variation with respect to multiple aspects of transcriptional control, including regulatory-element interaction (e.g. between enhancers and promoters), transcription factor binding and DNA methylation. Further, the brain has uncommon DNA-methylation marks with unique genomic distributions not found in other tissues - current evidence suggests the involvement of non-CG methylation and 5-hydroxymethylation in neurodevelopmental processes but much remains unknown. We review here knowledge gaps as well as both technological and resource obstacles that will need to be overcome in order to elucidate the involvement of brain-relevant gene-regulatory variants in genetic risk for psychiatric disorders. © 2015 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  20. An Automatic Instruction-Level Parallelization of Machine Code

    Directory of Open Access Journals (Sweden)

    MARINKOVIC, V.

    2018-02-01

    Prevailing multicores and novel manycores have made parallelization of embedded software, which is still written as sequential code, a great challenge of the modern day. In this paper, automatic code parallelization is considered, focusing on developing a parallelization tool at the binary level as well as on the validation of this approach. A novel instruction-level parallelization algorithm for assembly code is developed; it uses the register names after SSA conversion to find independent blocks of code and then schedules the independent blocks using METIS to achieve good load balance. Sequential consistency is verified, and validation is done by measuring the program execution time on the target architecture. Great speedup, taken as the performance measure in the validation process, and optimal load balancing are achieved for multicore RISC processors with 2 to 16 cores (e.g., MIPS, MicroBlaze). In particular, for 16 cores, the average speedup is 7.92x, while in some cases it reaches 14x. The approach to automatic parallelization provided by this paper is useful to researchers and developers in the area of parallelization as the basis for further optimizations, as the back-end of a compiler, or as a code parallelization tool for an embedded system.
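
    A toy sketch of the underlying idea of grouping register-independent instructions into blocks and balancing the blocks across cores; the four pseudo-instructions, the conflict rule and the greedy balancer (standing in for METIS) are all illustrative assumptions:

      # Toy illustration: group instructions into register-independent blocks,
      # then assign blocks to cores (the paper uses SSA-renamed registers and
      # METIS partitioning; this sketch uses plain names and a greedy balancer).
      instrs = [
          ("r1", {"r2", "r3"}),   # 0: r1 = r2 + r3
          ("r4", {"r5"}),         # 1: r4 = r5 << 1
          ("r6", {"r1"}),         # 2: r6 = r1 * 2   (depends on 0)
          ("r7", {"r4"}),         # 3: r7 = r4 - 1   (depends on 1)
      ]

      parent = list(range(len(instrs)))
      def find(x):
          while parent[x] != x:
              parent[x] = parent[parent[x]]
              x = parent[x]
          return x

      def conflict(i, j):
          """True if instructions i and j touch a common register with at least one write."""
          di, si = instrs[i]; dj, sj = instrs[j]
          return di == dj or di in sj or dj in si

      # Union instructions that conflict: the resulting sets are independent blocks.
      for i in range(len(instrs)):
          for j in range(i + 1, len(instrs)):
              if conflict(i, j):
                  parent[find(j)] = find(i)

      blocks = {}
      for i in range(len(instrs)):
          blocks.setdefault(find(i), []).append(i)
      blocks = list(blocks.values())

      # Greedy assignment of blocks to 2 cores by size (stand-in for METIS partitioning)
      cores = [[], []]
      for blk in sorted(blocks, key=len, reverse=True):
          min(cores, key=len).extend(blk)
      print(blocks, cores)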

  1. Tandem Mirror Reactor Systems Code (Version I)

    International Nuclear Information System (INIS)

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.

    1985-09-01

A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  2. Towards A Genetic Business Code For Growth in the South African Transport Industry

    Directory of Open Access Journals (Sweden)

    J.H. Vermeulen

    2003-11-01

Full Text Available As with each living organism, it is proposed that an organisation possesses a genetic code. In the fast-changing business environment it would be invaluable to know what constitutes organisational growth and success in terms of such a code. To identify this genetic code a quantitative methodological framework, supplemented by a qualitative approach, was used and the views of top management in the Transport Industry were solicited. The Repertory Grid was used as the primary data-collection method. Through a phased data-analysis process an integrated profile of first- and second-order constructs, and opposite poles, was compiled. By utilising deductive and inductive strategies three strands of a Genetic Business Growth Code were identified, namely a Leadership Strand, Organisational Architecture Strand and Internal Orientation Strand. The study confirmed the value of a Genetic Business Code for growth in the Transport Industry. Opsomming (translated from Afrikaans): It is proposed that an organisation, like every living organism, possesses a genetic code. In the fast-changing business environment it would be invaluable to know what causes organisational growth and success. A quantitative methodological framework, supplemented by a qualitative approach, was used to identify this genetic code, and the views of top management in the Transport Industry were gathered using the "Repertory Grid" as the principal data-collection method. An integrated profile of first- and second-order constructs, with their opposite poles, was compiled. Three strands of a Genetic Business Growth Code, namely a Leadership Strand, an Organisational Architecture Strand and an Internal Orientation Strand, were identified using deductive and inductive strategies. The study confirms the value of a Genetic Business Code for growth in the Transport Industry.

  3. Multi-stage decoding of multi-level modulation codes

    Science.gov (United States)

    Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.

    1991-01-01

Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small, only a fraction of a dB loss in signal-to-noise ratio at a bit error rate (BER) of 10(exp -6).

  4. Amino acid fermentation at the origin of the genetic code

    Science.gov (United States)

    2012-01-01

There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments.

  5. Amino acid fermentation at the origin of the genetic code

    Directory of Open Access Journals (Sweden)

    de Vladar Harold P

    2012-02-01

Full Text Available Abstract There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acid pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments.

  6. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    Science.gov (United States)

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EA) seem to be one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of the algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the costs of amino acid replacement with regard to their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, the simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure
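
    For readers unfamiliar with the operator named above, the following is a minimal sketch of position-based crossover under the most restricted model, assuming that the canonical block structure is kept fixed so that a candidate code is simply a permutation assigning the 20 amino acids to 20 fixed codon groups; the paper's fitness function (polarity-based replacement costs) and parameter settings are not reproduced:

import random

AMINO_ACIDS = list("ARNDCQEGHILKMFPSTWYV")  # one-letter codes, 20 amino acids

def position_based_crossover(parent1, parent2, rate=0.5):
    """Build a child permutation: keep a random subset of positions from
    parent1 and fill the remaining positions with the missing amino acids
    in the order they appear in parent2."""
    n = len(parent1)
    keep = [random.random() < rate for _ in range(n)]
    child = [parent1[i] if keep[i] else None for i in range(n)]
    used = {aa for aa in child if aa is not None}
    fill = iter(aa for aa in parent2 if aa not in used)
    return [aa if aa is not None else next(fill) for aa in child]

p1 = random.sample(AMINO_ACIDS, len(AMINO_ACIDS))
p2 = random.sample(AMINO_ACIDS, len(AMINO_ACIDS))
child = position_based_crossover(p1, p2)
assert sorted(child) == sorted(AMINO_ACIDS)  # still a valid permutation
print(child)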

  7. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  8. A Novel Real-coded Quantum-inspired Genetic Algorithm and Its Application in Data Reconciliation

    Directory of Open Access Journals (Sweden)

    Gao Lin

    2012-06-01

Full Text Available Traditional quantum-inspired genetic algorithms (QGA) have drawbacks such as premature convergence, heavy computational cost, a complicated coding and decoding process, etc. In this paper, a novel real-coded quantum-inspired genetic algorithm is proposed based on interval-division thinking. Detailed comparisons with some similar approaches on standard benchmark functions test the validity of the proposed algorithm. Besides, the proposed algorithm is used in two typical nonlinear data reconciliation problems (a distilling process and an extraction process), and simulation results show its efficiency in nonlinear data reconciliation problems.

  9. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    Science.gov (United States)

    De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.

    2013-02-01

We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need for human intervention and biasing. The high level of automation makes it an ideal tool to use on larger sets of observed data.

  10. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)

  11. Two-Level Semantics and Code Generation

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1988-01-01

A two-level denotational metalanguage that is suitable for defining the semantics of Pascal-like languages is presented. The two levels allow for an explicit distinction between computations taking place at compile-time and computations taking place at run-time. While this distinction is perhaps not absolutely necessary for describing the input-output semantics of programming languages, it is necessary when issues such as data flow analysis and code generation are considered. For an example stack-machine, the authors show how to generate code for the run-time computations and still perform the compile...

  12. Numeral series hidden in the distribution of atomic mass of amino acids to codon domains in the genetic code.

    Science.gov (United States)

    Wohlin, Åsa

    2015-03-21

The distribution of codons in the nearly universal genetic code is a long discussed issue. At the atomic level, the numeral series 2x^2 (x = 5-0) lies behind electron shells and orbitals. Numeral series appear in formulas for spectral lines of hydrogen. The question here was if some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3 times 10^2 revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division on atom kinds, further with main 3rd base groups, backbone chains and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series in a dynamic way may have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x^2 series times a factor of 16 appeared as a conceivable underlying level, both for the atomic mass and charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent 2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
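
    The two series referred to above are easy to tabulate. The snippet below simply prints 2x^2 for x = 5 to 0 (the series behind electron shells, and the same series times the factor 16 mentioned in the text) alongside the proposed x^(2/3) times 10^2 series; it makes no claim about the biological interpretation:

# Tabulate the numeral series discussed in the abstract (illustration only).
for x in range(5, -1, -1):
    shell = 2 * x ** 2                 # the 2x^2 series behind electron shells
    print(f"x={x}:  2x^2={shell:3d}   16*2x^2={16 * shell:4d}   "
          f"x^(2/3)*100={100 * x ** (2 / 3):7.2f}")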

  13. An enhancement of selection and crossover operations in real-coded genetic algorithm for large-dimensionality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Noh Sung; Lee, Jongsoo [Yonsei University, Seoul (Korea, Republic of)

    2016-01-15

The present study aims to implement a new selection method and a novel crossover operation in a real-coded genetic algorithm. The proposed selection method facilitates the establishment of a successively evolved population by combining several subpopulations: an elitist subpopulation, an offspring subpopulation and a mutated subpopulation. A probabilistic crossover is performed based on a measure of probabilistic distance between the individuals. The concept of ‘allowance’ is suggested to describe the level of variance in the crossover operation. A number of nonlinear/non-convex functions and engineering optimization problems are explored to verify the capabilities of the proposed strategies. The results are compared with those obtained from other genetic and nature-inspired algorithms.
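
    A minimal sketch of the population-assembly idea described above, assuming a plain real-coded representation and a sphere function as a stand-in objective; the paper's probabilistic-distance crossover and 'allowance' measure are not reproduced, and an ordinary blend crossover is used instead:

import random

def fitness(x):             # stand-in objective: sphere function (minimise)
    return sum(v * v for v in x)

def blend_crossover(a, b):  # simple real-coded crossover (not the paper's operator)
    return [ai + random.random() * (bi - ai) for ai, bi in zip(a, b)]

def mutate(x, sigma=0.1):
    return [v + random.gauss(0, sigma) for v in x]

def next_generation(pop, n_elite=2, n_offspring=6, n_mutated=4):
    """Assemble the next population from three subpopulations:
    elites, crossover offspring and mutated copies."""
    ranked = sorted(pop, key=fitness)
    elites = ranked[:n_elite]
    offspring = [blend_crossover(*random.sample(ranked[: len(ranked) // 2], 2))
                 for _ in range(n_offspring)]
    mutated = [mutate(random.choice(ranked)) for _ in range(n_mutated)]
    return elites + offspring + mutated

population = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(12)]
for _ in range(100):
    population = next_generation(population)
print(min(fitness(x) for x in population))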

  14. A Chip-Level BSOR-Based Linear GSIC Multiuser Detector for Long-Code CDMA Systems

    Directory of Open Access Journals (Sweden)

    Benyoucef M

    2007-01-01

Full Text Available We introduce a chip-level linear group-wise successive interference cancellation (GSIC) multiuser structure that is asymptotically equivalent to block successive over-relaxation (BSOR) iteration, which is known to outperform the conventional block Gauss-Seidel iteration by an order of magnitude in terms of convergence speed. The main advantage of the proposed scheme is that it uses the spreading codes directly instead of the cross-correlation matrix and thus does not require the calculation of the cross-correlation matrix (which requires 2NK^2 floating point operations (flops), where N is the processing gain and K is the number of users), which significantly reduces the overall computational complexity. Thus it is suitable for long-code CDMA systems such as IS-95 and UMTS, where the cross-correlation matrix changes every symbol. We study the convergence behavior of the proposed scheme using two approaches and prove that it converges to the decorrelator detector if the over-relaxation factor is in the interval ]0, 2[. Simulation results are in excellent agreement with theory.

  15. A Chip-Level BSOR-Based Linear GSIC Multiuser Detector for Long-Code CDMA Systems

    Directory of Open Access Journals (Sweden)

    M. Benyoucef

    2008-01-01

Full Text Available We introduce a chip-level linear group-wise successive interference cancellation (GSIC) multiuser structure that is asymptotically equivalent to block successive over-relaxation (BSOR) iteration, which is known to outperform the conventional block Gauss-Seidel iteration by an order of magnitude in terms of convergence speed. The main advantage of the proposed scheme is that it uses the spreading codes directly instead of the cross-correlation matrix and thus does not require the calculation of the cross-correlation matrix (which requires 2NK^2 floating point operations (flops), where N is the processing gain and K is the number of users), which significantly reduces the overall computational complexity. Thus it is suitable for long-code CDMA systems such as IS-95 and UMTS, where the cross-correlation matrix changes every symbol. We study the convergence behavior of the proposed scheme using two approaches and prove that it converges to the decorrelator detector if the over-relaxation factor is in the interval ]0, 2[. Simulation results are in excellent agreement with theory.
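
    The block SOR iteration to which the detector is asymptotically equivalent can be sketched as follows. For clarity this illustration forms the cross-correlation matrix R = S^T S explicitly, whereas the scheme in the paper works directly on the spreading codes precisely to avoid that cost; the spreading codes, group size and over-relaxation factor below are hypothetical:

import numpy as np

def block_sor(R, y, group_size, omega=1.2, n_iter=50):
    """Block successive over-relaxation for R x = y; converges to the
    decorrelator solution x = R^{-1} y when 0 < omega < 2."""
    K = len(y)
    groups = [np.arange(s, min(s + group_size, K)) for s in range(0, K, group_size)]
    x = np.zeros(K)
    for _ in range(n_iter):
        for idx in groups:
            Rgg = R[np.ix_(idx, idx)]
            # Residual of this group, using the latest values of all other groups.
            r = y[idx] - R[idx] @ x + Rgg @ x[idx]
            x[idx] = (1 - omega) * x[idx] + omega * np.linalg.solve(Rgg, r)
    return x

rng = np.random.default_rng(0)
N, K = 31, 8                                             # processing gain, users (hypothetical)
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # random spreading codes
R = S.T @ S                                              # cross-correlation matrix
y = R @ rng.choice([-1.0, 1.0], size=K)                  # noise-free matched-filter outputs
print(np.allclose(block_sor(R, y, group_size=2), np.linalg.solve(R, y)))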

  16. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  17. Multi-stage decoding for multi-level block modulation codes

    Science.gov (United States)

    Lin, Shu

    1991-01-01

In this paper, we investigate various types of multi-stage decoding for multi-level block modulation codes, in which the decoding of a component code at each stage can be either soft-decision or hard-decision, maximum likelihood or bounded-distance. Error performance of codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. Based on our study and computation results, we find that, if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, we find that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a block error probability of 10(exp -6). Multi-stage decoding of multi-level modulation codes really offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.

  18. Bi-level image compression with tree coding

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1996-01-01

Presently, tree coders are the best bi-level image coders. The current ISO standard, JBIG, is a good example. By organising code length calculations properly a vast number of possible models (trees) can be investigated within reasonable time prior to generating code. Three general-purpose coders are constructed by this principle. A multi-pass free tree coding scheme produces superior compression results for all test images. A multi-pass fast free template coding scheme produces much better results than JBIG for difficult images, such as halftonings. Rissanen's algorithm `Context' is presented in a new...

  19. Lossy/lossless coding of bi-level images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1997-01-01

Summary form only given. We present improvements to a general type of lossless, lossy, and refinement coding of bi-level images (Martins and Forchhammer, 1996). Loss is introduced by flipping pixels. The pixels are coded using arithmetic coding of conditional probabilities obtained using a template, as is known from JBIG and proposed in JBIG-2 (Martins and Forchhammer). Our new state-of-the-art results are obtained using the more general free tree instead of a template. Also we introduce multiple refinement template coding. The lossy algorithm is analogous to the greedy `rate...

  20. Genetic learning in rule-based and neural systems

    Science.gov (United States)

    Smith, Robert E.

    1993-01-01

The design of neural networks and fuzzy systems can involve complex, nonlinear, and ill-conditioned optimization problems. Often, traditional optimization schemes are inadequate or inapplicable for such tasks. Genetic Algorithms (GA's) are a class of optimization procedures whose mechanics are based on those of natural genetics. Mathematical arguments show how GAs bring substantial computational leverage to search problems, without requiring the mathematical characteristics often necessary for traditional optimization schemes (e.g., modality, continuity, availability of derivative information, etc.). GA's have proven effective in a variety of search tasks that arise in neural networks and fuzzy systems. This presentation begins by introducing the mechanism and theoretical underpinnings of GA's. GA's are then related to a class of rule-based machine learning systems called learning classifier systems (LCS's). An LCS implements a low-level production-system that uses a GA as its primary rule discovery mechanism. This presentation illustrates how, despite its rule-based framework, an LCS can be thought of as a competitive neural network. Neural network simulator code for an LCS is presented. In this context, the GA is doing more than optimizing an objective function. It is searching for an ecology of hidden nodes with limited connectivity. The GA attempts to evolve this ecology such that effective neural network performance results. The GA is particularly well adapted to this task, given its naturally-inspired basis. The LCS/neural network analogy extends itself to other, more traditional neural networks. Conclusions to the presentation discuss the implications of using GA's in ecological search problems that arise in neural and fuzzy systems.
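
    A minimal sketch of the GA mechanics described above, applied to discovering classifier-style rules (ternary strings over 0/1/# matched against binary inputs); this illustrates the general idea only and is not the simulator code referred to in the presentation:

import random

BITS = 6
# Toy target concept: output 1 when the first two input bits are both 1.
def target(x):
    return int(x[0] == "1" and x[1] == "1")

def matches(rule, x):            # '#' is a wildcard, as in classifier systems
    return all(r in ("#", xi) for r, xi in zip(rule, x))

def fitness(rule, samples):
    """Reward rules that fire exactly on the positive examples."""
    return sum(1 for x in samples if matches(rule, x) == target(x))

def crossover(a, b):
    cut = random.randint(1, BITS - 1)
    return a[:cut] + b[cut:]

def mutate(rule, p=0.1):
    return "".join(random.choice("01#") if random.random() < p else c for c in rule)

samples = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(64)]
pop = ["".join(random.choice("01#") for _ in range(BITS)) for _ in range(30)]
for _ in range(60):
    pop.sort(key=lambda r: -fitness(r, samples))
    parents = pop[:10]                       # truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(20)]
best = max(pop, key=lambda r: fitness(r, samples))
print(best, fitness(best, samples), "/", len(samples))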

  1. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  2. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  3. Optimizing survivability of multi-state systems with multi-level protection by multi-processor genetic algorithm

    International Nuclear Information System (INIS)

    Levitin, Gregory; Dai Yuanshun; Xie Min; Leng Poh, Kim

    2003-01-01

In this paper we consider vulnerable systems which can have different states corresponding to different combinations of available elements composing the system. Each state can be characterized by a performance rate, which is the quantitative measure of a system's ability to perform its task. Both the impact of external factors (stress) and internal causes (failures) affect system survivability, which is determined as the probability of meeting a given demand. In order to increase the survivability of the system, a multi-level protection is applied to its subsystems. This means that a subsystem and its inner level of protection are in their turn protected by the protection of an outer level. This double-protected subsystem has its outer protection and so forth. In such systems, the protected subsystems can be destroyed only if all of the levels of their protection are destroyed. Each level of protection can be destroyed only if all of the outer levels of protection are destroyed. We formulate the problem of finding the structure of a series-parallel multi-state system (including the choice of system elements, the structure of the multi-level protection and the choice of protection methods) in order to achieve a desired level of system survivability at minimal cost. An algorithm based on the universal generating function method is used for determination of the system survivability. A multi-processor version of a genetic algorithm is used as the optimization tool in order to solve the structure optimization problem. An application example is presented to illustrate the procedure presented in this paper.
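
    A sketch of the universal generating function technique mentioned above, assuming a flow-type performance measure in which parallel elements add their performance rates and a series connection is limited by the smallest rate; the element state distributions and demand below are hypothetical, and the multi-level protection model and genetic algorithm are not reproduced:

from itertools import product

def compose(u1, u2, op):
    """Compose two UGFs (dicts mapping performance rate -> probability)."""
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

def parallel(u1, u2):          # parallel elements: capacities add
    return compose(u1, u2, lambda a, b: a + b)

def series(u1, u2):            # series connection: limited by the bottleneck
    return compose(u1, u2, min)

def survivability(u, demand):
    """Probability that the system performance rate meets the demand."""
    return sum(p for g, p in u.items() if g >= demand)

# Hypothetical three-element system: e1 and e2 in parallel, followed by e3.
e1 = {100: 0.80, 60: 0.15, 0: 0.05}
e2 = {100: 0.70, 50: 0.20, 0: 0.10}
e3 = {150: 0.90, 0: 0.10}
system = series(parallel(e1, e2), e3)
print(f"P(performance >= 120) = {survivability(system, demand=120):.4f}")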

  4. On Francis Crick, the genetic code, and a clever kid.

    Science.gov (United States)

    Goldstein, Bob

    2018-04-02

    A few years ago, Francis Crick's son told me a story that I can't get out of my mind. I had contacted Michael Crick by email while digging through the background of the researchers who had cracked the genetic code in the 1960s. Francis had died in 2004, and I was contacting some of the people who knew him when he was struggling to decipher the code. Francis didn't appear to struggle often - he is known mostly for his successes - and, as it turns out, this one well-known struggle may have had a clue sitting just barely out of sight. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. From chemical metabolism to life: the origin of the genetic coding process

    Directory of Open Access Journals (Sweden)

    Antoine Danchin

    2017-06-01

Full Text Available Looking for origins is so much rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question to account for life is to understand how chemical metabolism that began with amino acids progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces to carry over the basic metabolic pathways that drive the pursuit of life.

  6. Genetic optimization of steam multi-turbines system

    International Nuclear Information System (INIS)

    Olszewski, Pawel

    2014-01-01

Optimization analysis of a partially loaded cogeneration, multiple-stage steam turbine system was numerically investigated using an own-developed code (C++). The system can be controlled by the following variables: fresh steam temperature, pressure, and flow rates through all stages in the steam turbines. Five strategies, four thermodynamic and one economic, which quantify system operation, were defined and discussed as optimization functions. The mathematical model of the steam turbines calculates steam properties according to the formulation proposed by the International Association for the Properties of Water and Steam. The genetic algorithm GENOCOP was implemented as the solving engine for the non-linear problem with constraint handling. Using the formulated methodology, an example solution for a partially loaded system composed of five steam turbines (30 input variables) with different characteristics was obtained for the five strategies. The genetic algorithm found multiple solutions (various input parameter sets) giving similar overall results. In a real application this allows for appropriate scheduling of machine operation that would give an equable time load on every system component. Also, based on these results, three strategies were chosen as the most complex: the first thermodynamic law energy efficiency maximization, exergy efficiency maximization, and total equivalent energy minimization. These strategies can be successfully used in the optimization of real cogeneration applications. - Highlights: • A genetic optimization model for a set of five various steam turbines was presented. • Four various thermodynamic optimization strategies were proposed and discussed. • The influence of operational parameters (steam pressure, temperature, flow) was examined. • The genetic algorithm generated optimal solutions giving the best estimator values. • It has been found that a similar energy effect can be obtained for various inputs

  7. QR codes: next level of social media.

    Science.gov (United States)

    Gottesman, Wesley; Baum, Neil

    2013-01-01

The QR code, which is short for quick response code, system was invented in Japan for the auto industry. Its purpose was to track vehicles during manufacture; it was designed to allow high-speed component scanning. Now the scanning can be easily accomplished via cell phone, making the technology useful and within reach of your patients. There are numerous applications for QR codes in the contemporary medical practice. This article describes QR codes and how they might be applied for marketing and practice management.

  8. The Calculation of Flooding Level using CFX Code

    International Nuclear Information System (INIS)

    Oh, Seo Bin; Kim, Keon Yeop; Lee, Hyung Ho

    2015-01-01

    The plant design should consider internal flooding by postulated pipe ruptures, component failures, actuation of spray systems, and improper system alignment. The flooding causes failure of safety-related equipment and affects the integrity of the structure. The safety-related equipment should be installed above the flood level for protection against flooding effects. Conservative estimates of the flood level are important when a DBA occurs. The flooding level can be calculated simply applying Bernoulli's equation. However, in this study, a realistic calculation is performed with ANSYS CFX code. In calculation with CFX, air-core vortex phenomena, and turbulent flow can be simulated, which cannot be calculated analytically. The flooding level is evaluated by analytical calculation and CFX analysis for an assumed condition. The flood level is calculated as 0.71m and 1.1m analytically and with CFX simulation, respectively. Comparing the analytical calculation and simulation, they are similar, but the analytical calculation is not conservative. There are many factors reducing the drainage capacity such as air-core vortex, intake of air, and turbulent flow. Therefore, in case of flood level evaluation by analytical calculation, a sufficient safety margin should be considered
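
    The analytical calculation can be illustrated with a back-of-the-envelope sketch that balances the break inflow against drainage through a floor opening, using the orifice form of Bernoulli's equation, Q = Cd*A*sqrt(2*g*h); the inflow, drain area and discharge coefficient below are hypothetical and are not the values used in the study:

from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def steady_flood_level(q_in, drain_area, cd=0.6):
    """Water depth h at which drainage Cd*A*sqrt(2*g*h) equals the inflow q_in.

    q_in       : inflow from the postulated break, m^3/s
    drain_area : total area of the floor drains, m^2
    cd         : discharge coefficient of the openings
    """
    return (q_in / (cd * drain_area)) ** 2 / (2.0 * G)

# Hypothetical numbers for illustration only.
print(f"flood level = {steady_flood_level(q_in=0.025, drain_area=0.01, cd=0.6):.2f} m")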

  9. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover the hardware code generated by the HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  10. SYMBOL LEVEL DECODING FOR DUO-BINARY TURBO CODES

    Directory of Open Access Journals (Sweden)

    Yogesh Beeharry

    2017-05-01

    Full Text Available This paper investigates the performance of three different symbol level decoding algorithms for Duo-Binary Turbo codes. Explicit details of the computations involved in the three decoding techniques, and a computational complexity analysis are given. Simulation results with different couple lengths, code-rates, and QPSK modulation reveal that the symbol level decoding with bit-level information outperforms the symbol level decoding by 0.1 dB on average in the error floor region. Moreover, a complexity analysis reveals that symbol level decoding with bit-level information reduces the decoding complexity by 19.6 % in terms of the total number of computations required for each half-iteration as compared to symbol level decoding.

  11. [Data coding in the Israeli healthcare system - do choices provide the answers to our system's needs?].

    Science.gov (United States)

    Zelingher, Julian; Ash, Nachman

    2013-05-01

The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors of the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding systems for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding

  12. ESCADRE and ICARE code systems

    International Nuclear Information System (INIS)

    Reocreux, M.; Gauvain, J.

    1992-01-01

The French severe accident code development program is following two parallel approaches: the first one deals with ''integral codes'', which are designed to give immediate engineering answers; the second one follows a more mechanistic way, in order to have the capability of detailed analysis of experiments, to get a better understanding of the scaling problem and to reach a better confidence in plant calculations. In the first approach a complete system has been developed and is being used for practical cases: this is the ESCADRE system. In the second approach, a set of codes dealing first with the primary circuit is being developed: a mechanistic core degradation code, ICARE, has been issued and is being coupled with the advanced thermalhydraulic code CATHARE. Fission product codes have also been coupled to CATHARE. The ''integral'' ESCADRE system and the mechanistic ICARE and associated codes are described. Their main characteristics are reviewed and the status of their development and assessment given. Future studies are finally discussed. 36 refs, 4 figs, 1 tab

  13. The CORSYS neutronics code system

    International Nuclear Information System (INIS)

    Caner, M.; Krumbein, A.D.; Saphier, D.; Shapira, M.

    1994-01-01

The purpose of this work is to assemble a code package for LWR core physics including coupled neutronics, burnup and thermal hydraulics. The CORSYS system is built around the cell code WIMS (for group microscopic cross-section calculations) and the 3-dimensional diffusion code CITATION (for burnup and fuel management). We are implementing such a system on an IBM RS-6000 workstation. The code was tested with a simplified model of the Zion Unit 2 PWR. (authors). 6 refs., 8 figs., 1 tab.

  14. Development of a computer code for low-and intermediate-level radioactive waste disposal safety assessment

    International Nuclear Information System (INIS)

    Park, J. W.; Kim, C. L.; Lee, E. Y.; Lee, Y. M.; Kang, C. H.; Zhou, W.; Kozak, M. W.

    2002-01-01

A safety assessment code, called SAGE (Safety Assessment Groundwater Evaluation), has been developed to describe post-closure radionuclide releases and potential radiological doses for low- and intermediate-level radioactive waste (LILW) disposal in an engineered vault facility in Korea. The conceptual model implemented in the code is focused on the release of radionuclides from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. The radionuclide transport equations are solved by spatially discretizing the disposal system into a series of compartments. Mass transfer between compartments is by diffusion/dispersion and advection. In all compartments, radionuclides are decayed either as a single-member chain or as multi-member chains. The biosphere is represented as a set of steady-state, radionuclide-specific pathway dose conversion factors that are multiplied by the appropriate release rate from the far field for each pathway. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface. An application is presented, which is compared against safety assessment results from other computer codes, to benchmark the reliability of the system-level conceptual modeling of the code.
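
    A highly simplified sketch of the compartment approach described above: mass of a single radionuclide in a chain of compartments, with first-order advective transfer between neighbouring compartments and radioactive decay, integrated with an explicit Euler step. The rates, decay constant and compartment count are hypothetical, and none of the SAGE models (degrading barriers, dispersion, dose conversion factors) are reproduced:

def simulate(n_comp=4, k_transfer=0.05, lam=0.01, m0=1.0, dt=0.5, t_end=200.0):
    """Track radionuclide mass moving through a chain of compartments.

    k_transfer : first-order advective transfer rate to the next compartment (1/yr)
    lam        : radioactive decay constant (1/yr)
    Returns (cumulative mass released from the last compartment, final masses).
    """
    m = [0.0] * n_comp
    m[0] = m0                      # initial inventory in the near field
    released = 0.0
    t = 0.0
    while t < t_end:
        out = [k_transfer * mi * dt for mi in m]      # advective outflow per step
        for i in range(n_comp):
            m[i] += (out[i - 1] if i > 0 else 0.0) - out[i]
            m[i] -= lam * m[i] * dt                   # radioactive decay
        released += out[-1]
        t += dt
    return released, m

released, masses = simulate()
print(f"cumulative release: {released:.3f}, remaining per compartment: "
      + ", ".join(f"{mi:.3f}" for mi in masses))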

  15. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jin; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  16. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2004-01-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures

  17. The aminoacyl-tRNA synthetases had only a marginal role in the origin of the organization of the genetic code: Evidence in favor of the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2017-11-07

The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules, on which, this theory postulates, the biosynthetic transformations between amino acids occurred. This mechanism makes a prediction about the role that the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules because amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. In spite of the fact that ARSs make the genetic code responsible for the first interaction between a component of nucleic acids and that of proteins, for the coevolution theory the role of ARSs should have been entirely marginal in the origin of the genetic code. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs and of their subclasses in the genetic code table, in order to perform a falsification test of the coevolution theory. Indeed, in the case in which the distribution of ARSs within the genetic code were highly significant, the coevolution theory would be falsified, since the mechanism on which it is based would not predict a fundamental role of ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal

  18. System Design Description for the TMAD Code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System

  19. Environmental remediation of high-level nuclear waste in geological repository. Modified computer code creates ultimate benchmark in natural systems

    International Nuclear Information System (INIS)

    Peter, Geoffrey J.

    2011-01-01

Isolation of high-level nuclear waste in permanent geological repositories has been a major concern for over 30 years, due to the possible migration of dissolved radionuclides to the water table (within the 10,000-year compliance period) as water moves through the repository and the surrounding area. Repository assessments are based on mathematical models that allow for long-term geological phenomena and involve many approximations; however, experimental verification of long-term processes is impossible. Countries must determine if geological disposal is adequate for permanent storage. Many countries have extensively studied different aspects of safely confining the highly radioactive waste in an underground repository, based on the unique geological composition at their selected repository location. This paper discusses two computer codes developed by various countries to study the coupled thermal, mechanical, and chemical processes in these environments, and the migration of radionuclides. Further, this paper presents the results of a case study of the Magma-hydrothermal (MH) computer code, modified by the author and applied to nuclear waste repository analysis. The MH code is verified by simulating natural systems, thus creating the ultimate benchmark. This approach is based on processes currently occurring in natural systems that are similar to those expected near waste repositories. (author)

  20. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code gives much better performance than the Hadamard and MFH codes.

  1. Development of Attendance Database System Using Bar-coded Student Card

    Directory of Open Access Journals (Sweden)

    Abdul Fadlil

    2008-04-01

Full Text Available The calculation of the level of attendance is very important, because one indicator of a person's credibility can be seen from their level of attendance. For example, at a university, data about a student's level of attendance in a lecture is very important as one of the components of the assessment. A manual attendance system is considered less effective. This research presents the design of an attendance system using bar codes (barcodes) as the input data representing attendance. The attendance system is supported by three main components: the barcode found on the student card (KTM), a CCD barcode scanner of the CD-108E series, and a computer. Managing the attendance list with this system allows the functions of the KTM to be optimized. The attendance system has been tested with several KTM cards at a variety of distances and positions between the barcode and the barcode scanner. The test results show that the ideal position for reading a barcode is with the scanner 2 cm from the object at an angle of 90 degrees; at this position the accuracy reaches 100%.
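
    The core of such a system can be sketched in a few lines of code: each scanned barcode (the student number encoded on the KTM) is stored with a timestamp, and the attendance level is obtained by counting records; the table and column names below are invented for illustration:

import sqlite3
from datetime import datetime

conn = sqlite3.connect("attendance.db")
conn.execute("""CREATE TABLE IF NOT EXISTS attendance (
                    student_id TEXT,
                    lecture_id TEXT,
                    scanned_at TEXT)""")

def record_scan(student_id, lecture_id):
    """Called with the string delivered by the barcode scanner (keyboard wedge)."""
    conn.execute("INSERT INTO attendance VALUES (?, ?, ?)",
                 (student_id, lecture_id, datetime.now().isoformat()))
    conn.commit()

def attendance_level(student_id, lecture_id, total_meetings):
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM attendance WHERE student_id=? AND lecture_id=?",
        (student_id, lecture_id)).fetchone()
    return count / total_meetings

record_scan("2008041234", "PHYS101")
print(f"attendance: {attendance_level('2008041234', 'PHYS101', total_meetings=14):.0%}")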

  2. Architectural and Algorithmic Requirements for a Next-Generation System Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    V.A. Mousseau

    2010-05-01

    This document presents high-level architectural and system requirements for a next-generation system analysis code (NGSAC) to support reactor safety decision-making by plant operators and others, especially in the context of light water reactor plant life extension. The capabilities of NGSAC will be different from those of current-generation codes, not only because computers have evolved significantly in the generations since the current paradigm was first implemented, but because the decision-making processes that need the support of next-generation codes are very different from the decision-making processes that drove the licensing and design of the current fleet of commercial nuclear power reactors. The implications of these newer decision-making processes for NGSAC requirements are discussed, and resulting top-level goals for the NGSAC are formulated. From these goals, the general architectural and system requirements for the NGSAC are derived.

  3. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely the finite field Fourier transform and the Euclidean algorithm f...

  4. A new coding system for metabolic disorders demonstrates gaps in the international disease classifications ICD-10 and SNOMED-CT, which can be barriers to genotype-phenotype data sharing.

    Science.gov (United States)

    Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke

    2013-07-01

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.

  5. The help of simulation codes in designing waste assay systems using neutron measurement methods: Application to the alpha low level waste assay system PROMETHEE 6

    Energy Technology Data Exchange (ETDEWEB)

    Mariani, A.; Passard, C.; Jallu, F. E-mail: fanny.jallu@cea.fr; Toubon, H

    2003-11-01

    The design of a specific nuclear assay system for a dedicated application begins with a development phase, which relies on information from the literature or on knowledge resulting from experience, and on specific experimental verifications. The latter may require experimental devices which can be restrictive in terms of schedule, cost and safety. One way generally chosen to bypass these difficulties is to use simulation codes to study particular aspects. This paper deals with the possibilities offered by simulation in the case of a passive-active neutron (PAN) assay system for alpha low level waste characterization; this system was developed at the Nuclear Measurements Development Laboratory of the French Atomic Energy Commission. Due to the high number of parameters to be taken into account for its development, this is a particularly sophisticated example. Since the PAN assay system, called PROMETHEE (prompt epithermal and thermal interrogation experiment), must have a detection efficiency of more than 20% and preserve a high level of modularity for various applications, an improved version has been studied using the MCNP4 (Monte Carlo N-Particle) transport code. Parameters such as the dimensions of the assay system, of the cavity and of the detection blocks, and the thicknesses of the nuclear materials of neutronic interest have been optimised. Therefore, the number of necessary experiments was reduced.

  6. The help of simulation codes in designing waste assay systems using neutron measurement methods: Application to the alpha low level waste assay system PROMETHEE 6

    International Nuclear Information System (INIS)

    Mariani, A.; Passard, C.; Jallu, F.; Toubon, H.

    2003-01-01

    The design of a specific nuclear assay system for a dedicated application begins with a development phase, which relies on information from the literature or on knowledge resulting from experience, and on specific experimental verifications. The latter may require experimental devices which can be restrictive in terms of schedule, cost and safety. One way generally chosen to bypass these difficulties is to use simulation codes to study particular aspects. This paper deals with the possibilities offered by simulation in the case of a passive-active neutron (PAN) assay system for alpha low level waste characterization; this system was developed at the Nuclear Measurements Development Laboratory of the French Atomic Energy Commission. Due to the high number of parameters to be taken into account for its development, this is a particularly sophisticated example. Since the PAN assay system, called PROMETHEE (prompt epithermal and thermal interrogation experiment), must have a detection efficiency of more than 20% and preserve a high level of modularity for various applications, an improved version has been studied using the MCNP4 (Monte Carlo N-Particle) transport code. Parameters such as the dimensions of the assay system, of the cavity and of the detection blocks, and the thicknesses of the nuclear materials of neutronic interest have been optimised. Therefore, the number of necessary experiments was reduced.

  7. ARC Code TI: Optimal Alarm System Design and Implementation

    Data.gov (United States)

    National Aeronautics and Space Administration — An optimal alarm system can robustly predict a level-crossing event that is specified over a fixed prediction horizon. The code contained in this package provides...

  8. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', which was initially proposed by the authors with the intention of giving the nuclear industry a leap of progress in system reliability, performance improvement, and cost reduction. The System Based Code is intended to provide a theoretical procedure for optimizing the reliability of the system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  9. Computer codes for level 1 probabilistic safety assessment

    International Nuclear Information System (INIS)

    1990-06-01

    Probabilistic Safety Assessment (PSA) entails several laborious tasks suitable for computer code assistance. This guide identifies these tasks, presents guidelines for selecting and utilizing computer codes in the conduct of the PSA tasks and for the use of PSA results in safety management, and provides information on available codes suggested or applied in performing PSA in nuclear power plants. The guidance is intended for use by nuclear power plant system engineers, safety and operating personnel, and regulators. Large efforts are made today to provide PC-based software systems and PSA-processed information in a way that enables their use as a safety management tool by the overall nuclear power plant management. Guidelines on the characteristics of software needed by management to prepare software that meets their specific needs are also provided. Most of these computer codes are also applicable to PSA of other industrial facilities. The scope of this document is limited to computer codes used for the treatment of internal events. It does not address other codes available mainly for the analysis of external events (e.g. seismic analysis), flood and fire analysis. Codes discussed in the document are those used for probabilistic rather than phenomenological modelling. It should also be appreciated that these guidelines are not intended to lead the user to the selection of one specific code; they simply provide criteria for the selection. Refs and tabs

  10. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
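
    The record above only outlines the idea of generating controller code from a tabular specification; the LCS DSL, its keywords, and its tabular format are not given here. The sketch below is therefore purely hypothetical: it turns invented (step, condition, action) rows into IEC 61131-3 style Structured Text rather than graphical ladder logic, simply to illustrate the table-to-code translation step.

        # Hypothetical rows of a tabular specification: (step name, condition, action).
        # The real LCS DSL and tabular format are not described in the abstract; this
        # only illustrates generating controller code from a declarative table.
        SPEC = [
            ("OpenVentValve",  "TankPressure > 150.0", "VentValveCmd := TRUE;"),
            ("CloseVentValve", "TankPressure < 120.0", "VentValveCmd := FALSE;"),
            ("RaiseAlarm",     "TankPressure > 180.0", "HighPressAlarm := TRUE;"),
        ]

        def generate_structured_text(spec):
            """Emit an IEC 61131-3 style Structured Text body from the tabular spec."""
            lines = ["(* Auto-generated -- do not edit by hand *)"]
            for name, condition, action in spec:
                lines.append(f"(* {name} *)")
                lines.append(f"IF {condition} THEN")
                lines.append(f"    {action}")
                lines.append("END_IF;")
            return "\n".join(lines)

        print(generate_structured_text(SPEC))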

  11. Dynamics of genetic variation at gliadin-coding loci in bread wheat cultivars developed in the Small Grains Research Center (Kragujevac) during the last 35 years

    Directory of Open Access Journals (Sweden)

    Novosljska-Dragovič Aleksandra

    2005-01-01

    Multiple alleles of gliadin-coding loci are well-known genetic markers of common wheat genotypes. Based on an analysis of gliadin patterns in common wheat cultivars developed at the Small Grains Research Center in Kragujevac, the dynamics of genetic variability at gliadin-coding loci has been surveyed over a period of 35 years. It was shown that long-term breeding of the wheat cultivars involved gradual replacement of ancient alleles by alleles widely spread in some regions of the world, which belong to well-known donor cultivars of some important traits. Developing cultivars whose pedigrees involved much new foreign genetic material has increased genetic diversity and has changed the frequency of alleles at gliadin-coding loci. We can therefore conclude that the genetic profile of modern Serbian cultivars has changed considerably. A gliadin genetic formula was constructed for each cultivar studied. The alleles of gliadin-coding loci most frequent among modern cultivars should be of great interest to breeders, because these alleles are probably linked with genes that confer an advantage to their carriers at present.

  12. Foundations of genetic algorithms 1991

    CERN Document Server

    1991-01-01

    Foundations of Genetic Algorithms 1991 (FOGA 1) discusses the theoretical foundations of genetic algorithms (GA) and classifier systems. This book compiles research papers on selection and convergence, coding and representation, problem hardness, deception, classifier system design, variation and recombination, parallelization, and population divergence. Other topics include the non-uniform Walsh-schema transform; spurious correlations and premature convergence in genetic algorithms; and variable default hierarchy separation in a classifier system. The grammar-based genetic algorithm; condition

  13. Next generation Zero-Code control system UI

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Developing ergonomic user interfaces for control systems is challenging, especially during machine upgrade and commissioning, where several small changes may suddenly be required. Zero-code systems, such as *Inspector*, provide agile features for creating and maintaining control system interfaces. Moreover, these next-generation Zero-code systems bring simplicity and uniformity and break the boundaries between users and developers. In this talk we present *Inspector*, a CERN-made Zero-code application development system, and we introduce the major differences and advantages of using Zero-code control systems to develop operational UIs.

  14. A mean field theory of coded CDMA systems

    International Nuclear Information System (INIS)

    Yano, Toru; Tanaka, Toshiyuki; Saad, David

    2008-01-01

    We present a mean field theory of code-division multiple-access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean-field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems

  15. A mean field theory of coded CDMA systems

    Energy Technology Data Exchange (ETDEWEB)

    Yano, Toru [Graduate School of Science and Technology, Keio University, Hiyoshi, Kohoku-ku, Yokohama-shi, Kanagawa 223-8522 (Japan); Tanaka, Toshiyuki [Graduate School of Informatics, Kyoto University, Yoshida Hon-machi, Sakyo-ku, Kyoto-shi, Kyoto 606-8501 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)], E-mail: yano@thx.appi.keio.ac.jp

    2008-08-15

    We present a mean field theory of code-division multiple-access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean-field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.

  16. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies

  17. Basic concept of common reactor physics code systems. Final report of working party on common reactor physics code systems (CCS)

    International Nuclear Information System (INIS)

    2004-03-01

    A working party on common reactor physics code systems was organized for two years (2001-2002) under the Research Committee on Reactor Physics of JAERI. This final report is a compilation of the activity of the working party on common reactor physics code systems during those two years. The objective of the working party is to clarify the basic concept of common reactor physics code systems in order to improve the convenience of reactor physics code systems for reactor physics researchers in Japan in their various fields of research and development activities. We held four meetings during the two years, investigated the status of reactor physics code systems and innovative software technologies, and discussed the basic concept of common reactor physics code systems. (author)

  18. The octopus burnup and criticality code system

    Energy Technology Data Exchange (ETDEWEB)

    Kloosterman, J.L.; Kuijper, J.C. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Leege, P.F.A. de

    1996-09-01

    The OCTOPUS burnup and criticality code system is described. This system links the spectrum codes from the SCALE4.1, WIMS7 and MCNP4A packages to the ORIGEN-S and FISPACT4.2 fuel depletion and activation codes, which enables us to perform very accurate burnup calculations in complicated three-dimensional geometries. The data used by all codes are consistently based on the JEF2.2 evaluated nuclear data file. Some special features of OCTOPUS not available in other codes are described, as well as the validation of the system. (author)

  19. The OCTOPUS burnup and criticality code system

    Energy Technology Data Exchange (ETDEWEB)

    Kloosterman, J.L. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Kuijper, J.C. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Leege, P.F.A. de [Technische Univ. Delft (Netherlands). Interfacultair Reactor Inst.

    1996-06-01

    The OCTOPUS burnup and criticality code system is described. This system links the spectrum codes from the SCALE4.1, WIMS7 and MCNP4A packages to the ORIGEN-S and FISPACT4.2 fuel depletion and activation codes, which enables us to perform very accurate burnup calculations in complicated three-dimensional geometries. The data used by all codes are consistently based on the JEF2.2 evaluated nuclear data file. Some special features of OCTOPUS not available in other codes are described, as well as the validation of the system. (orig.).

  20. The octopus burnup and criticality code system

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Kuijper, J.C.; Leege, P.F.A. de.

    1996-01-01

    The OCTOPUS burnup and criticality code system is described. This system links the spectrum codes from the SCALE4.1, WIMS7 and MCNP4A packages to the ORIGEN-S and FISPACT4.2 fuel depletion and activation codes, which enables us to perform very accurate burnup calculations in complicated three-dimensional geometries. The data used by all codes are consistently based on the JEF2.2 evaluated nuclear data file. Some special features of OCTOPUS not available in other codes are described, as well as the validation of the system. (author)

  1. The OCTOPUS burnup and criticality code system

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Kuijper, J.C.; Leege, P.F.A. de

    1996-06-01

    The OCTOPUS burnup and criticality code system is described. This system links the spectrum codes from the SCALE4.1, WIMS7 and MCNP4A packages to the ORIGEN-S and FISPACT4.2 fuel depletion and activation codes, which enables us to perform very accurate burnup calculations in complicated three-dimensional geometries. The data used by all codes are consistently based on the JEF2.2 evaluated nuclear data file. Some special features of OCTOPUS not available in other codes are described, as well as the validation of the system. (orig.)

  2. Research on coding and decoding method for digital levels

    Energy Technology Data Exchange (ETDEWEB)

    Tu Lifen; Zhong Sidong

    2011-01-20

    A new coding and decoding method for digital levels is proposed. It is based on an area-array CCD sensor and adopts mixed coding technology. By taking advantage of redundant information in a digital image signal, the contradiction that the field of view and image resolution restrict each other in a digital level measurement is overcome, and the geodetic leveling becomes easier. The experimental results demonstrate that the uncertainty of measurement is 1 mm when the measuring range is between 2 m and 100 m, which can meet practical needs.

  3. Research on coding and decoding method for digital levels.

    Science.gov (United States)

    Tu, Li-fen; Zhong, Si-dong

    2011-01-20

    A new coding and decoding method for digital levels is proposed. It is based on an area-array CCD sensor and adopts mixed coding technology. By taking advantage of redundant information in a digital image signal, the contradiction that the field of view and image resolution restrict each other in a digital level measurement is overcome, and the geodetic leveling becomes easier. The experimental results demonstrate that the uncertainty of measurement is 1 mm when the measuring range is between 2 m and 100 m, which can meet practical needs.

  4. High dynamic range coding imaging system

    Science.gov (United States)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on a coded aperture technique. This scheme can help us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. We then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use some existing algorithms to fuse those LDR images into an HDR image for display. We build an optical simulation model and obtain simulation images to verify the novel system.
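
    The abstract describes fusing LDR captures taken at different exposures into an HDR image. The sketch below shows only that fusion step under simple assumptions (a linear sensor response and a hat-shaped pixel weight); the coded-aperture design and reconstruction stages of the paper are not reproduced, and the scene values are synthetic.

        import numpy as np

        # Simplified multi-exposure fusion: estimate an HDR radiance map from
        # clipped 8-bit LDR frames taken at known exposure times, assuming a
        # linear sensor response. Scene values below are invented for the demo.
        exposure_times = np.array([1 / 200, 1 / 50, 1 / 12.5])
        radiance = np.array([[0.5, 8.0], [40.0, 180.0]])          # ground-truth scene

        ldr = [np.clip(radiance * t * 255.0, 0, 255) for t in exposure_times]

        def weight(pixel):
            """Hat weight: trust mid-range pixels, distrust under/over-exposed ones."""
            return 1.0 - np.abs(pixel / 255.0 - 0.5) * 2.0

        num = sum(weight(img) * img / t for img, t in zip(ldr, exposure_times))
        den = sum(weight(img) for img in ldr) + 1e-9
        hdr = num / (den * 255.0)
        print(np.round(hdr, 1))   # recovered radiance, up to clipping effects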

  5. Psacoin level S intercomparison: An International code intercomparison exercise on a hypothetical safety assessment case study for radioactive waste disposal systems

    International Nuclear Information System (INIS)

    1993-06-01

    This report documents the Level S exercise of the Probabilistic System Assessment Group (PSAG). Level S is the fifth in a series of Probabilistic Code Intercomparison (PSACOIN) exercises designed to contribute to the verification of probabilistic codes and methodologies that may be used in assessing the safety of radioactive waste disposal systems and concepts. The focus of the Level S exercise is sensitivity analysis. Given a common data set of model output and input values, the participants were asked to identify both the underlying model's most important parameters (deterministic sensitivity analysis) and the link between the distributions of the input and output values (distribution sensitivity analysis). Agreement was generally found where it was expected, and the exercise achieved its objectives in acting as a focus for testing and discussing sensitivity analysis issues. Among the outstanding issues identified are: (i) techniques for distribution sensitivity analysis are needed that avoid the problem of statistical noise; (ii) further investigations are warranted on the most appropriate way of handling large numbers of effectively zero results generated by Monte Carlo sampling; and (iii) methods need to be developed for demonstrating that the results of sensitivity analysis are indeed correct.
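
    One common way to perform the deterministic sensitivity analysis mentioned above is to rank-correlate each sampled input with the model output. The sketch below does this with the Spearman correlation on a toy Monte Carlo sample; it is a generic illustration, not one of the Level S participants' methods.

        import numpy as np

        def rank(a):
            """Return the ranks of the entries of a (1 = smallest)."""
            order = np.argsort(a)
            r = np.empty_like(order)
            r[order] = np.arange(1, len(a) + 1)
            return r

        def spearman(x, y):
            """Spearman rank correlation, a common deterministic sensitivity measure."""
            rx, ry = rank(x), rank(y)
            return np.corrcoef(rx, ry)[0, 1]

        # Toy Monte Carlo sample: the output depends strongly on x1, weakly on x2.
        rng = np.random.default_rng(0)
        x1 = rng.lognormal(0.0, 1.0, 1000)
        x2 = rng.uniform(0.0, 1.0, 1000)
        output = x1 ** 2 + 0.1 * x2

        for name, x in [("x1", x1), ("x2", x2)]:
            print(name, round(spearman(x, output), 3))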

  6. Adaptive Wavelet Coding Applied in a Wireless Control System.

    Science.gov (United States)

    Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O

    2017-12-13

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by the multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications for its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose parameters of code rate and signal constellation can vary according to the fading level and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by insertion of the adaptive wavelet coding in a control loop with nodes interconnected by wireless link. These results enable the use of this technique in a wireless link control loop.
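
    The adaptation described above selects a code rate and signal constellation according to the estimated fading level. A minimal sketch of such a mode-selection rule is given below; the SNR thresholds and the mode set are invented for illustration and are not the values used in the paper.

        # Hypothetical adaptation table: thresholds on the estimated channel SNR (dB)
        # mapping to a coding rate and signal constellation.
        MODES = [
            (20.0, {"code_rate": 1.0,  "constellation": "16-QAM"}),
            (12.0, {"code_rate": 0.5,  "constellation": "QPSK"}),
            (0.0,  {"code_rate": 0.25, "constellation": "BPSK"}),
        ]

        def select_mode(snr_db):
            """Pick the most spectrally efficient mode whose SNR threshold is met."""
            for threshold, mode in MODES:
                if snr_db >= threshold:
                    return mode
            return MODES[-1][1]   # deep fade: fall back to the most robust mode

        for snr in (25.0, 15.0, 4.0):
            print(snr, select_mode(snr))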

  7. Adaptive Wavelet Coding Applied in a Wireless Control System

    Directory of Open Access Journals (Sweden)

    Felipe O. S. Gama

    2017-12-01

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by the multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications for its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose parameters of code rate and signal constellation can vary according to the fading level and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by insertion of the adaptive wavelet coding in a control loop with nodes interconnected by wireless link. These results enable the use of this technique in a wireless link control loop.

  8. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
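
    For readers unfamiliar with the inner stage of such a concatenated system, the sketch below implements a textbook rate-1/2 convolutional encoder. The paper's byte-oriented unit-memory code and the Reed-Solomon outer code are not reproduced; this only illustrates the convolutional encoding step that the outer code would wrap.

        # Minimal rate-1/2 convolutional encoder (constraint length 3, generator
        # polynomials 7 and 5 in octal). This textbook example differs from the
        # byte-oriented unit-memory code used in the paper.
        G1, G2 = 0b111, 0b101   # generator polynomials

        def conv_encode(bits):
            state = 0
            out = []
            for b in bits:
                state = ((state << 1) | b) & 0b111
                out.append(bin(state & G1).count("1") % 2)   # parity of taps for G1
                out.append(bin(state & G2).count("1") % 2)   # parity of taps for G2
            return out

        print(conv_encode([1, 0, 1, 1, 0, 0]))   # two output bits per input bit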

  9. Status of reactor core design code system in COSINE code package

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.; Yu, H.; Liu, Z., E-mail: yuhui@snptc.com.cn [State Nuclear Power Software Development Center, SNPTC, National Energy Key Laboratory of Nuclear Power Software (NEKLS), Beijiing (China)

    2014-07-01

    For self-reliance, the COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, the recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. Preliminary verification results are illustrated. Some special efforts, such as updated theory models and direct data access application, have also been made to achieve a better software product. (author)

  10. Status of reactor core design code system in COSINE code package

    International Nuclear Information System (INIS)

    Chen, Y.; Yu, H.; Liu, Z.

    2014-01-01

    For self-reliance, the COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, the recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. Preliminary verification results are illustrated. Some special efforts, such as updated theory models and direct data access application, have also been made to achieve a better software product. (author)

  11. Partial Safety Factors and Target Reliability Level in Danish Structural Codes

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hansen, J. O.; Nielsen, T. A.

    2001-01-01

    The partial safety factors in the newly revised Danish structural codes have been derived using a reliability-based calibration. The calibrated partial safety factors result in the same average reliability level as in the previous codes, but a much more uniform reliability level has been obtained. The paper describes the code format, the stochastic models and the resulting optimised partial safety factors.

  12. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  13. Unassigned Codons, Nonsense Suppression, and Anticodon Modifications in the Evolution of the Genetic Code

    NARCIS (Netherlands)

    P.T.S. van der Gulik (Peter); W.D. Hoff (Wouter)

    2011-01-01

    The origin of the genetic code is a central open problem regarding the early evolution of life. Here, we consider two undeveloped but important aspects of possible scenarios for the evolutionary pathway of the translation machinery: the role of unassigned codons in early stages

  14. Power Optimization of Wireless Media Systems With Space-Time Block Codes

    OpenAIRE

    Yousefi'zadeh, Homayoun; Jafarkhani, Hamid; Moshfeghi, Mehran

    2004-01-01

    We present analytical and numerical solutions to the problem of power control in wireless media systems with multiple antennas. We formulate a set of optimization problems aimed at minimizing the total power consumption of wireless media systems subject to a given level of QoS and an available bit rate. Our formulation takes into consideration the power consumption related to source coding, channel coding, and transmission from multiple transmit antennas. In our study, we consider Gauss-Markov and...

  15. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  16. Research on the improvement of nuclear safety -Development of computing code system for level 3 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Tae; Kim, Dong Ha; Park, Won Seok; Hwang, Mi Jeong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated. These results will give physical insight for the development of a new dispersion model. A wind tunnel experiment with a bell-shaped hill model was made in order to develop a new dispersion model, and an improved dispersion model was developed based on the concentration distribution data obtained from the wind tunnel experiment. This model will be added as an option to the atmospheric dispersion code. A stand-alone atmospheric code was developed using the MS Visual Basic programming language, running in the Windows environment on a PC. A user can easily select a necessary data file and type input data by clicking menus, and can select calculation options such as building wake, plume rise, etc., if necessary. A user can also easily understand the meaning of the concentration distribution on the map around the plant site as well as the output files. In addition, the methodologies for the estimation of radiation exposure and for the calculation of risks were established. These methodologies will be used for the development of modules for radiation exposure and risks, respectively. These modules will be developed independently and will finally be combined with the atmospheric dispersion code in order to develop a level 3 PSA code. 30 tabs., 56 figs., refs. (Author).

  17. Research on the improvement of nuclear safety -Development of computing code system for level 3 PSA

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Kim, Dong Ha; Park, Won Seok; Hwang, Mi Jeong

    1995-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated. These results will give physical insight for the development of a new dispersion model. A wind tunnel experiment with a bell-shaped hill model was made in order to develop a new dispersion model, and an improved dispersion model was developed based on the concentration distribution data obtained from the wind tunnel experiment. This model will be added as an option to the atmospheric dispersion code. A stand-alone atmospheric code was developed using the MS Visual Basic programming language, running in the Windows environment on a PC. A user can easily select a necessary data file and type input data by clicking menus, and can select calculation options such as building wake, plume rise, etc., if necessary. A user can also easily understand the meaning of the concentration distribution on the map around the plant site as well as the output files. In addition, the methodologies for the estimation of radiation exposure and for the calculation of risks were established. These methodologies will be used for the development of modules for radiation exposure and risks, respectively. These modules will be developed independently and will finally be combined with the atmospheric dispersion code in order to develop a level 3 PSA code. 30 tabs., 56 figs., refs. (Author)

  18. Variable weight Khazani-Syed code using hybrid fixed-dynamic technique for optical code division multiple access system

    Science.gov (United States)

    Anas, Siti Barirah Ahmad; Seyedzadeh, Saleh; Mokhtar, Makhfudzah; Sahbudin, Ratna Kalos Zakiah

    2016-10-01

    Future Internet consists of a wide spectrum of applications with different bit rates and quality of service (QoS) requirements. Prioritizing the services is essential to ensure that the delivery of information is at its best. Existing technologies have demonstrated how service differentiation techniques can be implemented in optical networks using data link and network layer operations. However, a physical layer approach can further improve system performance at a prescribed received signal quality by applying control at the bit level. This paper proposes a coding algorithm to support optical domain service differentiation using spectral amplitude coding techniques within an optical code division multiple access (OCDMA) scenario. A particular user or service has a varying weight applied to obtain the desired signal quality. The properties of the new code are compared with other OCDMA codes proposed for service differentiation. In addition, a mathematical model is developed for performance evaluation of the proposed code using two different detection techniques, namely direct decoding and complementary subtraction.

  19. A novel nuclear genetic code alteration in yeasts and the evolution of codon reassignment in eukaryotes.

    Science.gov (United States)

    Mühlhausen, Stefanie; Findeisen, Peggy; Plessmann, Uwe; Urlaub, Henning; Kollmar, Martin

    2016-07-01

    The genetic code is the cellular translation table for the conversion of nucleotide sequences into amino acid sequences. Changes to the meaning of sense codons would introduce errors into almost every translated message and are expected to be highly detrimental. However, reassignment of single or multiple codons in mitochondria and nuclear genomes, although extremely rare, demonstrates that the code can evolve. Several models for the mechanism of alteration of nuclear genetic codes have been proposed (including "codon capture," "genome streamlining," and "ambiguous intermediate" theories), but with little resolution. Here, we report a novel sense codon reassignment in Pachysolen tannophilus, a yeast related to the Pichiaceae. By generating proteomics data and using tRNA sequence comparisons, we show that Pachysolen translates CUG codons as alanine and not as the more usual leucine. The Pachysolen tRNA(CAG) is an anticodon-mutated tRNA(Ala) containing all major alanine tRNA recognition sites. The polyphyly of the CUG-decoding tRNAs in yeasts is best explained by a tRNA loss driven codon reassignment mechanism. Loss of the CUG-tRNA in the ancient yeast is followed by gradual decrease of respective codons and subsequent codon capture by tRNAs whose anticodon is not part of the aminoacyl-tRNA synthetase recognition region. Our hypothesis applies to all nuclear genetic code alterations and provides several testable predictions. We anticipate more codon reassignments to be uncovered in existing and upcoming genome projects. © 2016 Mühlhausen et al.; Published by Cold Spring Harbor Laboratory Press.
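
    The effect of the reported reassignment can be illustrated by translating the same reading frame under two codon tables that differ only at CUG (leucine in the standard code, alanine in the Pachysolen-like table). The toy table below covers only the codons appearing in the example sequence.

        # Illustration of the reported reassignment: CUG read as alanine rather
        # than the standard leucine. The tiny codon table is for this demo only.
        STANDARD = {"AUG": "M", "CUG": "L", "GCU": "A", "UUU": "F", "UAA": "*"}
        PACHYSOLEN = dict(STANDARD, CUG="A")   # single sense-codon reassignment

        def translate(rna, table):
            protein = []
            for i in range(0, len(rna) - 2, 3):
                aa = table[rna[i:i + 3]]
                if aa == "*":        # stop codon ends translation
                    break
                protein.append(aa)
            return "".join(protein)

        orf = "AUGCUGUUUGCUUAA"
        print(translate(orf, STANDARD))     # MLFA  (CUG read as leucine)
        print(translate(orf, PACHYSOLEN))   # MAFA  (CUG read as alanine)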

  20. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system

  1. Genetic Variation in the Natriuretic Peptide System, Circulating Natriuretic Peptide Levels, and Blood Pressure

    DEFF Research Database (Denmark)

    Jeppesen, Jørgen L; Nielsen, Søren J; Torp-Pedersen, Christian

    2012-01-01

    … 24-h ambulatory BP measurements (ABPMs) will influence the effect of NP gene variations on BP levels. Methods: We used rs632793 at the NPPB (NP precursor B) locus to investigate the relationship between genetically determined serum N-terminal pro-brain NP (NT-proBNP) concentrations and BP levels … determined by both 24-h ABPMs and OBPMs in a population consisting of 1,397 generally healthy individuals taking no BP-lowering drugs. Results: rs632793 was significantly correlated with serum NT-proBNP levels (r = 0.10, P = 0.0003), and participants with the A:A genotype had lower serum NT-proBNP levels than … Office BP decreased across the genotypes from A:A to G:G, but the differences did not reach statistical significance (P = 0.12). Conclusions: This study suggests that 24-h ABPM is a better method than OBPM to detect significant differences in BP levels related to genetic variance and provides further …

  2. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); fracture mechanics (energy release rate G, energy release rate in thermo-elasto-plasticity, local 3D energy release rate, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  3. Variable-length code construction for incoherent optical CDMA systems

    Science.gov (United States)

    Lin, Jen-Yung; Jhou, Jhih-Syue; Wen, Jyh-Horng

    2007-04-01

    The purpose of this study is to investigate multirate transmission in fiber-optic code-division multiple-access (CDMA) networks. In this article, we propose a variable-length code construction for any existing optical orthogonal code to implement a multirate optical CDMA system (called the multirate code system). For comparison, a multirate system in which the lower-rate user sends each symbol twice is implemented and called the repeat code system. The repetition as an error-detection code in an ARQ scheme in the repeat code system is also investigated. Moreover, a parallel approach for optical CDMA systems, which was proposed by Marić et al., is also compared with the other systems proposed in this study. Theoretical analysis shows that the bit error probability of the proposed multirate code system is smaller than that of the other systems, especially when the number of lower-rate users is large. Moreover, if there is at least one lower-rate user in the system, the multirate code system accommodates more users than the other systems when the error probability of the system is set below 10^-9.

  4. Selection of a computer code for Hanford low-level waste engineered-system performance assessment. Revision 1

    International Nuclear Information System (INIS)

    McGrail, B.P.; Bacon, D.H.

    1998-02-01

    Planned performance assessments for the proposed disposal of low-activity waste (LAW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. The available computer codes with suitable capabilities at the time Revision 0 of this document was prepared were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LAW glass corrosion and the mobility of radionuclides. This analysis was repeated in this report but updated to include additional processes that have been found to be important since Revision 0 was issued and to include additional codes that have been released. The highest ranked computer code was found to be the STORM code developed at PNNL for the US Department of Energy for evaluation of arid land disposal sites

  5. A High-Level Petri Net Framework for Genetic Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Banks Richard

    2007-12-01

    To understand the function of genetic regulatory networks in the development of cellular systems, we must not only realise the individual network entities, but also the manner by which they interact. Multi-valued networks are a promising qualitative approach for modelling such genetic regulatory networks; however, at present they have limited formal analysis techniques and tools. We present a flexible formal framework for modelling and analysing multi-valued genetic regulatory networks using high-level Petri nets and logic minimization techniques. We demonstrate our approach with a detailed case study in which part of the genetic regulatory network responsible for the carbon starvation stress response in Escherichia coli is modelled and analysed. We then compare and contrast this multi-valued model to a corresponding Boolean model and consider their formal relationship.
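
    A multi-valued qualitative model of the kind described above assigns each network entity a small number of discrete activity levels and updates them with logical rules. The sketch below is not the authors' high-level Petri net encoding, and the entities and rules are invented; it only shows what a multi-valued update looks like.

        # Minimal multi-valued (qualitative) network sketch: each entity takes a
        # small number of discrete levels and is updated by a logical rule.
        # Entity names and rules are invented for illustration.
        def next_state(state):
            signal, regulator = state["signal"], state["regulator"]
            return {
                # the external signal is held constant here
                "signal": signal,
                # regulator is fully activated (level 2) only under a strong signal
                "regulator": 2 if signal == 2 else (1 if signal == 1 else 0),
                # target needs at least an intermediate level of the regulator
                "target": 1 if regulator >= 1 else 0,
            }

        state = {"signal": 2, "regulator": 0, "target": 0}
        for step in range(4):
            print(step, state)
            state = next_state(state)   # reaches a fixed point after two steps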

  6. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    International Nuclear Information System (INIS)

    Mori, Takamasa; Nakagawa, Masayuki; Kaneko, Kunio.

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author)

  7. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Takamasa; Nakagawa, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author).

  8. Lossless, Near-Lossless, and Refinement Coding of Bi-level Images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren Otto

    1999-01-01

    We present general and unified algorithms for lossy/lossless coding of bi-level images. The compression is realized by applying arithmetic coding to conditional probabilities. As in the current JBIG standard, the conditioning may be specified by a template. For better compression, the more general … to the specialized soft pattern matching techniques which work better for text. Template-based refinement coding is applied for lossy-to-lossless refinement. Introducing only a small amount of loss in halftoned test images, compression is increased by up to a factor of four compared with JBIG. Lossy, lossless, and refinement decoding speed and lossless encoding speed are less than a factor of two slower than JBIG. The (de)coding method is proposed as part of JBIG2, an emerging international standard for lossless/lossy compression of bi-level images.
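
    The template conditioning mentioned above can be sketched as follows: previously coded neighbours of each pixel form a context index, and an adaptive per-context probability estimate would drive the arithmetic coder (omitted here). A 3-pixel template is used for brevity; the JBIG templates are larger, and this is not the authors' implementation.

        from collections import defaultdict

        TEMPLATE = [(-1, -1), (-1, 0), (0, -1)]   # offsets: above-left, above, left

        def context(image, r, c):
            """Pack the template pixels (0 outside the image) into a context index."""
            bits = 0
            for dr, dc in TEMPLATE:
                rr, cc = r + dr, c + dc
                bit = image[rr][cc] if rr >= 0 and cc >= 0 else 0
                bits = (bits << 1) | bit
            return bits

        counts = defaultdict(lambda: [1, 1])   # Laplace-smoothed 0/1 counts per context
        image = [[0, 0, 1, 1],
                 [0, 1, 1, 1],
                 [0, 1, 1, 0]]

        for r, row in enumerate(image):
            for c, pixel in enumerate(row):
                ctx = context(image, r, c)
                p1 = counts[ctx][1] / sum(counts[ctx])   # estimated P(pixel = 1 | context)
                counts[ctx][pixel] += 1                  # adapt after (de)coding the pixel
                print(f"pixel({r},{c})={pixel} context={ctx:03b} P(1)={p1:.2f}")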

  9. Concatenated coding system with iterated sequential inner decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis; Paaske, Erik

    1995-01-01

    We describe a concatenated coding system with iterated sequential inner decoding. The system uses convolutional codes of very long constraint length and operates on iterations between an inner Fano decoder and an outer Reed-Solomon decoder.

  10. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  11. Research on the improvement of nuclear safety -Improvement of level 1 PSA computer code package-

    International Nuclear Information System (INIS)

    Park, Chang Kyoo; Kim, Tae Woon; Kim, Kil Yoo; Han, Sang Hoon; Jung, Won Dae; Jang, Seung Chul; Yang, Joon Un; Choi, Yung; Sung, Tae Yong; Son, Yung Suk; Park, Won Suk; Jung, Kwang Sub; Kang Dae Il; Park, Jin Heui; Hwang, Mi Jung; Hah, Jae Joo

    1995-07-01

    This year is the third year of the Government-sponsored mid- and long-term nuclear power technology development project. The scope of this subproject, titled 'The improvement of level-1 PSA computer codes', is divided into three main activities: (1) methodology development in underdeveloped fields such as risk assessment technology for plant shutdown and low power situations, (2) computer code package development for level-1 PSA, and (3) application of new technologies to reactor safety assessment. First, in the area of shutdown risk assessment technology development, plant outage experiences of domestic plants are reviewed and plant operating states (POS) are decided. A sample core damage frequency is estimated for an over-draining event during RCS low water inventory, i.e. mid-loop operation. Human reliability analysis and thermal hydraulic support analysis are identified as needed to reduce uncertainty. Two design improvement alternatives are evaluated using PSA techniques for the mid-loop operation situation: one is the use of the containment spray system as a backup of the shutdown cooling system, and the other is the installation of two independent level indication systems. Procedure change is identified as the more preferable option over hardware modification from the core damage frequency point of view. Next, the level-1 PSA code KIRAP is converted to the PC Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. 48 figs, 15 tabs, 59 refs. (Author)

  12. Research on the improvement of nuclear safety -Improvement of level 1 PSA computer code package-

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyoo; Kim, Tae Woon; Kim, Kil Yoo; Han, Sang Hoon; Jung, Won Dae; Jang, Seung Chul; Yang, Joon Un; Choi, Yung; Sung, Tae Yong; Son, Yung Suk; Park, Won Suk; Jung, Kwang Sub; Kang Dae Il; Park, Jin Heui; Hwang, Mi Jung; Hah, Jae Joo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    This year is the third year of the Government-sponsored mid- and long-term nuclear power technology development project. The scope of this subproject, titled 'The improvement of level-1 PSA computer codes', is divided into three main activities: (1) methodology development in underdeveloped fields such as risk assessment technology for plant shutdown and low power situations, (2) computer code package development for level-1 PSA, and (3) application of new technologies to reactor safety assessment. First, in the area of shutdown risk assessment technology development, plant outage experiences of domestic plants are reviewed and plant operating states (POS) are decided. A sample core damage frequency is estimated for an over-draining event during RCS low water inventory, i.e. mid-loop operation. Human reliability analysis and thermal hydraulic support analysis are identified as needed to reduce uncertainty. Two design improvement alternatives are evaluated using PSA techniques for the mid-loop operation situation: one is the use of the containment spray system as a backup of the shutdown cooling system, and the other is the installation of two independent level indication systems. Procedure change is identified as the more preferable option over hardware modification from the core damage frequency point of view. Next, the level-1 PSA code KIRAP is converted to the PC Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. 48 figs, 15 tabs, 59 refs. (Author).

  13. Expansion of the CHR bone code system

    International Nuclear Information System (INIS)

    Farnham, J.E.; Schlenker, R.A.

    1976-01-01

    This report describes the coding system used in the Center for Human Radiobiology (CHR) to identify individual bones and portions of bones of a complete skeletal system. It includes illustrations of various bones and bone segments with their respective code numbers. Codes are also presented for bone groups and for nonbone materials

  14. MARS code manual volume I: code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Yoon, Churl

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF codes. The method of integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This theory manual provides complete overall information on the code structure and major functions of MARS, including the code architecture, hydrodynamic model, heat structures, trip/control system and point reactor kinetics model. Therefore, this report should be very useful for code users. The overall structure of the manual is modeled on that of the RELAP5 manual, and as such the layout of the two manuals is very similar. This similarity to RELAP5 input is intentional, as this input scheme allows minimal modification between the inputs of RELAP5 and MARS 3.1. The MARS 3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  15. Unfolding neutron spectra obtained from BS–TLD system using genetic algorithm

    International Nuclear Information System (INIS)

    Santos, J.A.L.; Silva, E.R.; Ferreira, T.A.E; Vilela, E.C.

    2012-01-01

    Due to the variability of the neutron spectrum within the same environment, it is essential that the spectral distribution as a function of energy be characterized. Precise spectral information allows the radiological quantities related to that spectrum to be established, but this requires a spectrometric system covering a wide energy interval and an appropriate unfolding process. This paper proposes the use of an Artificial Intelligence (AI) technique called the genetic algorithm (GA), which uses bio-inspired mathematical models together with a specific response matrix to unfold data obtained from a combination of TLDs embedded in a BS system, characterizing the neutron spectrum as a function of energy. The results obtained with this method were in accordance with reference spectra, demonstrating that the technique can unfold neutron spectra from the BS–TLD system. - Highlights: ► The unfolding code uses the artificial intelligence technique called genetic algorithms. ► A response matrix specific to the data obtained with the BS–TLD system is used by the AGLN. ► The observed results demonstrate the potential of genetic algorithms for solving complex nuclear problems.
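
    As a rough illustration of what such an unfolding code does, the toy sketch below uses a genetic algorithm to search for a non-negative group-flux vector whose folded response matches a set of detector readings. The response matrix and readings are synthetic; this is not the AGLN code or the BS–TLD response matrix of the record.

```python
# Toy sketch of genetic-algorithm spectrum unfolding: find a non-negative group
# flux vector phi such that R @ phi reproduces the measured sphere/TLD readings.
# Synthetic response matrix and readings only.
import numpy as np

rng = np.random.default_rng(0)
n_groups, n_spheres = 8, 6
R = rng.uniform(0.1, 1.0, size=(n_spheres, n_groups))   # assumed response matrix
true_phi = rng.uniform(0.0, 1.0, size=n_groups)
readings = R @ true_phi                                  # "measured" responses

def fitness(phi):
    return -np.sum((R @ phi - readings) ** 2)            # higher is better

def ga_unfold(pop_size=200, generations=300, sigma=0.05):
    pop = rng.uniform(0.0, 1.0, size=(pop_size, n_groups))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]             # truncation selection
        # uniform crossover between random parent pairs
        idx = rng.integers(0, len(parents), size=(pop_size, 2))
        mask = rng.random((pop_size, n_groups)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
        children += rng.normal(0.0, sigma, children.shape)    # Gaussian mutation
        pop = np.clip(children, 0.0, None)                    # keep fluxes non-negative
        pop[0] = parents[0]                                    # elitism
    return pop[0]

phi_est = ga_unfold()
print("residual:", np.linalg.norm(R @ phi_est - readings))
```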

  16. Interrelations of codes in human semiotic systems.

    OpenAIRE

    Somov, Georgij

    2016-01-01

    Codes can be viewed as mechanisms that enable relations of signs and their components, i.e., semiosis is actualized. The combinations of these relations produce new relations as new codes are building over other codes. Structures appear in the mechanisms of codes. Hence, codes can be described as transformations of structures from some material systems into others. Structures belong to different carriers, but exist in codes in their "pure" form. Building of codes over other codes fosters t...

  17. Transmission over UWB channels with OFDM system using LDPC coding

    Science.gov (United States)

    Dziwoki, Grzegorz; Kucharczyk, Marcin; Sulek, Wojciech

    2009-06-01

    A hostile wireless environment requires the use of sophisticated signal processing methods. The paper concerns Ultra Wideband (UWB) transmission over Personal Area Networks (PAN), including the MB-OFDM specification of the physical layer. In the presented work, the OFDM transmission system was combined with an LDPC encoder/decoder. Additionally, the frame and bit error rates (FER and BER) of the system were decreased by using results from the LDPC decoder in a kind of turbo equalization algorithm for better channel estimation. A computational block using an evolutionary strategy, from the genetic algorithms family, was also used in the presented system. It is placed after the SPA (Sum-Product Algorithm) decoder and is conditionally turned on during decoding. The result is increased effectiveness of the whole system, especially a lower FER. The system was tested with two types of LDPC codes, distinguished by their parity-check matrices: randomly generated matrices, and deterministically constructed matrices optimized for a practical decoder architecture implemented in an FPGA device.

  18. Human growth hormone-related iatrogenic Creutzfeldt-Jakob disease: Search for a genetic susceptibility by analysis of the PRNP coding region

    Energy Technology Data Exchange (ETDEWEB)

    Jaegly, A.; Boussin, F.; Deslys, J.P. [CEA/CRSSA/DSV/DPTE, Fontenay-aux-Roses (France)] [and others

    1995-05-20

    The human PRNP gene encoding PrP is located on chromosome 20 and consists of two exons and a single intron. The open reading frame is entirely fitted into the second exon. Genetic studies indicate that all of the familial and several sporadic forms of TSSEs are associated with mutations in the PRNP 759-bp coding region. Moreover, homozygosity at codon 129, a locus harboring a polymorphism among the general population, was proposed as a genetic susceptibility marker for both sporadic and iatrogenic CJD. To assess whether additional genetic predisposition markers exist in the PRNP gene, the authors sequenced the PRNP coding region of 17 of the 32 French patients who developed a hGH-related CJD.

  19. Indoor high precision three-dimensional positioning system based on visible light communication using modified genetic algorithm

    Science.gov (United States)

    Chen, Hao; Guan, Weipeng; Li, Simin; Wu, Yuxiang

    2018-04-01

    To improve the precision of indoor positioning and realize three-dimensional positioning, a reversed indoor positioning system based on visible light communication (VLC) using a genetic algorithm (GA) is proposed. In order to solve the problem of interference between signal sources, CDMA modulation is used. Each light-emitting diode (LED) in the system broadcasts a unique identity (ID) code using CDMA modulation. The receiver receives a mixed signal from all the LED reference points; thanks to the orthogonality of the spreading codes in CDMA modulation, the ID and intensity attenuation information of every LED can be obtained. According to the received signal strength (RSS) positioning principle, the coordinates of the receiver can be determined. Due to system noise and imperfections of the devices used in the system, the distances between receiver and transmitters deviate from their real values, resulting in positioning error. By introducing error correction factors into the global parallel search of the genetic algorithm, the coordinates of the receiver in three-dimensional space can be determined precisely. Both simulation results and experimental results show that in practical application scenarios, the proposed positioning system can realize high-precision positioning service.
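
    The sketch below illustrates the positioning step in a highly simplified form: given per-LED received signal strengths, a genetic search estimates the 3-D receiver coordinates. An inverse-square attenuation model and synthetic LED positions are assumed; the CDMA demultiplexing, channel model, and error-correction factors of the cited system are not reproduced.

```python
# Minimal sketch of RSS-based 3-D positioning with a genetic search.
# Assumed 1/d^2 attenuation and synthetic LED positions; illustration only.
import numpy as np

rng = np.random.default_rng(1)
leds = np.array([[0.0, 0.0, 3.0], [4.0, 0.0, 3.0], [0.0, 4.0, 3.0], [4.0, 4.0, 3.0]])
receiver = np.array([1.5, 2.0, 0.8])                      # ground truth (unknown)

def rss(pos):                                             # assumed inverse-square law
    d = np.linalg.norm(leds - pos, axis=1)
    return 1.0 / d**2

measured = rss(receiver) * (1 + rng.normal(0, 0.01, len(leds)))   # noisy readings

def cost(pos):
    return np.sum((rss(pos) - measured) ** 2)

def ga_position(pop_size=100, generations=200):
    low, high = np.array([0.0, 0.0, 0.0]), np.array([4.0, 4.0, 3.0])
    pop = rng.uniform(low, high, size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([cost(p) for p in pop])
        order = np.argsort(scores)                         # minimise cost
        parents = pop[order[: pop_size // 2]]
        idx = rng.integers(0, len(parents), size=(pop_size, 2))
        alpha = rng.random((pop_size, 1))
        children = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
        children += rng.normal(0, 0.05, children.shape)    # mutation
        pop = np.clip(children, low, high)
        pop[0] = parents[0]                                # keep the best individual
    return pop[0]

print("estimated position:", ga_position(), "true:", receiver)
```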

  20. Plotting system for the MINCS code

    International Nuclear Information System (INIS)

    Watanabe, Tadashi

    1990-08-01

    The plotting system for the MINCS code is described. The transient two-phase flow analysis code MINCS has been developed to provide a computational tool for analysing various two-phase flow phenomena in one-dimensional ducts. Two plotting systems, namely the SPLPLOT system and the SDPLOT system, can be used as the plotting functions. The SPLPLOT system is used for plotting time transients of variables, while the SDPLOT system is for spatial distributions. The SPLPLOT system is based on the SPLPACK system, which is used as a general tool for plotting results of transient analysis codes or experiments. The SDPLOT is based on the GPLP program, which is also regarded as one of the general plotting programs. In the SPLPLOT and the SDPLOT systems, the standardized data format called the SPL format is used in reading data to be plotted. The output data format of MINCS is translated into the SPL format by using the conversion system called the MINTOSPL system. In this report, how to use the plotting functions is described. (author)

  1. SRAC2006: A comprehensive neutronics calculation code system

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Kugo, Teruhiko; Kaneko, Kunio; Tsuchihashi, Keichiro

    2007-02-01

    The SRAC is a code system applicable to neutronics analysis of a variety of reactor types. Since the publication of the second version of the users manual (JAERI-1302) in 1986 for the SRAC system, a number of additions and modifications to the functions and the library data have been made to establish a comprehensive neutronics code system. The current system includes major neutron data libraries (JENDL-3.3, JENDL-3.2, ENDF/B-VII, ENDF/B-VI.8, JEFF-3.1, JEF-2.2, etc.), and integrates five elementary codes for neutron transport and diffusion calculations: PIJ, based on the collision probability method and applicable to 16 kinds of lattice models; the SN transport codes ANISN (1D) and TWOTRN (2D); and the diffusion codes TUD (1D) and CITATION (multi-D). The system also includes an auxiliary code, COREBN, for multi-dimensional core burn-up calculation. (author)

  2. Noncoherent Spectral Optical CDMA System Using 1D Active Weight Two-Code Keying Codes

    Directory of Open Access Journals (Sweden)

    Bih-Chyun Yeh

    2016-01-01

    Full Text Available We propose a new family of one-dimensional (1D active weight two-code keying (TCK in spectral amplitude coding (SAC optical code division multiple access (OCDMA networks. We use encoding and decoding transfer functions to operate the 1D active weight TCK. The proposed structure includes an optical line terminal (OLT and optical network units (ONUs to produce the encoding and decoding codes of the proposed OLT and ONUs, respectively. The proposed ONU uses the modified cross-correlation to remove interferences from other simultaneous users, that is, the multiuser interference (MUI. When the phase-induced intensity noise (PIIN is the most important noise, the modified cross-correlation suppresses the PIIN. In the numerical results, we find that the bit error rate (BER for the proposed system using the 1D active weight TCK codes outperforms that for two other systems using the 1D M-Seq codes and 1D balanced incomplete block design (BIBD codes. The effective source power for the proposed system can achieve −10 dBm, which has less power than that for the other systems.

  3. Channel coding in the space station data system network

    Science.gov (United States)

    Healy, T.

    1982-01-01

    A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly support the requirement for the utilization of coding for the communications channel. The designers of the space station data system have to consider the use of channel coding.

  4. Code system BCG for gamma-ray skyshine calculation

    International Nuclear Information System (INIS)

    Ryufuku, Hiroshi; Numakunai, Takao; Miyasaka, Shun-ichi; Minami, Kazuyoshi.

    1979-03-01

    A code system, BCG, has been developed for convenient and efficient calculation of gamma-ray skyshine doses using the transport calculation codes ANISN and DOT and the point-kernel calculation codes G-33 and SPAN. To simplify the input to the system, the input forms for these codes are unified, twelve geometric patterns are introduced to define material regions, and standard data are available as a library. To treat complex arrangements of source and shield, the codes can further be used in succession, so that the results from one code may be used as input data to the same or another code. (author)

  5. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, with reference to the bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  6. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC Windows environment automatically. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis, and data interpretation; reporting aspects including tabular and graphical output; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA of a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  7. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
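
    The core computation of such a tool can be sketched as follows: expand an IUPAC degenerate codon into its constituent codons and report the amino acids covered under a chosen translation table. The IUPAC ambiguity map and the standard genetic code used below are public knowledge; ANT's own evaluation and scoring logic is not reproduced here.

```python
# Sketch of degenerate-codon expansion: list the plain codons represented by an
# IUPAC degenerate codon and the amino acids they cover under the standard code.
from itertools import product

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

# Standard genetic code (translation table 1); '*' marks stop codons.
BASES = "TCAG"
AA = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
      "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON_TABLE = {a + b + c: AA[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

def expand(degenerate_codon):
    """All plain codons represented by a degenerate codon such as 'NNK'."""
    return ["".join(p) for p in product(*(IUPAC[ch] for ch in degenerate_codon))]

if __name__ == "__main__":
    codons = expand("NNK")
    aas = sorted({CODON_TABLE[c] for c in codons})
    print(len(codons), "codons covering", aas)   # 32 codons, all 20 aa plus '*'
```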

  8. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    Science.gov (United States)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  9. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) Glass Performance Assessment and many other applications including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway

  10. On Analyzing LDPC Codes over Multiantenna MC-CDMA System

    Directory of Open Access Journals (Sweden)

    S. Suresh Kumar

    2014-01-01

    Full Text Available Multiantenna multicarrier code-division multiple access (MC-CDMA technique has been attracting much attention for designing future broadband wireless systems. In addition, low-density parity-check (LDPC code, a promising near-optimal error correction code, is also being widely considered in next generation communication systems. In this paper, we propose a simple method to construct a regular quasicyclic low-density parity-check (QC-LDPC code to improve the transmission performance over the precoded MC-CDMA system with limited feedback. Simulation results show that the coding gain of the proposed QC-LDPC codes is larger than that of the Reed-Solomon codes, and the performance of the multiantenna MC-CDMA system can be greatly improved by these QC-LDPC codes when the data rate is high.
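
    The quasi-cyclic structure the record refers to can be sketched by assembling a parity-check matrix from circulant permutation submatrices (cyclically shifted identity blocks). The exponent matrix below is an arbitrary illustration, not the optimized construction of the cited paper.

```python
# Minimal sketch of a regular quasi-cyclic LDPC parity-check matrix built from
# circulant permutation submatrices. Arbitrary exponent matrix for illustration.
import numpy as np

def circulant(shift, p):
    """p x p identity matrix cyclically shifted right by `shift` columns."""
    return np.roll(np.eye(p, dtype=int), shift, axis=1)

def qc_ldpc_H(exponents, p):
    """Assemble H from a J x L matrix of shift exponents and circulant size p."""
    rows = [np.hstack([circulant(s % p, p) for s in row]) for row in exponents]
    return np.vstack(rows)

if __name__ == "__main__":
    exponents = [[0, 1, 2, 3],      # arbitrary J=2, L=4 exponent matrix
                 [0, 2, 4, 6]]
    p = 7
    H = qc_ldpc_H(exponents, p)
    print("H shape:", H.shape)                       # (14, 28)
    print("column weight:", H.sum(axis=0)[:5])       # regular: 2 ones per column
    print("row weight:", H.sum(axis=1)[:5])          # 4 ones per row
```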

  11. Advanced Design of Dumbbell-shaped Genetic Minimal Vectors Improves Non-coding and Coding RNA Expression.

    Science.gov (United States)

    Jiang, Xiaoou; Yu, Han; Teo, Cui Rong; Tan, Genim Siu Xian; Goh, Sok Chin; Patel, Parasvi; Chua, Yiqiang Kevin; Hameed, Nasirah Banu Sahul; Bertoletti, Antonio; Patzel, Volker

    2016-09-01

    Dumbbell-shaped DNA minimal vectors lacking nontherapeutic genes and bacterial sequences are considered a stable, safe alternative to viral, nonviral, and naked plasmid-based gene-transfer systems. We investigated novel molecular features of dumbbell vectors aiming to reduce vector size and to improve the expression of noncoding or coding RNA. We minimized small hairpin RNA (shRNA) or microRNA (miRNA) expressing dumbbell vectors in size down to 130 bp generating the smallest genetic expression vectors reported. This was achieved by using a minimal H1 promoter with integrated transcriptional terminator transcribing the RNA hairpin structure around the dumbbell loop. Such vectors were generated with high conversion yields using a novel protocol. Minimized shRNA-expressing dumbbells showed accelerated kinetics of delivery and transcription leading to enhanced gene silencing in human tissue culture cells. In primary human T cells, minimized miRNA-expressing dumbbells revealed higher stability and triggered stronger target gene suppression as compared with plasmids and miRNA mimics. Dumbbell-driven gene expression was enhanced up to 56- or 160-fold by implementation of an intron and the SV40 enhancer compared with control dumbbells or plasmids. Advanced dumbbell vectors may represent one option to close the gap between durable expression that is achievable with integrating viral vectors and short-term effects triggered by naked RNA.

  12. Coding-Spreading Tradeoff in CDMA Systems

    National Research Council Canada - National Science Library

    Bolas, Eduardo

    2002-01-01

    .... Comparing different combinations of coding and spreading with a traditional DS-CDMA, as defined in the IS-95 standard, allows the criteria to be defined for the best coding-spreading tradeoff in CDMA systems...

  13. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module type plant system dynamics analysis code which involves a multi-dimensional thermal-hydraulics calculation module to analyze pool type of fast breeder reactors. Explanations of each module and the methods for the input data are described in this code manual. (author)

  14. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  15. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  16. Accelerator-driven transmutation reactor analysis code system (ATRAS)

    Energy Technology Data Exchange (ETDEWEB)

    Sasa, Toshinobu; Tsujimoto, Kazufumi; Takizuka, Takakazu; Takano, Hideki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-03-01

    JAERI is proceeding with a design study of a hybrid-type minor actinide transmutation system which mainly consists of an intense proton accelerator and a fast subcritical core. The neutronics and burnup characteristics of the accelerator-driven system are important from the viewpoint of maintaining subcriticality and energy balance during system operation. To determine these characteristics accurately, it is necessary to include reactions in the high-energy region, which are not treated by ordinary reactor analysis codes. The authors developed a code system named ATRAS to analyze the neutronics and burnup characteristics of accelerator-driven subcritical reactor systems. ATRAS has a burnup analysis capability that takes account of the effect of the spallation neutron source. ATRAS consists of a spallation analysis code, neutron transport codes, and a burnup analysis code. Utility programs for fuel exchange, pre-processing and post-processing are also incorporated. (author)

  17. Interface requirements for coupling a containment code to reactor system thermal hydraulic codes

    International Nuclear Information System (INIS)

    Baratta, A.J.

    1997-01-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing are determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step allowing discrete codes to be coupled together

  18. Interface requirements for coupling a containment code to reactor system thermal hydraulic codes

    Energy Technology Data Exchange (ETDEWEB)

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing are determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step allowing discrete codes to be coupled together.

  19. System code improvements for modelling passive safety systems and their validation

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, Sebastian; Cron, Daniel von der; Schaffrath, Andreas [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    GRS has been developing the system code ATHLET over many years. Because ATHLET, among other codes, is widely used in nuclear licensing and supervisory procedures, it has to represent the current state of science and technology. New reactor concepts such as Generation III+ and IV reactors and SMRs make intensive use of passive safety systems. The simulation of passive safety systems with the GRS system code ATHLET is still a big challenge because of non-defined operating points and self-setting operating conditions. Additionally, the driving forces of passive safety systems are smaller, and parameter uncertainties have a larger impact than for active systems. This paper addresses the code validation and qualification work on ATHLET using the example of slightly inclined horizontal heat exchangers, which are used, e.g., as emergency condensers (in the KERENA and the CAREM) or as the heat exchanger in the passive auxiliary feedwater system (PAFS) of the APR+.

  20. Assessment of systems codes and their coupling with CFD codes in thermal–hydraulic applications to innovative reactors

    Energy Technology Data Exchange (ETDEWEB)

    Bandini, G., E-mail: giacomino.bandini@enea.it [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Polidori, M. [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Gerschenfeld, A.; Pialla, D.; Li, S. [Commissariat à l’Energie Atomique (CEA) (France); Ma, W.M.; Kudinov, P.; Jeltsov, M.; Kööp, K. [Royal Institute of Technology (KTH) (Sweden); Huber, K.; Cheng, X.; Bruzzese, C.; Class, A.G.; Prill, D.P. [Karlsruhe Institute of Technology (KIT) (Germany); Papukchiev, A. [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) (Germany); Geffray, C.; Macian-Juan, R. [Technische Universität München (TUM) (Germany); Maas, L. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN) (France)

    2015-01-15

    Highlights: • The assessment of RELAP5, TRACE and CATHARE system codes on integral experiments is presented. • Code benchmark of CATHARE, DYN2B, and ATHLET on PHENIX natural circulation experiment. • Grid-free pool modelling based on proper orthogonal decomposition for system codes is explained. • The code coupling methodologies are explained. • The coupling of several CFD/system codes is tested against integral experiments. - Abstract: The THINS project of the 7th Framework EU Program on nuclear fission safety is devoted to the investigation of crosscutting thermal–hydraulic issues for innovative nuclear systems. A significant effort in the project has been dedicated to the qualification and validation of system codes currently employed in thermal–hydraulic transient analysis for nuclear reactors. This assessment is based either on already available experimental data, or on the data provided by test campaigns carried out in the frame of THINS project activities. Data provided by TALL and CIRCE facilities were used in the assessment of system codes for HLM reactors, while the PHENIX ultimate natural circulation test was used as reference for a benchmark exercise among system codes for sodium-cooled reactor applications. In addition, a promising grid-free pool model based on proper orthogonal decomposition is proposed to overcome the limits shown by the thermal–hydraulic system codes in the simulation of pool-type systems. Furthermore, multi-scale system-CFD solutions have been developed and validated for innovative nuclear system applications. For this purpose, data from the PHENIX experiments have been used, and data are provided by the tests conducted with new configuration of the TALL-3D facility, which accommodates a 3D test section within the primary circuit. The TALL-3D measurements are currently used for the validation of the coupling between system and CFD codes.

  1. Enhanced fault-tolerant quantum computing in d-level systems.

    Science.gov (United States)

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  2. The management-retrieval code of nuclear level density sub-library (CENPL-NLD)

    International Nuclear Information System (INIS)

    Ge Zhigang; Su Zongdi; Huang Zhongfu; Dong Liaoyuan

    1995-01-01

    The management-retrieval code of the Nuclear Level Density (NLD) sub-library is presented. It offers two retrieval modes: single nucleus (SN) and neutron reaction (NR); the latter supports four kinds of retrieval. The code can not only retrieve level density parameters and the data related to the level density, but can also calculate the relevant data using different level density parameters and compare the calculated results with related data, in order to help users select level density parameters

  3. Manchester Coding Option for SpaceWire: Providing Choices for System Level Design

    Science.gov (United States)

    Rakow, Glenn; Kisin, Alex

    2014-01-01

    This paper proposes an optional coding scheme for SpaceWire in lieu of the current Data Strobe scheme, for three reasons. The first is to provide a straightforward method for electrical isolation of the interface; the second, to provide the ability to reduce the mass and bend radius of the SpaceWire cable; and the third, to provide a means for a common physical layer over which multiple spacecraft onboard data link protocols could operate for a wide range of data rates. The intent is to accomplish these goals without significant change to existing SpaceWire design investments. The ability to optionally use Manchester coding in place of the current Data Strobe coding makes it possible to DC-balance the signal transitions, unlike the SpaceWire Data Strobe coding, and therefore to isolate the electrical interface without concern. Additionally, because the Manchester code carries the clock and data on the same signal, the number of wires in the existing SpaceWire cable could optionally be reduced by 50%. This reduction could be an important consideration for many users of SpaceWire, as indicated by the effort already underway in the SpaceWire working group to reduce the cable mass and bend radius by eliminating shields. However, reducing the signal count by half would provide even greater gains. It is proposed to restrict the optional Manchester coding to a fixed data rate of 10 Megabits per second (Mbps) in order to keep the necessary changes simple and still able to run in current radiation-tolerant Field Programmable Gate Arrays (FPGAs). Even with this constraint, 10 Mbps will meet many applications where SpaceWire is used. These include command and control applications and many instrument applications which have moderate data rates. For most NASA flight implementations, SpaceWire designs are in rad-tolerant FPGAs, and the desire to preserve the heritage design investment is important for cost and risk considerations. The
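
    For illustration, the sketch below encodes and decodes a bit stream with a Manchester line code, showing why the waveform is DC balanced and carries its clock in the mid-bit transition. The bit-to-transition convention is assumed (IEEE 802.3 style); the convention actually adopted for SpaceWire may be the opposite (G.E. Thomas) one.

```python
# Toy Manchester encoder/decoder. Assumed IEEE 802.3 convention:
# a 0 bit is sent as high->low, a 1 bit as low->high.
def manchester_encode(bits):
    """Each bit becomes two half-bit symbols with a guaranteed mid-bit transition."""
    table = {0: (1, 0), 1: (0, 1)}          # assumed convention
    return [half for b in bits for half in table[b]]

def manchester_decode(symbols):
    """Invert the encoder; raise if a cell has no mid-bit transition."""
    bits = []
    for i in range(0, len(symbols), 2):
        pair = (symbols[i], symbols[i + 1])
        if pair == (1, 0):
            bits.append(0)
        elif pair == (0, 1):
            bits.append(1)
        else:
            raise ValueError(f"invalid Manchester cell {pair} at symbol {i}")
    return bits

if __name__ == "__main__":
    data = [1, 0, 1, 1, 0, 0, 1, 0]
    line = manchester_encode(data)
    assert manchester_decode(line) == data
    # DC balance: equal numbers of high and low half-bit symbols
    print(sum(line), "highs out of", len(line), "half-bit symbols")
```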

  4. Emergence of a code in the polymerization of amino acids along RNA templates.

    Directory of Open Access Journals (Sweden)

    Jean Lehmann

    2009-06-01

    Full Text Available The origin of the genetic code in the context of an RNA world is a major problem in the field of biophysical chemistry. In this paper, we describe how the polymerization of amino acids along RNA templates can be affected by the properties of both molecules. Considering a system without enzymes, in which the tRNAs (the translation adaptors) are not loaded selectively with amino acids, we show that an elementary translation governed by Michaelis-Menten type kinetics can follow different polymerization regimes: random polymerization, homopolymerization and coded polymerization. The regime under which the system is running is set by the relative concentrations of the amino acids and the kinetic constants involved. We point out that the coding regime can naturally occur under prebiotic conditions. It generates partially coded proteins through a mechanism which is remarkably robust against non-specific interactions (mismatches) between the adaptors and the RNA template. Features of the genetic code support the existence of this early translation system.
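
    A toy competition model can illustrate the regime classification described here: the share of each amino acid incorporated at a template position is taken proportional to its concentration weighted by a specificity factor, in the spirit of competitive Michaelis-Menten kinetics. The numbers and the two-letter alphabet are invented for illustration and do not reproduce the kinetic scheme of the paper.

```python
# Toy competition model: whether polymerization along a template comes out
# coded, random, or homopolymeric depends on amino-acid concentrations and a
# specificity factor. Illustration only.
import random

random.seed(0)

def incorporation_probs(concentrations, specificity, favoured):
    """Share of each amino acid at one template position (competitive weights)."""
    weights = {aa: c * (specificity if aa == favoured else 1.0)
               for aa, c in concentrations.items()}
    total = sum(weights.values())
    return {aa: w / total for aa, w in weights.items()}

def polymerize(template, concentrations, specificity):
    chain = []
    for favoured in template:
        probs = incorporation_probs(concentrations, specificity, favoured)
        aas, ps = zip(*probs.items())
        chain.append(random.choices(aas, weights=ps)[0])
    return "".join(chain)

if __name__ == "__main__":
    template = "ABABAB"                        # hypothetical two-letter "code"
    equal = {"A": 1.0, "B": 1.0}
    skewed = {"A": 10.0, "B": 0.1}
    print("coded  :", polymerize(template, equal, specificity=50))   # follows template
    print("random :", polymerize(template, equal, specificity=1))    # ignores template
    print("homo   :", polymerize(template, skewed, specificity=1))   # mostly A
```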

  5. Study on the code system for the off-site consequences assessment of severe nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sora; Mn, Byung Il; Park, Ki Hyun; Yang, Byung Mo; Suh, Kyung Suk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

    The importance of severe nuclear accidents and probabilistic safety assessment (PSA) was brought to international attention with the occurrence of severe nuclear accidents caused by the extreme natural disaster at Fukushima Daiichi nuclear power plant in Japan. In Korea, studies on level 3 PSA had made little progress until recently. The code systems of level 3 PSA, MACCS2 (MELCOR Accident Consequence Code System 2, US), COSYMA (COde SYstem from MAria, EU) and OSCAAR (Off-Site Consequence Analysis code for Atmospheric Releases in reactor accidents, JAPAN), were reviewed in this study, and the disadvantages and limitations of MACCS2 were also analyzed. Experts from Korea and abroad pointed out that the limitations of MACCS2 include the following: MACCS2 cannot simulate multi-unit accidents/release from spent fuel pools, and its atmospheric dispersion is based on a simple Gaussian plume model. Some of these limitations have been improved in the updated versions of MACCS2. The absence of a marine and aquatic dispersion model and the limited simulating range of food-chain and economic models are also important aspects that need to be improved. This paper is expected to be utilized as basic research material for developing a Korean code system for assessing off-site consequences of severe nuclear accidents.

  6. Study on the code system for the off-site consequences assessment of severe nuclear accident

    International Nuclear Information System (INIS)

    Kim, Sora; Mn, Byung Il; Park, Ki Hyun; Yang, Byung Mo; Suh, Kyung Suk

    2016-01-01

    The importance of severe nuclear accidents and probabilistic safety assessment (PSA) was brought to international attention with the occurrence of severe nuclear accidents caused by the extreme natural disaster at Fukushima Daiichi nuclear power plant in Japan. In Korea, studies on level 3 PSA had made little progress until recently. The code systems of level 3 PSA, MACCS2 (MELCOR Accident Consequence Code System 2, US), COSYMA (COde SYstem from MAria, EU) and OSCAAR (Off-Site Consequence Analysis code for Atmospheric Releases in reactor accidents, JAPAN), were reviewed in this study, and the disadvantages and limitations of MACCS2 were also analyzed. Experts from Korea and abroad pointed out that the limitations of MACCS2 include the following: MACCS2 cannot simulate multi-unit accidents/release from spent fuel pools, and its atmospheric dispersion is based on a simple Gaussian plume model. Some of these limitations have been improved in the updated versions of MACCS2. The absence of a marine and aquatic dispersion model and the limited simulating range of food-chain and economic models are also important aspects that need to be improved. This paper is expected to be utilized as basic research material for developing a Korean code system for assessing off-site consequences of severe nuclear accidents

  7. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation) to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems including drinking contaminated groundwater, and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities, and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to easily interface with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.

  8. BER performance comparison of optical CDMA systems with/without turbo codes

    Science.gov (United States)

    Kulkarni, Muralidhar; Chauhan, Vijender S.; Dutta, Yashpal; Sinha, Ravindra K.

    2002-08-01

    In this paper, we have analyzed and simulated the BER performance of a turbo coded optical code-division multiple-access (TC-OCDMA) system. A performance comparison has been made between uncoded OCDMA and TC-OCDMA systems employing various OCDMA address codes (optical orthogonal codes (OOCs), generalized multiwavelength prime codes (GMWPCs), and generalized multiwavelength Reed-Solomon codes (GMWRSCs)). The BER performance of TC-OCDMA systems has been analyzed and simulated by varying the code weight of the address code employed by the system. From the simulation results, it is observed that TC-OCDMA systems employing lower-weight address codes can achieve BER performance equivalent to that of uncoded systems employing higher-weight address codes, for a fixed number of active users.

  9. Partitioning of genetic variation between regulatory and coding gene segments: the predominance of software variation in genes encoding introvert proteins.

    Science.gov (United States)

    Mitchison, A

    1997-01-01

    In considering genetic variation in eukaryotes, a fundamental distinction can be made between variation in regulatory (software) and coding (hardware) gene segments. For quantitative traits the bulk of variation, particularly that near the population mean, appears to reside in regulatory segments. The main exceptions to this rule concern proteins which handle extrinsic substances, here termed extrovert proteins. The immune system includes an unusually large proportion of this exceptional category, but even so its chief source of variation may well be polymorphism in regulatory gene segments. The main evidence for this view emerges from genome scanning for quantitative trait loci (QTL), which in the case of the immune system points to a major contribution of pro-inflammatory cytokine genes. Further support comes from sequencing of major histocompatibility complex (Mhc) class II promoters, where a high level of polymorphism has been detected. These Mhc promoters appear to act, in part at least, by gating the back-signal from T cells into antigen-presenting cells. Both these forms of polymorphism are likely to be sustained by the need for flexibility in the immune response. Future work on promoter polymorphism is likely to benefit from the input from genome informatics.

  10. Study of nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Ryu, Chang Mo; Kim, Yeon Seung; Eom, Heung Seop; Lee, Jong Bok; Kim, Ho Joon; Choi, Young Gil; Kim, Ko Ryeo

    1989-01-01

    Software maintenance has been one of the most important problems since the late 1970s. We wish to develop a nuclear computer code system to maintain and manage KAERI's nuclear software. As a part of this system, we have developed three code management programs for use on CYBER and PC systems. They are used in the systematic management of computer codes in KAERI. The first program is implemented on the CYBER system to rapidly provide information on nuclear codes to users. The second and third programs were implemented on the PC system for the code manager and for the management of data in the Korean language, respectively. In the requirement analysis, we defined each code, magnetic tape, manual and abstract information data. In the conceptual design, we designed retrieval, update, and output functions. In the implementation design, we described the technical considerations of database programs, utilities, and directions for the use of databases. As a result of this research, we compiled the status of the nuclear computer codes which belonged to KAERI as of September 1988. Thus, by using these three database programs, we could provide nuclear computer code information to users more rapidly. (Author)

  11. A Stress-Induced Bias in the Reading of the Genetic Code in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Adi Oron-Gottesman

    2016-11-01

    Full Text Available Escherichia coli mazEF is an extensively studied stress-induced toxin-antitoxin (TA) system. The toxin MazF is an endoribonuclease that cleaves RNAs at ACA sites. Thereby, under stress, the induced MazF generates a stress-induced translation machinery (STM), composed of MazF-processed mRNAs and selective ribosomes that specifically translate the processed mRNAs. Here, we further characterized the STM system, finding that MazF cleaves only ACA sites located in the open reading frames of processed mRNAs, while out-of-frame ACAs are resistant. This in-frame ACA cleavage by MazF seems to depend on MazF binding to an extracellular-death-factor (EDF)-like element in ribosomal protein bS1 (bacterial S1), apparently causing MazF to be part of STM ribosomes. Furthermore, due to the in-frame MazF cleavage of ACAs under stress, a bias occurs in the reading of the genetic code, causing the amino acid threonine to be encoded only by its synonymous codons ACC, ACU, or ACG, instead of by ACA.

  12. Improved Genetic Algorithm with Two-Level Approximation for Truss Optimization by Using Discrete Shape Variables

    Directory of Open Access Journals (Sweden)

    Shen-yan Chen

    2015-01-01

    Full Text Available This paper presents an Improved Genetic Algorithm with Two-Level Approximation (IGATA) to minimize truss weight by simultaneously optimizing size, shape, and topology variables. On the basis of a previously presented truss sizing/topology optimization method based on two-level approximation and genetic algorithm (GA), a new method for adding shape variables is presented, in which the nodal positions correspond to a set of coordinate lists. A uniform optimization model including size/shape/topology variables is established. First, a first-level approximate problem is constructed to transform the original implicit problem to an explicit problem. To solve this explicit problem which involves size/shape/topology variables, GA is used to optimize individuals which include discrete topology variables and shape variables. When calculating the fitness value of each member in the current generation, a second-level approximation method is used to optimize the continuous size variables. With the introduction of shape variables, the original optimization algorithm was improved in individual coding strategy as well as GA execution techniques. Meanwhile, the update strategy of the first-level approximation problem was also improved. The results of numerical examples show that the proposed method is effective in dealing with the three kinds of design variables simultaneously, and the required computational cost for structural analysis is quite small.
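
    The individual coding strategy described above can be sketched as a chromosome that concatenates discrete topology bits with an index into a list of candidate nodal coordinates, while each fitness evaluation runs an inner sizing of the continuous area variables. The structural model below is a crude stand-in (equal load sharing with fully stressed sizing), not the two-level approximation of IGATA.

```python
# Schematic GA individual mixing topology bits and a shape index, with an inner
# continuous sizing step inside the fitness call. Toy structural model only.
import random
import math

random.seed(0)

COORD_LIST = [0.5, 1.0, 1.5, 2.0]           # candidate y-coordinates for the free node
BASE_NODES = [(0.0, 0.0), (4.0, 0.0)]       # supports (fixed)
LOAD, SIGMA_ALLOW, RHO = 10.0, 100.0, 1.0   # assumed load, allowable stress, density

def decode(ind):
    """Individual = [topology bit per member] + [index into COORD_LIST]."""
    topo, shape_idx = ind[:2], ind[2]
    free_node = (2.0, COORD_LIST[shape_idx])
    return [(BASE_NODES[i], free_node) for i, on in enumerate(topo) if on]

def inner_sizing(members):
    """Stand-in continuous sizing: each member carries an equal share of LOAD."""
    if not members:
        return math.inf
    force = LOAD / len(members)
    weight = 0.0
    for (x1, y1), (x2, y2) in members:
        length = math.hypot(x2 - x1, y2 - y1)
        area = force / SIGMA_ALLOW            # fully stressed design
        weight += RHO * area * length
    return weight

def fitness(ind):
    return -inner_sizing(decode(ind))          # GA maximizes fitness

def random_individual():
    return [random.randint(0, 1), random.randint(0, 1),
            random.randrange(len(COORD_LIST))]

def ga(pop_size=30, generations=40):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 3)
            child = a[:cut] + b[cut:]                       # one-point crossover
            if random.random() < 0.2:                       # mutate a topology bit
                child[random.randrange(2)] ^= 1
            if random.random() < 0.2:                       # mutate the shape index
                child[2] = random.randrange(len(COORD_LIST))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print("best individual:", ga())
```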

  13. Modular ORIGEN-S for multi-physics code systems

    Energy Technology Data Exchange (ETDEWEB)

    Yesilyurt, Gokhan; Clarno, Kevin T.; Gauld, Ian C., E-mail: yesilyurtg@ornl.gov, E-mail: clarnokt@ornl.gov, E-mail: gauldi@ornl.gov [Oak Ridge National Laboratory, TN (United States); Galloway, Jack, E-mail: jack@galloways.net [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2011-07-01

    The ORIGEN-S code in the SCALE 6.0 nuclear analysis code suite is a well-validated tool to calculate the time-dependent concentrations of nuclides due to isotopic depletion, decay, and transmutation for many systems in a wide range of time scales. Application areas include nuclear reactor and spent fuel storage analyses, burnup credit evaluations, decay heat calculations, and environmental assessments. Although simple to use within the SCALE 6.0 code system, especially with the ORIGEN-ARP graphical user interface, it is generally complex to use as a component within an externally developed code suite because of its tight coupling within the infrastructure of the larger SCALE 6.0 system. The ORIGEN2 code, which has been widely integrated within other simulation suites, is no longer maintained by Oak Ridge National Laboratory (ORNL), has obsolete data, and has a relatively small validation database. Therefore, a modular version of the SCALE/ORIGEN-S code was developed to simplify its integration with other software packages to allow multi-physics nuclear code systems to easily incorporate the well-validated isotopic depletion, decay, and transmutation capability to perform realistic nuclear reactor and fuel simulations. SCALE/ORIGEN-S was extensively restructured to develop a modular version that allows direct access to the matrix solvers embedded in the code. Problem initialization and the solver were segregated to provide a simple application program interface and fewer input/output operations for the multi-physics nuclear code systems. Furthermore, new interfaces were implemented to access and modify the ORIGEN-S input variables and nuclear cross-section data through external drivers. Three example drivers were implemented, in the C, C++, and Fortran 90 programming languages, to demonstrate the modular use of the new capability. This modular version of SCALE/ORIGEN-S has been embedded within several multi-physics software development projects at ORNL, including

  14. Modular ORIGEN-S for multi-physics code systems

    International Nuclear Information System (INIS)

    Yesilyurt, Gokhan; Clarno, Kevin T.; Gauld, Ian C.; Galloway, Jack

    2011-01-01

    The ORIGEN-S code in the SCALE 6.0 nuclear analysis code suite is a well-validated tool to calculate the time-dependent concentrations of nuclides due to isotopic depletion, decay, and transmutation for many systems in a wide range of time scales. Application areas include nuclear reactor and spent fuel storage analyses, burnup credit evaluations, decay heat calculations, and environmental assessments. Although simple to use within the SCALE 6.0 code system, especially with the ORIGEN-ARP graphical user interface, it is generally complex to use as a component within an externally developed code suite because of its tight coupling within the infrastructure of the larger SCALE 6.0 system. The ORIGEN2 code, which has been widely integrated within other simulation suites, is no longer maintained by Oak Ridge National Laboratory (ORNL), has obsolete data, and has a relatively small validation database. Therefore, a modular version of the SCALE/ORIGEN-S code was developed to simplify its integration with other software packages to allow multi-physics nuclear code systems to easily incorporate the well-validated isotopic depletion, decay, and transmutation capability to perform realistic nuclear reactor and fuel simulations. SCALE/ORIGEN-S was extensively restructured to develop a modular version that allows direct access to the matrix solvers embedded in the code. Problem initialization and the solver were segregated to provide a simple application program interface and fewer input/output operations for the multi-physics nuclear code systems. Furthermore, new interfaces were implemented to access and modify the ORIGEN-S input variables and nuclear cross-section data through external drivers. Three example drivers were implemented, in the C, C++, and Fortran 90 programming languages, to demonstrate the modular use of the new capability. This modular version of SCALE/ORIGEN-S has been embedded within several multi-physics software development projects at ORNL, including

  15. Improvement of JRR-4 core management code system

    International Nuclear Information System (INIS)

    Izumo, H.; Watanabe, S.; Nagatomi, H.; Hori, N.

    2000-01-01

    In the modification of JRR-4, the fuel was changed from 93% high-enrichment uranium aluminized fuel to 20% low-enriched uranium silicide fuel in conformity with the framework of the reduced enrichment program for JAERI research reactors. Along with this change, the JRR-4 core management code system, which estimates the excess reactivity of the core, fuel burn-up and so on, was improved as well. It had been difficult for users to operate the former code system because its input and output were in text form. In the new code system (COMMAS-JRR), users are able to operate the code system without using difficult text-form input. The estimated excess reactivity of the JRR-4 LEU fuel core showed very good agreement with the measured value. The strong points of this new code system are that it is operated simply through windows-form screens on a personal workstation equipped with a graphical user interface (GUI), and that it accurately estimates the specific characteristics of the LEU core. (author)

  16. Systems genetics of complex diseases using RNA-sequencing methods

    DEFF Research Database (Denmark)

    Mazzoni, Gianluca; Kogelman, Lisette; Suravajhala, Prashanth

    2015-01-01

    Next generation sequencing technologies have enabled the generation of huge quantities of biological data, and nowadays extensive datasets at different ‘omics levels have been generated. Systems genetics is a powerful approach that allows integration of different ‘omics levels and understanding of the bio...

  17. The application of LDPC code in MIMO-OFDM system

    Science.gov (United States)

    Liu, Ruian; Zeng, Beibei; Chen, Tingting; Liu, Nan; Yin, Ninghao

    2018-03-01

    The combination of MIMO and OFDM technology has become one of the key technologies of fourth-generation mobile communication; it can overcome the frequency-selective fading of the wireless channel, increase the system capacity and improve frequency utilization. Error-correcting coding introduced into the system can further improve its performance. The LDPC (low density parity check) code is a kind of error-correcting code which can improve system reliability and anti-interference ability, and its decoding is simple and easy to implement. This paper mainly discusses the application of the LDPC code in the MIMO-OFDM system.

  18. The PASC-3 code system and the UNIPASC environment

    International Nuclear Information System (INIS)

    Pijlgroms, B.J.; Oppe, J.; Oudshoorn, H.

    1991-08-01

    A brief description is given of the PASC-3 (Petten-AMPX-SCALE) Reactor Physics code system and its associated UNIPASC work environment. The PASC-3 code system is used for criticality and reactor calculations and consists of a selection from the Oak Ridge National Laboratory AMPX-SCALE-3 code collection, complemented with a number of additional codes and nuclear data bases. The original codes have been adapted to run under the UNIX operating system. The recommended nuclear data base is a complete 219-group cross section library derived from JEF-1, for which some benchmark results are presented. The addition of the UNIPASC work environment greatly simplifies the usage of the code system. Complex chains of programs can easily be coupled together to form a single job. In addition, the model parameters can be represented by variables instead of literal values, which enhances the readability and may improve the integrity of the code inputs. (author). 8 refs.; 6 figs.; 1 tab

  19. On the genetic effects of low-level tritium

    International Nuclear Information System (INIS)

    Hori, Tada-aka; Nakai, Sayaka

    1976-01-01

    Genetic risk assessment of the potential hazard to man from environmental tritium becomes important as the nuclear power industry grows. The purpose of this short review is to discuss the possible genetic effects of tritium from the viewpoint of genetic risk estimation. The discussion is based mainly on our experimental results on the chromosome aberrations induced in human lymphocytes by tritium at very low levels. The chromosome aberrations induced by radiation from tritium incorporated into the cells are mostly of the chromatid type. The most interesting finding is that the dose-response relationship observed for both tritiated water and tritiated thymidine is composed of two phases. Examination of the nature of this two-phase dose-response relationship is very important not only for understanding the mechanisms of chromosome aberrations, but also for the evaluation of genetic risk from low-level radiation. (auth.)

  20. Genetic damage from low-level and natural background radiation

    International Nuclear Information System (INIS)

    Oftedal, P.

    1988-01-01

    Relevant predictions that have been made of possible low level biological effects on man are reviewed, and the estimate of genetic damage is discussed. It is concluded that in spite of a number of attempts, no clear-cut case of effects in human populations of radiation at natural levels has been demonstrated. The stability of genetic material is dynamic, with damage, repair and selection running as continuous processes. Genetic materials are well protected and are conservative in the extreme, not least because evolution by genetic adaptation is an expensive process: substitution of one allele A1 by another A2 means the death of the whole A1 population.

  1. Dimensioning BCH codes for coherent DQPSK systems with laser phase noise and cycle slips

    DEFF Research Database (Denmark)

    Leong, Miu Yoong; Larsen, Knud J.; Jacobsen, Gunnar

    2014-01-01

    Forward error correction (FEC) plays a vital role in coherent optical systems employing multi-level modulation. However, much of coding theory assumes that additive white Gaussian noise (AWGN) is dominant, whereas coherent optical systems have significant phase noise (PN) in addition to AWGN...... approach for a target post-FEC BER of 10^-5. Codes dimensioned with our bivariate binomial model meet the target within 0.2-dB signal-to-noise ratio....

  2. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes, aiming to line up analysis codes for heat-transferring flow and control characteristics, with HTGR plants as the main object. In order to model the flow when shock waves propagate into heating tubes, SALE-3D, which can analyze a complex system, was developed; it is therefore reported in this paper. Concerning the analysis code for control characteristics, the method of sensitivity analysis in a topological space is reported, including an example of its application. The flow analysis code SALE-3D analyzes the flow of a compressible viscous fluid in a three-dimensional system over the velocity range from the incompressibility limit to supersonic velocity. The fundamental equations and fundamental algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis in a topological space was added to the analysis code for control characteristics, and blow-down phenomena were analyzed by its application. (Kako, I.)

  3. I-Ching, dyadic groups of binary numbers and the geno-logic coding in living bodies.

    Science.gov (United States)

    Hu, Zhengbing; Petoukhov, Sergey V; Petukhova, Elena S

    2017-12-01

    The ancient Chinese book I-Ching was written a few thousand years ago. It introduces the system of symbols Yin and Yang (equivalents of 0 and 1). It had a powerful impact on the culture, medicine and science of ancient China and several other countries. From the modern standpoint, the I-Ching declares the importance of dyadic groups of binary numbers for Nature. The system of the I-Ching is represented by tables with dyadic groups of 4 bigrams, 8 trigrams and 64 hexagrams, which were declared to be fundamental archetypes of Nature. The ancient Chinese did not know about the genetic code of protein sequences of amino acids, but this code is organized in accordance with the I-Ching: in particular, the genetic code is constructed on DNA molecules using 4 nitrogenous bases, 16 doublets, and 64 triplets. The article also describes the usage of dyadic groups as a foundation of the bio-mathematical doctrine of the geno-logic code, which exists in parallel with the known genetic code of amino acids but serves a different goal: to code the inherited algorithmic processes using the logical holography and the spectral logic of systems of genetic Boolean functions. Some relations of this doctrine with the I-Ching are discussed. In addition, the ratios of musical harmony that can be revealed in the parameters of DNA structure are also represented in the I-Ching book. Copyright © 2017 Elsevier Ltd. All rights reserved.
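
    The counts 4, 16 and 64 arise simply because DNA texts are words over a four-letter alphabet, and a binary attribute on the bases maps them onto dyadic groups of binary numbers. The sketch below enumerates the doublets and triplets and labels them with one such attribute (purine vs. pyrimidine); treating purines as 0 and pyrimidines as 1 is an illustrative convention, not the paper's exact numbering.

        from itertools import product

        # One of the molecular binary attributes discussed in this line of work:
        # purine (A, G) vs. pyrimidine (C, T).  Purine = 0, pyrimidine = 1 is an
        # illustrative convention for this sketch.
        PURINE_PYRIMIDINE = {"A": 0, "G": 0, "C": 1, "T": 1}

        bases = "ACGT"
        doublets = ["".join(p) for p in product(bases, repeat=2)]   # 16 doublets
        triplets = ["".join(p) for p in product(bases, repeat=3)]   # 64 triplets

        def dyadic_index(oligo):
            """Map an oligonucleotide to a member of the dyadic group {0,1}^n."""
            bits = [PURINE_PYRIMIDINE[b] for b in oligo]
            return "".join(map(str, bits))

        print(len(doublets), len(triplets))                  # 16 64
        print({t: dyadic_index(t) for t in ["ATG", "GGC", "TAA"]})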

  4. L-type calcium channels refine the neural population code of sound level

    Science.gov (United States)

    Grimsley, Calum Alex; Green, David Brian

    2016-01-01

    The coding of sound level by ensembles of neurons improves the accuracy with which listeners identify how loud a sound is. In the auditory system, the rate at which neurons fire in response to changes in sound level is shaped by local networks. Voltage-gated conductances alter local output by regulating neuronal firing, but their role in modulating responses to sound level is unclear. We tested the effects of L-type calcium channels (CaL: CaV1.1–1.4) on sound-level coding in the central nucleus of the inferior colliculus (ICC) in the auditory midbrain. We characterized the contribution of CaL to the total calcium current in brain slices and then examined its effects on rate-level functions (RLFs) in vivo using single-unit recordings in awake mice. CaL is a high-threshold current and comprises ∼50% of the total calcium current in ICC neurons. In vivo, CaL activates at sound levels that evoke high firing rates. In RLFs that increase monotonically with sound level, CaL boosts spike rates at high sound levels and increases the maximum firing rate achieved. In different populations of RLFs that change nonmonotonically with sound level, CaL either suppresses or enhances firing at sound levels that evoke maximum firing. CaL multiplies the gain of monotonic RLFs with dynamic range and divides the gain of nonmonotonic RLFs with the width of the RLF. These results suggest that a single broad class of calcium channels activates enhancing and suppressing local circuits to regulate the sensitivity of neuronal populations to sound level. PMID:27605536

  5. Using machine-coded event data for the micro-level study of political violence

    Directory of Open Access Journals (Sweden)

    Jesse Hammond

    2014-07-01

    Machine-coded datasets likely represent the future of event data analysis. We assess the use of one of these datasets—the Global Database of Events, Language and Tone (GDELT)—for the micro-level study of political violence by comparing it to two hand-coded conflict event datasets. Our findings indicate that GDELT should be used with caution for geo-spatial analyses at the subnational level: its overall correlation with hand-coded data is mediocre, and at the local level major issues of geographic bias exist in how events are reported. Overall, our findings suggest that due to these issues, researchers studying local conflict processes may want to wait for a more reliable geocoding method before relying too heavily on this set of machine-coded data.

  6. RELAP5/MOD3 code manual: Code structure, system models, and solution methods. Volume 1

    International Nuclear Information System (INIS)

    1995-08-01

    The RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients, such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal-hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. RELAP5/MOD3 code documentation is divided into seven volumes: Volume I provides modeling theory and associated numerical schemes

  7. Sequence Coding and Search System for licensee event reports: code listings. Volume 2

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated on a continual basis. Volume 2 contains all valid and acceptable codes used for searching and encoding the LER data. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 2

  8. Manual for IRS Coding. Joint IAEA/NEA International Reporting System for Operating Experience

    International Nuclear Information System (INIS)

    2011-01-01

    The International Reporting System for Operating Experience (IRS) is jointly operated by the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). In early 2010, the IAEA and OECD/NEA jointly issued the IRS Guidelines, which described the reporting system and process and gave users the necessary elements to enable them to produce IRS reports to a high standard of quality while retaining the effectiveness of the system expected by all Member States operating nuclear power plants. The purpose of the present Manual for IRS Coding is to provide supplementary guidance specifically on the coding element of IRS reports to ensure uniform coding of events that are reported through IRS. This Coding Manual does not supersede the IRS Guidelines, but rather, supports users and preparers in achieving a consistent and high level of quality in their IRS reports. Consistency and high quality in the IRS reports allow stakeholders to search and retrieve specific event information with ease. In addition, well-structured reports also enhance the efficient management of the IRS database. This Coding Manual will give specific guidance on the application of each section of the IRS codes, with examples where necessary, of when and how these codes are to be applied. As this reporting system is owned by the Member States, this manual has been developed and approved by the IRS National Coordinators with the assistance of the IAEA and NEA secretariats

  9. Development of the next generation reactor analysis code system, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Nagaya, Yasunobu; Chiba, Go; Kugo, Teruhiko; Ishikawa, Makoto; Tatsumi, Masahiro; Hirai, Yasushi; Hyoudou, Hideaki; Numata, Kazuyuki; Iwai, Takehiko; Jin, Tomoyuki

    2011-03-01

    A next-generation reactor analysis code system, MARBLE, has been developed. MARBLE is a successor of the fast reactor neutronics analysis code systems JOINT-FR and SAGEP-FR (the conventional systems), which were developed for the so-called JUPITER standard analysis methods. MARBLE has analysis capability equivalent to the conventional systems because it can utilize the sub-codes included in them without any change. On the other hand, burnup analysis functionality for power reactors is improved compared with the conventional systems by introducing models of fuel exchange treatment, control rod operation and so on. In addition, MARBLE has newly developed solvers and some new features, such as burnup calculation by the Krylov sub-space method and nuclear design accuracy evaluation by the extended bias factor method. In the development of MARBLE, object-oriented technology was adopted from the viewpoint of improving software quality, such as flexibility, expansibility, ease of verification through modularization, and support for co-development. A software structure called the two-layer system, consisting of a scripting language and a system development language, was applied. As a result, MARBLE is not an independent analysis code system which simply receives input and returns output, but an assembly of components for building an analysis code system (i.e. a framework). Furthermore, MARBLE provides some pre-built analysis code systems, such as the fast reactor neutronics analysis code system SCHEME, which corresponds to the conventional code, and the fast reactor burnup analysis code system ORPHEUS. (author)

  10. A New Coding System for Metabolic Disorders Demonstrates Gaps in the International Disease Classifications ICD-10 and SNOMED-CT, Which Can Be Barriers to Genotype-Phenotype Data Sharing

    NARCIS (Netherlands)

    Sollie, Annet; Sijmons, Rolf H.; Lindhout, Dick; van der Ploeg, Ans T.; Gozalbo, M. Estela Rubio; Smit, G. Peter A.; Verheijen, Frans; Waterham, Hans R.; van Weely, Sonja; Wijburg, Frits A.; Wijburg, Rudolph; Visser, Gepke

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of

  11. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others.

  12. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others

  13. Integrated Validation System for a Thermal-hydraulic System Code, TASS/SMR-S

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee-Kyung; Kim, Hyungjun; Kim, Soo Hyoung; Hwang, Young-Dong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Hyeon-Soo [Chungnam National University, Daejeon (Korea, Republic of)

    2015-10-15

    Development, including enhancement and modification, of a thermal-hydraulic system computer code is indispensable for a new reactor such as SMART. Usually, thermal-hydraulic system code validation is achieved by comparison with the results of corresponding physical effect tests. In the reactor safety field, a similar concept, referred to as separate effect tests, has been used for a long time. However, there are a great many test data to compare against, because a lot of separate effect tests and integral effect tests are required for a code validation, and it is not easy for a code developer to validate a computer code whenever a code modification occurs. IVS automatically produces graphs that compare the code calculation results with the corresponding test results. IVS was developed for the validation of the TASS/SMR-S code. The code validation is achieved by comparing code calculation results with the corresponding test results, and this comparison is represented as graphs for convenience. IVS is useful before releasing a new code version: the code developer can easily validate code results using IVS. Even during code development, IVS can be used for validation of code modifications, so the developer can gain confidence in a modification easily and quickly and be freed from tedious and long validation work. The popular software introduced in IVS supplies good usability and portability.

  14. Temporal code in the vibrissal system-Part II: Roughness surface discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Farfan, F D [Departamento de BioingenierIa, FACET, Universidad Nacional de Tucuman, INSIBIO - CONICET, CC 327, Postal Code CP 4000 (Argentina); AlbarracIn, A L [Catedra de Neurociencias, Facultad de Medicina, Universidad Nacional de Tucuman (Argentina); Felice, C J [Departamento de BioingenierIa, FACET, Universidad Nacional de Tucuman, INSIBIO - CONICET, CC 327, Postal Code CP 4000 (Argentina)

    2007-11-15

    Previous works have proposed hypotheses about the neural code of the tactile system in the rat. One of them is based on the physical characteristics of the vibrissae, such as their resonance frequencies; another is based on discharge patterns in the trigeminal ganglion. In this work, the purpose is to find a temporal code by analyzing the afferent signals of two vibrissal nerves while the vibrissae sweep surfaces of different roughness. Two levels of pressure were used between the vibrissa and the contact surface. We analyzed the afferent discharge of the DELTA and GAMMA vibrissal nerves. The vibrissal movements were produced using electrical stimulation of the facial nerve. The afferent signals were analyzed using an event detection algorithm based on the Continuous Wavelet Transform (CWT). The algorithm was able to detect events of different duration. The inter-event times detected were calculated for each situation and represented in box plots. This work allowed establishing the existence of a temporal code at the peripheral level.

  15. Temporal code in the vibrissal system-Part II: Roughness surface discrimination

    International Nuclear Information System (INIS)

    Farfan, F D; AlbarracIn, A L; Felice, C J

    2007-01-01

    Previous works have proposed hypotheses about the neural code of the tactile system in the rat. One of them is based on the physical characteristics of the vibrissae, such as their resonance frequencies; another is based on discharge patterns in the trigeminal ganglion. In this work, the purpose is to find a temporal code by analyzing the afferent signals of two vibrissal nerves while the vibrissae sweep surfaces of different roughness. Two levels of pressure were used between the vibrissa and the contact surface. We analyzed the afferent discharge of the DELTA and GAMMA vibrissal nerves. The vibrissal movements were produced using electrical stimulation of the facial nerve. The afferent signals were analyzed using an event detection algorithm based on the Continuous Wavelet Transform (CWT). The algorithm was able to detect events of different duration. The inter-event times detected were calculated for each situation and represented in box plots. This work allowed establishing the existence of a temporal code at the peripheral level

  16. Bar-code automated waste tracking system

    International Nuclear Information System (INIS)

    Hull, T.E.

    1994-10-01

    The Bar-Code Automated Waste Tracking System was designed to be a site-specific program with a general-purpose application for transportability to other facilities. The system is user-friendly, totally automated, and incorporates the use of a drive-up window that is close to the areas dealing in container preparation, delivery, pickup, and disposal. The system features 'stop-and-go' operation rather than long, tedious, error-prone manual entry. The system is designed for automation but allows operators to concentrate on proper handling of waste while maintaining manual entry of data as a backup. A large wall plaque filled with bar-code labels is used to input specific details about any movement of waste

  17. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, of both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of coded cooperation employing the jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
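
    A girth-4 (length-4) cycle in the Tanner graph corresponds to two columns of the parity-check matrix that share ones in at least two rows, so its presence can be tested directly on the matrix. The sketch below implements that test on small made-up matrices; the actual QC-LDPC constructions of the paper are not reproduced here.

        import numpy as np
        from itertools import combinations

        def has_girth4_cycle(H):
            """A length-4 cycle exists in the Tanner graph iff two columns of the
            parity-check matrix share ones in at least two rows."""
            H = np.asarray(H)
            for i, j in combinations(range(H.shape[1]), 2):
                if np.sum(H[:, i] & H[:, j]) >= 2:
                    return True
            return False

        # Small illustrative matrices (not the paper's QC-LDPC constructions).
        H_with_cycle = [[1, 1, 0],
                        [1, 1, 1],
                        [0, 0, 1]]
        H_without    = [[1, 1, 0],
                        [1, 0, 1],
                        [0, 1, 1]]
        print(has_girth4_cycle(H_with_cycle), has_girth4_cycle(H_without))  # True False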

  18. Source Code Vulnerabilities in IoT Software Systems

    Directory of Open Access Journals (Sweden)

    Saleh Mohamed Alnaeli

    2017-08-01

    An empirical study that examines the usage of known vulnerable statements in software systems developed in C/C++ and used for IoT is presented. The study is conducted on 18 open source systems comprised of millions of lines of code and containing thousands of files. Static analysis methods are applied to each system to determine the number of unsafe commands (e.g., strcpy, strcmp, and strlen) that are well known among research communities to cause potential risks and security concerns, thereby decreasing a system’s robustness and quality. These unsafe statements are banned by many companies (e.g., Microsoft). The use of these commands should be avoided from the start when writing code and should be removed from legacy code over time, as recommended by new C/C++ language standards. Each system is analyzed and the distribution of the known unsafe commands is presented. Historical trends in the usage of the unsafe commands in 7 of the systems are presented to show how the studied systems evolved over time with respect to the vulnerable code. The results show that the most prevalent unsafe command used across most systems is memcpy, followed by strlen. These results can be used to help train software developers in secure coding practices so that they can write higher quality software systems.
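
    As a rough illustration of the kind of counting such a study performs, the sketch below does a purely lexical scan of a C/C++ source tree for calls to a list of commonly flagged functions. The function list and directory name are assumptions made for the example, and a lexical scan is far weaker than real static analysis (it does not parse code or skip comments and string literals).

        import re
        from pathlib import Path
        from collections import Counter

        # Functions commonly flagged as unsafe in studies of this kind; the exact
        # list used by the authors may differ.
        UNSAFE = ["strcpy", "strcat", "sprintf", "gets", "strlen", "strcmp", "memcpy"]
        CALL_RE = re.compile(r"\b(" + "|".join(UNSAFE) + r")\s*\(")

        def count_unsafe_calls(root):
            """Count occurrences of known unsafe C/C++ calls under a source tree."""
            counts = Counter()
            for path in Path(root).rglob("*"):
                if path.suffix in {".c", ".cc", ".cpp", ".h", ".hpp"}:
                    text = path.read_text(errors="ignore")
                    counts.update(m.group(1) for m in CALL_RE.finditer(text))
            return counts

        if __name__ == "__main__":
            # Point this at a local checkout of any C/C++ project (hypothetical path).
            print(count_unsafe_calls("./some_iot_project").most_common())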

  19. SALT [System Analysis Language Translater]: A steady state and dynamic systems code

    International Nuclear Information System (INIS)

    Berry, G.; Geyer, H.

    1983-01-01

    SALT (System Analysis Language Translater) is a lumped-parameter approach to system analysis which is totally modular. The modules are all precompiled and only the main program, which is generated by SALT, needs to be compiled for each unique system configuration. This is a departure from other lumped-parameter codes, where all models are written as macros and then compiled for each unique configuration, usually after all of the models are lumped together and sorted to eliminate undetermined variables. The SALT code contains a robust and sophisticated steady-state finder (non-linear equation solver), optimization capability and an enhanced GEAR integration scheme which makes use of sparsity and algebraic constraints. The SALT systems code has been used for various technologies. The code was originally developed for open-cycle magnetohydrodynamic (MHD) systems. It was easily extended to liquid-metal MHD systems by simply adding the appropriate models and property libraries. Similarly, the model and property libraries were expanded to handle fuel cell systems, flue gas desulfurization systems, combined cycle gasification systems, fluidized bed combustion systems, ocean thermal energy conversion systems, geothermal systems, nuclear systems, and conventional coal-fired power plants. Obviously, the SALT systems code is extremely flexible, to be able to handle all of these diverse systems. At present, the dynamic option has only been used for LMFBR nuclear power plants and geothermal power plants. However, it can easily be extended to other systems and can be used for analyzing control problems. 12 refs

  20. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are the CAFTS-SALP software package, FRANTIC, FTAP, the RALLY computer code package, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analyses, as well as the computation times, are compared

  1. Guidelines for selecting codes for ground-water transport modeling of low-level waste burial sites. Executive summary

    International Nuclear Information System (INIS)

    Simmons, C.S.; Cole, C.R.

    1985-05-01

    This document was written to provide guidance to managers and site operators on how ground-water transport codes should be selected for assessing burial site performance. There is a need for a formal approach to selecting appropriate codes from the multitude of potentially useful ground-water transport codes that are currently available. Code selection is a problem that requires more than merely considering mathematical equation-solving methods. These guidelines are very general and flexible and are also meant for developing systems simulation models to be used to assess the environmental safety of low-level waste burial facilities. Code selection is only a single aspect of the overall objective of developing a systems simulation model for a burial site. The guidance given here is mainly directed toward applications-oriented users, but managers and site operators need to be familiar with this information to direct the development of scientifically credible and defensible transport assessment models. Some specific advice for managers and site operators on how to direct a modeling exercise is based on the following five steps: identify specific questions and study objectives; establish costs and schedules for achieving answers; enlist the aid of a professional model applications group; decide on an approach with the applications group and guide code selection; and facilitate the availability of site-specific data. These five steps for managers/site operators are discussed in detail following an explanation of the nine systems model development steps, which are presented first to clarify what code selection entails

  2. The Effects of Predator Evolution and Genetic Variation on Predator-Prey Population-Level Dynamics.

    Science.gov (United States)

    Cortez, Michael H; Patel, Swati

    2017-07-01

    This paper explores how predator evolution and the magnitude of predator genetic variation alter the population-level dynamics of predator-prey systems. We do this by analyzing a general eco-evolutionary predator-prey model using four methods: Method 1 identifies how eco-evolutionary feedbacks alter system stability in the fast and slow evolution limits; Method 2 identifies how the amount of standing predator genetic variation alters system stability; Method 3 identifies how the phase lags in predator-prey cycles depend on the amount of genetic variation; and Method 4 determines conditions for different cycle shapes in the fast and slow evolution limits using geometric singular perturbation theory. With these four methods, we identify the conditions under which predator evolution alters system stability and the shapes of predator-prey cycles, and how those effects depend on the amount of genetic variation in the predator population. We discuss the advantages and disadvantages of each method and the relations between the four methods. This work shows how the four methods can be used in tandem to make general predictions about eco-evolutionary dynamics and feedbacks.
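
    A minimal numerical illustration of an eco-evolutionary predator-prey system is sketched below: prey and predator densities are coupled to a mean predator trait that climbs its fitness gradient at a speed set by the additive genetic variance V, so the fast- and slow-evolution limits correspond to large and small V. The equations, trade-off and parameter values are generic assumptions for illustration, not the model analyzed by the authors.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Prey N, predator P, and mean predator attack rate a.  The predator pays a
        # quadratic mortality cost d(a) = d0 + k*a**2 for a higher attack rate, and
        # the trait evolves up the per-capita fitness gradient with speed V.
        r, K, c, d0, k, V = 1.0, 10.0, 0.5, 0.4, 0.3, 0.05

        def rhs(t, y):
            N, P, a = y
            dN = r * N * (1 - N / K) - a * N * P
            dP = (c * a * N - (d0 + k * a**2)) * P
            da = V * (c * N - 2 * k * a)        # dW/da for the predator
            return [dN, dP, da]

        sol = solve_ivp(rhs, (0, 200), [5.0, 1.0, 0.5])
        print(sol.y[:, -1])   # final prey density, predator density, and mean trait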

  3. Level of awareness of genetic counselling in Lagos, Nigeria: its ...

    African Journals Online (AJOL)

    Level of awareness of genetic counselling in Lagos, Nigeria: its advocacy on the inheritance of sickle cell disease. ... and the level of awareness about genetic counseling in 30 hospitals were carried out. ...

  4. Coding response to a case-mix measurement system based on multiple diagnoses.

    Science.gov (United States)

    Preyra, Colin

    2004-08-01

    To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.

  5. Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses

    Science.gov (United States)

    Preyra, Colin

    2004-01-01

    Objective To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Study Design Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. PMID:15230940

  6. Orion: Detecting regions of the human non-coding genome that are intolerant to variation using population genetics.

    Science.gov (United States)

    Gussow, Ayal B; Copeland, Brett R; Dhindsa, Ryan S; Wang, Quanli; Petrovski, Slavé; Majoros, William H; Allen, Andrew S; Goldstein, David B

    2017-01-01

    There is broad agreement that genetic mutations occurring outside of the protein-coding regions play a key role in human disease. Despite this consensus, we are not yet capable of discerning which portions of non-coding sequence are important in the context of human disease. Here, we present Orion, an approach that detects regions of the non-coding genome that are depleted of variation, suggesting that the regions are intolerant of mutations and subject to purifying selection in the human lineage. We show that Orion is highly correlated with known intolerant regions as well as regions that harbor putatively pathogenic variation. This approach provides a mechanism to identify pathogenic variation in the human non-coding genome and will have immediate utility in the diagnostic interpretation of patient genomes and in large case control studies using whole-genome sequences.
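
    The underlying idea can be illustrated with a toy sliding-window scan: count the variants observed in each window, compare with the expectation under a uniform variation rate, and flag windows that are unusually depleted. The scoring below is a simplified stand-in for the published Orion statistic and uses simulated variant positions.

        import numpy as np

        def depletion_scores(variant_positions, region_length, window=1000, step=100):
            """Toy intolerance scan: compare the observed variant count in each
            sliding window with the expectation under a uniform variation rate.
            Positive scores mark variant-depleted windows.  This is a simplified
            illustration, not the published Orion statistic."""
            positions = np.sort(np.asarray(variant_positions))
            rate = len(positions) / region_length        # variants per base overall
            starts = np.arange(0, region_length - window + 1, step)
            scores = []
            for s in starts:
                observed = np.searchsorted(positions, s + window) - np.searchsorted(positions, s)
                expected = rate * window
                scores.append((s, (expected - observed) / np.sqrt(expected)))
            return scores

        rng = np.random.default_rng(0)
        sites = rng.integers(0, 100_000, size=2_000)        # simulated variant sites
        sites = sites[(sites < 20_000) | (sites > 25_000)]  # carve out a depleted stretch
        top = max(depletion_scores(sites, 100_000), key=lambda x: x[1])
        print(top)   # the most depleted window falls inside the 20,000-25,000 stretch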

  7. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Richard Stahl

    2007-01-01

    A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  8. Web- and system-code based, interactive, nuclear power plant simulators

    International Nuclear Information System (INIS)

    Kim, K. D.; Jain, P.; Rizwan, U.

    2006-01-01

    Using two different approaches, on-line, web- and system-code based graphical user interfaces have been developed for reactor system analysis. Both are LabVIEW (graphical programming language developed by National Instruments) based systems that allow local users as well as those at remote sites to run, interact and view the results of the system code in a web browser. In the first approach, only the data written by the system code in a tab separated ASCII output file is accessed and displayed graphically. In the second approach, LabVIEW virtual instruments are coupled with the system code as dynamic link libraries (DLL). RELAP5 is used as the system code to demonstrate the capabilities of these approaches. From collaborative projects between teams in geographically remote locations to providing system code experience to distance education students, these tools can be very beneficial in many areas of teaching and R and D. (authors)

  9. Genetic influences on level and stability of self-esteem

    OpenAIRE

    Neiss, Michelle; Sedikides, Constantine; Stevenson, Jim

    2006-01-01

    We attempted to clarify the relation between self-esteem level (high vs. low) and perceived self-esteem stability (within-person variability) by using a behavioral genetics approach. We tested whether the same or independent genetic and environmental influences impact on level and stability. Adolescent twin siblings (n = 183 pairs) completed level and stability scales at two time points. Heritability for both was substantial. The remaining variance in each was attributable to non-shared envir...

  10. Lossless, Near-Lossless, and Refinement Coding of Bi-level Images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren Otto

    1997-01-01

    We present general and unified algorithms for lossy/lossless coding of bi-level images. The compression is realized by applying arithmetic coding to conditional probabilities. As in the current JBIG standard, the conditioning may be specified by a template. For better compression, the more general...... Introducing only a small amount of loss in halftoned test images, compression is increased by up to a factor of four compared with JBIG. Lossy, lossless, and refinement decoding speed and lossless encoding speed are less than a factor of two slower than JBIG. The (de)coding method is proposed as part of JBIG-2, an emerging international standard for lossless/lossy compression of bi-level images....
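
    The core mechanism, arithmetic coding driven by template-conditioned probabilities, can be sketched with a tiny causal template of two neighbouring pixels. The snippet below adaptively estimates the conditional probabilities and accumulates the ideal code length (the rate an arithmetic coder would approach); the two-pixel template is a deliberate simplification of the larger JBIG templates.

        import numpy as np

        def template_code_length(img):
            """Estimate the ideal (arithmetic-coding) code length of a bi-level image
            using a small causal template: the pixel above and the pixel to the left."""
            img = np.asarray(img, dtype=int)
            counts = np.ones((2, 2, 2))          # Laplace-smoothed counts[up, left, pixel]
            bits = 0.0
            H, W = img.shape
            for y in range(H):
                for x in range(W):
                    up = img[y - 1, x] if y > 0 else 0
                    left = img[y, x - 1] if x > 0 else 0
                    p = counts[up, left, img[y, x]] / counts[up, left].sum()
                    bits += -np.log2(p)          # ideal code length under the model
                    counts[up, left, img[y, x]] += 1
            return bits

        img = np.zeros((32, 32), dtype=int)
        img[8:24, 8:24] = 1                       # a simple black square on white
        print(round(template_code_length(img), 1), "bits for", img.size, "pixels")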

  11. Development of an advanced code system for fast-reactor transient analysis

    International Nuclear Information System (INIS)

    Konstantin Mikityuk; Sandro Pelloni; Paul Coddington

    2005-01-01

    FAST (Fast-spectrum Advanced Systems for power production and resource management) is a recently approved PSI activity in the area of fast-spectrum core and safety analysis, with emphasis on generic developments and Generation IV systems. Within the framework of the FAST project we will study static and transient core physics, reactor system behaviour and safety, and related international experiments. The main current goal of the project is to develop a unique analytical and code capability for core and safety analysis of critical (and sub-critical) fast-spectrum systems, with an initial emphasis on gas-cooled fast reactors. The structure of the code system is shown in Fig. 1. The main components of the FAST code system are 1) the ERANOS code for preparation of basic cross-sections and their partial derivatives; 2) the PARCS transient nodal-method multi-group neutron diffusion code for simulation of spatial (3D) neutron kinetics in hexagonal and square geometries; 3) the TRAC/AAA code for system thermal hydraulics; 4) the FRED transient model for fuel thermal-mechanical behaviour; 5) the PVM system as an interface between the separate parts of the code system. The paper presents the structure of the code system (Fig. 1), the organization of interfaces and data exchanges between the main parts of the code system, and examples of verification and application of the separate codes and of the system as a whole. (authors)

  12. Genetic factors influencing ferritin levels in 14,126 blood donors

    DEFF Research Database (Denmark)

    Sørensen, Erik; Rigas, Andreas S; Thørner, Lise W

    2015-01-01

    BACKGROUND: Many biologic functions depend on sufficient iron levels, and iron deficiency is especially common among blood donors. Genetic variants associated with iron levels have been identified, but the impact of genetic variation on iron levels among blood donors remains unclear. STUDY DESIGN AND METHODS: The effect of six single-nucleotide polymorphisms (SNPs) on ferritin levels in 14,126 blood donors was investigated in four genes: in the Human Hemochromatosis Protein gene (HFE; rs1800562 and rs179945); in the Transmembrane Protease, Serine 6 gene (TMPRSS6, regulating hepcidin; rs855791); in the BTB domain...... with iron deficiency in women. Results for all other genetic variants were insignificant. CONCLUSION: Genetic variants associated with hemochromatosis may protect donors against depleted iron stores. In addition, we showed that presence of the T-allele at rs855791 in TMPRSS6 was associated with lower iron...

  13. Genetic Recombination Between Stromal and Cancer Cells Results in Highly Malignant Cells Identified by Color-Coded Imaging in a Mouse Lymphoma Model.

    Science.gov (United States)

    Nakamura, Miki; Suetsugu, Atsushi; Hasegawa, Kousuke; Matsumoto, Takuro; Aoki, Hitomi; Kunisada, Takahiro; Shimizu, Masahito; Saji, Shigetoyo; Moriwaki, Hisataka; Hoffman, Robert M

    2017-12-01

    The tumor microenvironment (TME) promotes tumor growth and metastasis. We previously established the color-coded EL4 lymphoma TME model, with red fluorescent protein (RFP)-expressing EL4 implanted in transgenic C57BL/6 green fluorescent protein (GFP) mice. Color-coded imaging of the lymphoma TME suggested an important role of stromal cells in lymphoma progression and metastasis. In the present study, we used color-coded imaging of RFP lymphoma cells and GFP stromal cells to identify yellow-fluorescent genetically recombinant cells appearing only during metastasis. The EL4-RFP lymphoma cells were injected subcutaneously into C57BL/6-GFP transgenic mice and formed subcutaneous tumors 14 days after cell transplantation. The subcutaneous tumors were harvested and transplanted to the abdominal cavity of nude mice. Metastases to the liver, perigastric lymph node, ascites, bone marrow, and primary tumor were imaged. In addition to EL4-RFP cells and GFP host cells, genetically recombinant yellow-fluorescent cells were observed only in the ascites and bone marrow. These results indicate genetic exchange between the stromal and cancer cells. Possible mechanisms of genetic exchange are discussed, as well as its ramifications for metastasis. J. Cell. Biochem. 118: 4216-4221, 2017. © 2017 Wiley Periodicals, Inc.

  14. Code system for fast reactor neutronics analysis

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Abe, Junji; Sato, Wakaei.

    1983-04-01

    A code system for the analysis of fast reactor neutronics has been developed for the purposes of convenient use and error reduction. The JOINT code produces the input data file to be used in the neutronics calculation codes and also prepares the cross section library file in an assigned format. The effective cross sections are saved in a PDS file with a unified format. At the present stage, this code system includes the following codes: SLAROM, ESELEM5, EXPANDA-G for the production of effective cross sections, and CITATION-FBR, ANISN-JR, TWOTRAN2, PHENIX, 3DB, MORSE, CIPER and SNPERT. In the course of the development, some utility programs and service programs have additionally been developed. These are used for accessing the PDS file, editing the cross sections, and graphic display. Included in this report are descriptions of the input data formats of JOINT and the other programs, and of the functions of each subroutine and utility program. The usage of the PDS file is also explained. In Appendix A, the input formats are described for the revised version of the CIPER code. (author)

  15. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    Science.gov (United States)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

    A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, a simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate for a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.
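
    For readers unfamiliar with the signature sequences, the sketch below builds a Hadamard matrix by the Sylvester construction and derives unipolar spreading codes from its rows, printing their fixed weight and in-phase cross-correlation (the property SAC detection exploits). Mapping the ±1 rows to 0/1 chips and discarding the all-ones row is a common convention and an assumption here; the paper's exact code construction may differ.

        import numpy as np

        def hadamard(n):
            """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
            H = np.array([[1]])
            while H.shape[0] < n:
                H = np.block([[H, H], [H, -H]])
            return H

        H = hadamard(8)
        codes = (H[1:] + 1) // 2        # drop the all-ones row, map +1/-1 to 1/0 chips
        w = codes[0].sum()              # code weight (identical for every user)
        cross = codes[0] @ codes[1]     # in-phase cross-correlation between two users
        print(codes[0], codes[1], w, cross)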

  16. ATHENA code manual. Volume 1. Code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Carlson, K.E.; Roth, P.A.; Ransom, V.H.

    1986-09-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code has been developed to perform transient simulation of the thermal hydraulic systems which may be found in fusion reactors, space reactors, and other advanced systems. A generic modeling approach is utilized which permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of a complete facility. Several working fluids are available to be used in one or more interacting loops. Different loops may have different fluids with thermal connections between loops. The modeling theory and associated numerical schemes are documented in Volume I in order to acquaint the user with the modeling base and thus aid effective use of the code. The second volume contains detailed instructions for input data preparation

  17. Genetic classes and genetic categories : Protecting genetic groups through data protection law

    NARCIS (Netherlands)

    Hallinan, Dara; de Hert, Paul; Taylor, L.; Floridi, L.; van der Sloot, B.

    2017-01-01

    Each person shares genetic code with others. Thus, one individual’s genome can reveal information about other individuals. When multiple individuals share aspects of genetic architecture, they form a ‘genetic group’. From a social and legal perspective, two types of genetic group exist: Those which

  18. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as the 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  19. HELIAS module development for systems codes

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, F., E-mail: Felix.Warmer@ipp.mpg.de; Beidler, C.D.; Dinklage, A.; Egorov, K.; Feng, Y.; Geiger, J.; Schauer, F.; Turkin, Y.; Wolf, R.; Xanthopoulos, P.

    2015-02-15

    In order to study and design next-step fusion devices such as DEMO, comprehensive systems codes are commonly employed. In this work HELIAS-specific models are proposed which are designed to be compatible with systems codes. The subsequently developed models include: a geometry model based on Fourier coefficients which can represent the complex 3-D plasma shape, a basic island divertor model which assumes diffusive cross-field transport and high radiation at the X-point, and a coil model which combines scaling aspects based on the HELIAS 5-B reactor design in combination with analytic inductance and field calculations. In addition, stellarator-specific plasma transport is discussed. A strategy is proposed which employs a predictive confinement time scaling derived from 1-D neoclassical and 3-D turbulence simulations. This paper reports on the progress of the development of the stellarator-specific models while an implementation and verification study within an existing systems code will be presented in a separate work. This approach is investigated to ultimately allow one to conduct stellarator system studies, develop design points of HELIAS burning plasma devices, and to facilitate a direct comparison between tokamak and stellarator DEMO and power plant designs.
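
    As an illustration of the first of these models, a 3-D plasma boundary can be represented by a double Fourier series in the poloidal and toroidal angles and evaluated pointwise, which is all a systems code typically needs for volumes and surface areas. The sketch below uses a VMEC-style parameterization with made-up coefficients and field-period number; it is not the HELIAS 5-B geometry.

        import numpy as np

        def boundary_shape(theta, phi, Rmn, Zmn, nfp=5):
            """Evaluate a stellarator-like boundary from Fourier coefficients:
            R(theta, phi) = sum Rmn * cos(m*theta - n*nfp*phi),
            Z(theta, phi) = sum Zmn * sin(m*theta - n*nfp*phi)."""
            R = sum(c * np.cos(m * theta - n * nfp * phi) for (m, n), c in Rmn.items())
            Z = sum(c * np.sin(m * theta - n * nfp * phi) for (m, n), c in Zmn.items())
            return R, Z

        # Made-up illustrative coefficients (metres): major radius plus shaping terms.
        Rmn = {(0, 0): 22.0, (1, 0): 1.8, (1, 1): 0.4}
        Zmn = {(1, 0): 1.8, (1, 1): -0.4}

        theta = np.linspace(0, 2 * np.pi, 64)
        R, Z = boundary_shape(theta, phi=0.0, Rmn=Rmn, Zmn=Zmn)
        print(R.min(), R.max(), Z.min(), Z.max())   # extent of the cross-section at phi = 0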

  20. NALAP: an LMFBR system transient code

    International Nuclear Information System (INIS)

    Martin, B.A.; Agrawal, A.K.; Albright, D.C.; Epel, L.G.; Maise, G.

    1975-07-01

    NALAP is a LMFBR system transient code. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic response of sodium cooled fast breeder reactors when subjected to postulated accidents such as a massive pipe break as well as a variety of other upset conditions that do not disrupt the system geometry. Various components of the plant are represented by control volumes. These control volumes are connected by junctions some of which may be leak or fill junctions. The fluid flow equations are modeled as compressible, single-stream flow with momentum flux in one dimension. The transient response is computed by integrating the thermal-hydraulic conservation equations from user-initialized operating conditions by an implicit numerical scheme. Point kinetics approximation is used to represent the time dependent heat generation in the reactor core

  1. Systems level analysis of systemic sclerosis shows a network of immune and profibrotic pathways connected with genetic polymorphisms.

    Directory of Open Access Journals (Sweden)

    J Matthew Mahoney

    2015-01-01

    Systemic sclerosis (SSc) is a rare systemic autoimmune disease characterized by skin and organ fibrosis. The pathogenesis of SSc and its progression are poorly understood. The SSc intrinsic gene expression subsets (inflammatory, fibroproliferative, normal-like, and limited) are observed in multiple clinical cohorts of patients with SSc. Analysis of longitudinal skin biopsies suggests that a patient's subset assignment is stable over 6-12 months. Genetically, SSc is multi-factorial with many genetic risk loci for SSc generally and for specific clinical manifestations. Here we identify the genes consistently associated with the intrinsic subsets across three independent cohorts, show the relationship between these genes using a gene-gene interaction network, and place the genetic risk loci in the context of the intrinsic subsets. To identify gene expression modules common to three independent datasets from three different clinical centers, we developed a consensus clustering procedure based on mutual information of partitions, an information theory concept, and performed a meta-analysis of these genome-wide gene expression datasets. We created a gene-gene interaction network of the conserved molecular features across the intrinsic subsets and analyzed their connections with SSc-associated genetic polymorphisms. The network is composed of distinct, but interconnected, components related to interferon activation, M2 macrophages, adaptive immunity, extracellular matrix remodeling, and cell proliferation. The network shows extensive connections between the inflammatory- and fibroproliferative-specific genes. The network also shows connections between these subset-specific genes and 30 SSc-associated polymorphic genes including STAT4, BLK, IRF7, NOTCH4, PLAUR, CSK, IRAK1, and several human leukocyte antigen (HLA) genes. Our analyses suggest that the gene expression changes underlying the SSc subsets may be long-lived, but mechanistically interconnected
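
    Mutual information between partitions, the quantity the consensus-clustering step is built on, is straightforward to compute from the cluster label co-occurrence counts. The sketch below evaluates it (in bits) for two toy label vectors; published procedures typically use normalized or adjusted variants, and the labels here are illustrative.

        from collections import Counter
        from math import log2

        def mutual_information(part_a, part_b):
            """Mutual information (in bits) between two partitions of the same items,
            given as lists of cluster labels."""
            n = len(part_a)
            pa, pb = Counter(part_a), Counter(part_b)
            joint = Counter(zip(part_a, part_b))
            mi = 0.0
            for (a, b), n_ab in joint.items():
                p_ab = n_ab / n
                mi += p_ab * log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
            return mi

        labels_run1 = [0, 0, 0, 1, 1, 1, 2, 2]
        labels_run2 = [1, 1, 1, 0, 0, 2, 2, 2]   # similar structure, permuted names
        print(round(mutual_information(labels_run1, labels_run2), 3))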

  2. Channel coding for underwater acoustic single-carrier CDMA communication system

    Science.gov (United States)

    Liu, Lanjun; Zhang, Yonglei; Zhang, Pengcheng; Zhou, Lin; Niu, Jiong

    2017-01-01

    CDMA is an effective multiple access protocol for underwater acoustic networks, and channel coding can effectively reduce the bit error rate (BER) of an underwater acoustic communication system. To meet the requirements of underwater acoustic mobile networks based on CDMA, an underwater acoustic single-carrier CDMA communication system (UWA/SCCDMA) based on direct-sequence spread spectrum is proposed, and its channel coding scheme is studied based on convolutional, RA, Turbo and LDPC coding, respectively. The implementation steps of the Viterbi algorithm for convolutional coding, the BP and minimum-sum algorithms for RA coding, the Log-MAP and SOVA algorithms for Turbo coding, and the sum-product algorithm for LDPC coding are given. A UWA/SCCDMA simulation system based on Matlab was designed. Simulation results show that the UWA/SCCDMA systems based on RA, Turbo and LDPC coding have good performance, with a communication BER below 10^-6 in an underwater acoustic channel at low signal-to-noise ratios (SNR) from -12 dB to -10 dB, which is about two orders of magnitude lower than that of convolutional coding. The system based on Turbo coding with the Log-MAP algorithm has the best performance.
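
    Of the decoding algorithms listed, the Viterbi algorithm for convolutional codes is the simplest to sketch: encode with a shift register, then search the trellis for the input sequence whose codeword is closest in Hamming distance to what was received. The example below uses a rate-1/2, constraint-length-3 code with generators 7/5 (octal) and hard decisions; these parameters are illustrative assumptions, not those of the UWA/SCCDMA study.

        import itertools

        G = [(1, 1, 1), (1, 0, 1)]   # generator polynomials 7 and 5 (octal), K = 3

        def conv_encode(bits, state=(0, 0)):
            """Rate-1/2 convolutional encoder (constraint length 3, generators 7/5)."""
            out = []
            s1, s0 = state
            for u in bits:
                window = (u, s1, s0)
                for g in G:
                    out.append(sum(w & gi for w, gi in zip(window, g)) % 2)
                s1, s0 = u, s1
            return out

        def viterbi_decode(received, n_bits):
            """Hard-decision Viterbi decoding for the encoder above."""
            states = list(itertools.product((0, 1), repeat=2))
            INF = float("inf")
            metric = {s: (0 if s == (0, 0) else INF) for s in states}
            paths = {s: [] for s in states}
            for t in range(n_bits):
                r = received[2 * t:2 * t + 2]
                new_metric = {s: INF for s in states}
                new_paths = {}
                for s in states:
                    if metric[s] == INF:
                        continue
                    s1, s0 = s
                    for u in (0, 1):
                        window = (u, s1, s0)
                        expected = [sum(w & gi for w, gi in zip(window, g)) % 2 for g in G]
                        branch = sum(e != ri for e, ri in zip(expected, r))
                        nxt = (u, s1)
                        cand = metric[s] + branch
                        if cand < new_metric[nxt]:
                            new_metric[nxt] = cand
                            new_paths[nxt] = paths[s] + [u]
                metric, paths = new_metric, new_paths
            best = min(metric, key=metric.get)
            return paths[best]

        msg = [1, 0, 1, 1, 0, 0, 1, 0]
        coded = conv_encode(msg)
        coded[3] ^= 1                                  # flip one channel bit as noise
        assert viterbi_decode(coded, len(msg)) == msg  # the single error is corrected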

  3. Call for consistent coding in diabetes mellitus using the Royal College of General Practitioners and NHS pragmatic classification of diabetes

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2013-03-01

    Background The prevalence of diabetes is increasing with growing levels of obesity and an aging population. New practical guidelines for diabetes provide an applicable classification. Inconsistent coding of diabetes hampers the use of computerised disease registers for quality improvement, and limits the monitoring of disease trends. Objective To develop a consensus set of codes that should be used when recording diabetes diagnostic data. Methods The consensus approach was hierarchical, with a preference for diagnostic/disorder codes, to define each type of diabetes and non-diabetic hyperglycaemia, which were listed as being completely, partially or not readily mapped to available codes. The practical classification divides diabetes into type 1 (T1DM), type 2 (T2DM), genetic, other, unclassified and non-diabetic fasting hyperglycaemia. We mapped the classification to Read version 2, Clinical Terms version 3 and SNOMED CT. Results T1DM and T2DM were completely mapped to appropriate codes. However, in other areas only partial mapping is possible. Genetics is a fast-moving field and there were considerable gaps in the available labels for genetic conditions; what the classification calls ‘other’ the coding system labels ‘secondary’ diabetes. The biggest gap was the lack of a code for diabetes where the type of diabetes was uncertain. Notwithstanding these limitations we were able to develop a consensus list. Conclusions It is a challenge to develop codes that readily map to contemporary clinical concepts. However, clinicians should adopt the standard recommended codes and audit the quality of their existing records.

  4. The Genomic Code: Genome Evolution and Potential Applications

    KAUST Repository

    Bernardi, Giorgio

    2016-01-25

    The genome of metazoans is organized according to a genomic code which comprises three laws: 1) Compositional correlations hold between contiguous coding and non-coding sequences, as well as among the three codon positions of protein-coding genes; these correlations are the consequence of the fact that the genomes under consideration consist of fairly homogeneous, long (≥200Kb) sequences, the isochores; 2) Although isochores are defined on the basis of purely compositional properties, GC levels of isochores are correlated with all tested structural and functional properties of the genome; 3) GC levels of isochores are correlated with chromosome architecture from interphase to metaphase; in the case of interphase the correlation concerns isochores and the three-dimensional “topological associated domains” (TADs); in the case of mitotic chromosomes, the correlation concerns isochores and chromosomal bands. Finally, the genomic code is the fourth and last pillar of molecular biology, the first three pillars being 1) the double helix structure of DNA; 2) the regulation of gene expression in prokaryotes; and 3) the genetic code.

  5. Improvement of level-1 PSA computer code package

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase-I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of a methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP is converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) is developed, considering dependency among source data. A computer program which handles dependency among data sources is also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods are reviewed and a CCF database is established. Impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database is also developed. A CCF analysis reflecting plant-specific defensive strategies against CCF events is also performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique is applied to review the feasibility study of on-line maintenance and to prioritize in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies are adopted and applied to improve the reliability of emergency diesel generators (EDG) of nuclear power plants. To support RCA and RCM analyses, two software programs are developed: EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.

  6. Improvement of level-1 PSA computer code package

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase-I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of a methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP is converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) is developed, considering dependency among source data. A computer program which handles dependency among data sources is also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods are reviewed and a CCF database is established. Impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database is also developed. A CCF analysis reflecting plant-specific defensive strategies against CCF events is also performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique is applied to review the feasibility study of on-line maintenance and to prioritize in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies are adopted and applied to improve the reliability of emergency diesel generators (EDG) of nuclear power plants. To support RCA and RCM analyses, two software programs are developed: EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.
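
    The two records above mention that the generic component reliability database was built with a three-stage Bayesian updating technique. That method is not reproduced here, but the sketch below shows the basic single-stage conjugate (gamma-Poisson) update on which such techniques rest; all prior parameters and evidence figures are invented for illustration.

        def gamma_poisson_update(alpha, beta, failures, exposure_hours):
            """Conjugate update of a Gamma(alpha, beta) prior on a failure rate [1/h]."""
            return alpha + failures, beta + exposure_hours

        # Hypothetical generic prior and evidence (all numbers invented for illustration).
        alpha0, beta0 = 0.5, 1.0e5                    # prior mean 5e-6 per hour
        alpha1, beta1 = gamma_poisson_update(alpha0, beta0, failures=2, exposure_hours=4.0e5)
        print(f"posterior mean failure rate: {alpha1 / beta1:.2e} per hour")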

  7. A Robust Cross Coding Scheme for OFDM Systems

    NARCIS (Netherlands)

    Shao, X.; Slump, Cornelis H.

    2010-01-01

    In wireless OFDM-based systems, coding jointly over all the sub-carriers simultaneously performs better than coding separately per sub-carrier. However, the joint coding is not always optimal because its achievable channel capacity (i.e. the maximum data rate) is inversely proportional to the

  8. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Science.gov (United States)

    2010-10-01

    Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  9. Generalized optical code construction for enhanced and Modified Double Weight like codes without mapping for SAC-OCDMA systems

    Science.gov (United States)

    Kumawat, Soma; Ravi Kumar, M.

    2016-07-01

    Double Weight (DW) code family is one of the coding schemes proposed for Spectral Amplitude Coding-Optical Code Division Multiple Access (SAC-OCDMA) systems. Modified Double Weight (MDW) code for even weights and Enhanced Double Weight (EDW) code for odd weights are two algorithms extending the use of DW code for SAC-OCDMA systems. The above mentioned codes use mapping technique to provide codes for higher number of users. A new generalized algorithm to construct EDW and MDW like codes without mapping for any weight greater than 2 is proposed. A single code construction algorithm gives same length increment, Bit Error Rate (BER) calculation and other properties for all weights greater than 2. Algorithm first constructs a generalized basic matrix which is repeated in a different way to produce the codes for all users (different from mapping). The generalized code is analysed for BER using balanced detection and direct detection techniques.

  10. Quantum Genetics in terms of Quantum Reversible Automata and Quantum Computation of Genetic Codes and Reverse Transcription

    CERN Document Server

    Baianu,I C

    2004-01-01

    The concepts of quantum automata and quantum computation are studied in the context of quantum genetics and genetic networks with nonlinear dynamics. In previous publications (Baianu,1971a, b) the formal concept of quantum automaton and quantum computation, respectively, were introduced and their possible implications for genetic processes and metabolic activities in living cells and organisms were considered. This was followed by a report on quantum and abstract, symbolic computation based on the theory of categories, functors and natural transformations (Baianu,1971b; 1977; 1987; 2004; Baianu et al, 2004). The notions of topological semigroup, quantum automaton, or quantum computer, were then suggested with a view to their potential applications to the analogous simulation of biological systems, and especially genetic activities and nonlinear dynamics in genetic networks. Further, detailed studies of nonlinear dynamics in genetic networks were carried out in categories of n-valued, Lukasiewicz Logic Algebra...

  11. Simplified modeling and code usage in the PASC-3 code system by the introduction of a programming environment

    International Nuclear Information System (INIS)

    Pijlgroms, B.J.; Oppe, J.; Oudshoorn, H.L.; Slobben, J.

    1991-06-01

    A brief description is given of the PASC-3 (Petten-AMPX-SCALE) Reactor Physics code system and associated UNIPASC work environment. The PASC-3 code system is used for criticality and reactor calculations and consists of a selection from the Oak Ridge National Laboratory AMPX-SCALE-3 code collection complemented with a number of additional codes and nuclear data bases. The original codes have been adapted to run under the UNIX operating system. The recommended nuclear data base is a complete 219 group cross section library derived from JEF-1 of which some benchmark results are presented. By the addition of the UNIPASC work environment the usage of the code system is greatly simplified. Complex chains of programs can easily be coupled together to form a single job. In addition, the model parameters can be represented by variables instead of literal values which enhances the readability and may improve the integrity of the code inputs. (author). 8 refs.; 6 figs.; 1 tab

  12. Arabic Natural Language Processing System Code Library

    Science.gov (United States)

    2014-06-01

    This technical note provides a brief description of a Java library for Arabic (and also English) natural language processing (NLP), containing code for training and applying the Arabic NLP system described in Stephen Tratz's paper "A Cross-Task Flexible Transition Model for Arabic Tokenization, Affix...

  13. Fast decoding algorithms for coded aperture systems

    International Nuclear Information System (INIS)

    Byard, Kevin

    2014-01-01

    Fast decoding algorithms are described for a number of established coded aperture systems. The fast decoding algorithms for all these systems offer significant reductions in the number of calculations required when reconstructing images formed by a coded aperture system and hence require less computation time to produce the images. The algorithms may therefore be of use in applications that require fast image reconstruction, such as near real-time nuclear medicine and location of hazardous radioactive spillage. Experimental tests confirm the efficacy of the fast decoding techniques
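
    The record does not spell out the algorithms themselves, but the baseline that any fast coded-aperture decoding method is compared against is cross-correlation of the recorded detector image with a decoding array. The sketch below shows that correlation step in one dimension with a random binary mask, which is an assumption made for illustration rather than one of the established aperture families referred to above.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 64
        mask = rng.integers(0, 2, n)            # open/closed aperture elements (random, illustrative)
        decoder = 2 * mask - 1                   # simple balanced decoding array

        source = np.zeros(n)
        source[[10, 40]] = [1.0, 0.5]            # two point sources

        # Detector image = circular convolution of the source with the aperture pattern.
        detector = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(mask)))
        # Reconstruction = circular cross-correlation of the detector image with the decoder.
        recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decoder))))

        # For a well-designed aperture (e.g. a URA) the correlation sidelobes vanish and the
        # strongest bins coincide with the true sources; a random mask only approximates this.
        print("true sources at", np.flatnonzero(source), "- strongest bins:", np.sort(np.argsort(recon)[-2:]))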

  14. The JAERI code system for evaluation of BWR ECCS performance

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Akimoto, Masayuki; Asahi, Yoshiro; Abe, Kiyoharu; Muramatsu, Ken; Araya, Fumimasa; Sato, Kazuo

    1982-12-01

    Development of separate computer code systems for BWR and PWR ECCS evaluation has been conducted since 1973, considering the differences in reactor cooling system, core structure and ECCS. The first version of the BWR code system, of which developmental work started earlier than that of the PWR, has been completed. The BWR code system is designed to provide computational tools to analyze all phases of LOCAs and to evaluate the performance of the ECCS including an ''Evaluation Model (EM)'' feature in compliance with the requirements of the current Japanese Evaluation Guideline of ECCS. The BWR code system could be used for licensing purposes, i.e. for ECCS performance evaluation or audit calculations to cross-examine the methods and results of applicants or vendors. The BWR code system presented in this report comprises several computer codes, each of which analyzes a particular phase of a LOCA or a system blowdown depending on a range of LOCAs, i.e. large and small breaks in a variety of locations in the reactor system. The system includes ALARM-B1, HYDY-B1 and THYDE-B1 for analysis of the system blowdown for various break sizes, THYDE-B-REFLOOD for analysis of the reflood phase and SCORCH-B2 for the calculation of the fuel assembly hot plane temperature. When multiple codes are used to analyze a broad range of LOCAs as stated above, it is very important to evaluate the adequacy and consistency between the codes used to cover an entire break spectrum. The system consistency together with the system performance is discussed for a large commercial BWR. (author)

  15. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems

  16. Optimisation of electrical system for offshore wind farms via genetic algorithm

    DEFF Research Database (Denmark)

    Chen, Zhe; Zhao, Menghua; Blaabjerg, Frede

    2009-01-01

    An optimisation platform based on genetic algorithm (GA) is presented, where the main components of a wind farm and key technical specifications are used as input parameters and the electrical system design of the wind farm is optimised in terms of both production cost and system reliability. The power losses, wind power production, initial investment and maintenance costs are considered in the production cost. The availability of components and network redundancy are included in the reliability evaluation. The method of coding an electrical system to a binary string, which is processed by GA, is developed. Different GA techniques are investigated based on a real example offshore wind farm. This optimisation platform has been demonstrated as a powerful tool for offshore wind farm design and evaluation.
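
    As a rough illustration of the coding idea described above (an electrical system design encoded as a binary string and evolved by a GA against a combined cost/reliability measure), the sketch below evolves a three-bit hypothetical design choice. The encoding, cost figures and reliability numbers are all invented and bear no relation to the real wind farm studied in the record.

        import random

        random.seed(1)
        # Hypothetical 3-bit encoding: two bits choose a cable cross-section, one bit a
        # redundant export cable. All cost and loss figures below are invented.
        CABLE_COST = [1.0, 1.3, 1.7, 2.2]
        CABLE_LOSS = [0.05, 0.04, 0.03, 0.02]
        REDUNDANCY_COST, UNAVAILABILITY = 0.8, {False: 0.02, True: 0.005}

        def decode(bits):
            return bits[0] * 2 + bits[1], bool(bits[2])

        def cost(bits):
            cable, redundant = decode(bits)
            capex = CABLE_COST[cable] + (REDUNDANCY_COST if redundant else 0.0)
            lost_energy = CABLE_LOSS[cable] + UNAVAILABILITY[redundant]
            return capex + 10.0 * lost_energy          # crude production-cost proxy

        def ga(pop_size=20, generations=30, p_mut=0.1):
            pop = [[random.randint(0, 1) for _ in range(3)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randint(1, 2)
                    children.append([g ^ (random.random() < p_mut) for g in a[:cut] + b[cut:]])
                pop = parents + children
            return min(pop, key=cost)

        best = ga()
        print("best design (cable option, redundancy):", decode(best), "cost:", round(cost(best), 3))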

  17. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  18. Confidentiality of 2D Code using Infrared with Cell-level Error Correction

    Directory of Open Access Journals (Sweden)

    Nobuyuki Teraura

    2013-03-01

    Full Text Available Optical information media printed on paper use printing materials to absorb visible light. A conventional 2D code may be encrypted, but it can still be copied. Hence, we envisage an information medium that cannot possibly be copied and thereby offers high security. At the surface, the normal 2D code is printed. The inner layers consist of 2D codes printed using a variety of materials, which absorb certain distinct wavelengths, to form a multilayered 2D code. Information can be distributed among the 2D codes forming the inner layers of the multiplex. Additionally, error correction at cell level can be introduced.

  19. Locally Minimum Storage Regenerating Codes in Distributed Cloud Storage Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Wei Luo; Wei Liang; Xiangyang Liu; Xiaodai Dong

    2017-01-01

    In distributed cloud storage systems, inevitably there exist multiple node failures at the same time. The existing methods of regenerating codes, including minimum storage regenerating (MSR) codes and minimum bandwidth regenerating (MBR) codes, are mainly to repair one single or several failed nodes, unable to meet the repair need of distributed cloud storage systems. In this paper, we present locally minimum storage regenerating (LMSR) codes to recover multiple failed nodes at the same time. Specifically, the nodes in distributed cloud storage systems are divided into multiple local groups, and in each local group (4, 2) or (5, 3) MSR codes are constructed. Moreover, the grouping method of storage nodes and the repairing process of failed nodes in local groups are studied. Theoretical analysis shows that LMSR codes can achieve the same storage overhead as MSR codes. Furthermore, we verify by means of simulation that, compared with MSR codes, LMSR codes can reduce the repair bandwidth and disk I/O overhead effectively.

  20. SWAT3.1 - the integrated burnup code system driving continuous energy Monte Carlo codes MVP and MCNP

    International Nuclear Information System (INIS)

    Suyama, Kenya; Mochizuki, Hiroki; Takada, Tomoyuki; Ryufuku, Susumu; Okuno, Hiroshi; Murazaki, Minoru; Ohkubo, Kiyoshi

    2009-05-01

    Integrated burnup calculation code system SWAT is a system that combines the neutronics calculation code SRAC, which is widely used in Japan, and the point burnup calculation code ORIGEN2. It has been used to evaluate the composition of the uranium, plutonium, minor actinides and the fission products in spent nuclear fuel. Based on this idea, the integrated burnup calculation code system SWAT3.1 was developed by combining the continuous energy Monte Carlo codes MVP and MCNP, and ORIGEN2. This enables us to treat arbitrary fuel geometries and to generate the effective cross section data to be used in the burnup calculation with few approximations. This report describes the outline, input data instructions and several examples of the calculation. (author)

  1. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issue of a revised rule for loss of coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost effective, auditable, rational and practical manner. 8 figs., 2 tabs

  2. JEMs and incompatible occupational coding systems: Effect of manual and automatic recoding of job codes on exposure assignment

    NARCIS (Netherlands)

    Koeman, T.; Offermans, N.S.M.; Christopher-De Vries, Y.; Slottje, P.; Brandt, P.A. van den; Goldbohm, R.A.; Kromhout, H.; Vermeulen, R.

    2013-01-01

    Background: In epidemiological studies, occupational exposure estimates are often assigned through linkage of job histories to job-exposure matrices (JEMs). However, available JEMs may have a coding system incompatible with the coding system used to code the job histories, necessitating a

  3. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture, just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows to deviate the direction of an X-ray beam, which can considerably increase the implementation costs. Hence, this paper describes a low cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded apertures approximation decreases at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is up to 2.5 dB of PSNR less with respect to the phase coded aperture reconstructions.
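
    The measurement model in this record, magnitude-only diffraction data taken after modulation by a block-unblock (0/1) coded aperture, and the PSNR metric used to score reconstructions can both be written down compactly, as sketched below. The phase-retrieval solver itself and the detour-phase approximation are not implemented, and the array sizes and noise level are arbitrary assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        def coded_diffraction_patterns(x, masks):
            """Magnitude-squared Fourier measurements of x modulated by 0/1 coded apertures."""
            return [np.abs(np.fft.fft2(m * x)) ** 2 for m in masks]

        def psnr(reference, estimate):
            mse = np.mean((reference - estimate) ** 2)
            return 10 * np.log10(reference.max() ** 2 / mse)

        x = rng.random((32, 32))                                  # illustrative "sample"
        masks = [rng.integers(0, 2, x.shape) for _ in range(4)]   # block-unblock coded apertures
        patterns = coded_diffraction_patterns(x, masks)

        # A phase-retrieval solver would estimate x from `patterns`; here a noisy stand-in
        # simply shows how the recovered image would be scored.
        estimate = x + 0.05 * rng.standard_normal(x.shape)
        print(f"PSNR of the stand-in estimate: {psnr(x, estimate):.1f} dB")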

  4. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code, and has been used in uncertainty analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can be automated easily to further simplify its use and that the conversion of the entire system to a base code other than RELAP is possible.

  5. Improved decoding for a concatenated coding system

    DEFF Research Database (Denmark)

    Paaske, Erik

    1990-01-01

    The concatenated coding system recommended by CCSDS (Consultative Committee for Space Data Systems) uses an outer (255,223) Reed-Solomon (RS) code based on 8-bit symbols, followed by a block interleaver and an inner rate 1/2 convolutional code with memory 6. Viterbi decoding is assumed. Two new decoding procedures based on repeated decoding trials and exchange of information between the two decoders and the deinterleaver are proposed. In the first one, where the improvement is 0.3-0.4 dB, only the RS decoder performs repeated trials. In the second one, where the improvement is 0.5-0.6 dB, both decoders perform repeated decoding trials and decoding information is exchanged between them.

  6. Performance Analysis of Optical Code Division Multiplex System

    Science.gov (United States)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a Pseudo-Orthogonal Code generator for an Optical Code Division Multiple Access (OCDMA) system, which helps to reduce the need for bandwidth expansion and improve spectral efficiency. We investigate the performance of a multi-user OCDMA system achieving data rates of more than 1 Tbit/s.

  7. Delay Estimation in Long-Code Asynchronous DS/CDMA Systems Using Multiple Antennas

    Directory of Open Access Journals (Sweden)

    Sirbu Marius

    2004-01-01

    Full Text Available The problem of propagation delay estimation in asynchronous long-code DS-CDMA multiuser systems is addressed. Almost all the methods proposed so far in the literature for propagation delay estimation are derived for short codes, and the knowledge of the codes is exploited by the estimators. In long-code CDMA, the spreading code is aperiodic and the methods developed for short codes may not be used or may increase the complexity significantly. For example, in the subspace-based estimators, the aperiodic nature of the code may require subspace tracking. In this paper we propose a novel method for simultaneous estimation of the propagation delays of several active users. A specific multiple-input multiple-output (MIMO) system model is constructed in a multiuser scenario. In such a model the channel matrix contains information about both the users' propagation delays and channel impulse responses. Consequently, estimates of the delays are obtained as a by-product of the channel estimation task. The channel matrix has a special structure that is exploited in estimating the delays. The proposed delay estimation method lends itself to an adaptive implementation. Thus, it may be applied to joint channel and delay estimation in uplink DS-CDMA analogously to the method presented by the authors in 2003. The performance of the proposed method is studied in simulation using a realistic time-varying channel model and different SNR levels in the face of near-far effects, and using a low spreading factor (high data rates).

  8. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  9. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, which is a strategy designed to improve the performance of the traditional LDPC code, was introduced. By inserting some pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel. We then use it to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. This algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system and results showed that the watermarking LDPC code has better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% additional redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
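
    The key mechanism in this record is that known watermark bits let the receiver estimate the channel noise level and build better initial probabilities for BP decoding. The sketch below shows that step for a generic BPSK/AWGN observation: estimate the noise variance at the watermark positions, then form the initial log-likelihood ratios. The LDPC code itself and the optical channel model of the paper are not reproduced, and the watermark positions and values are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n_bits = 4096
        watermark_idx = np.arange(0, n_bits, 50)       # assumed positions of the known bits
        watermark_bits = np.zeros(len(watermark_idx), dtype=int)

        bits = rng.integers(0, 2, n_bits)
        bits[watermark_idx] = watermark_bits
        symbols = 1 - 2 * bits                          # BPSK mapping
        true_sigma = 0.8
        received = symbols + true_sigma * rng.standard_normal(n_bits)

        # Noise-level estimate from the watermark positions, where the sent value is known.
        known_symbols = 1 - 2 * watermark_bits
        sigma2_hat = np.mean((received[watermark_idx] - known_symbols) ** 2)

        # Initial log-likelihood ratios for a BP decoder, built from that estimate.
        llr = 2 * received / sigma2_hat
        print(f"true sigma^2 = {true_sigma**2:.3f}, watermark-based estimate = {sigma2_hat:.3f}")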

  10. Photoactivatable Mussel-Based Underwater Adhesive Proteins by an Expanded Genetic Code.

    Science.gov (United States)

    Hauf, Matthias; Richter, Florian; Schneider, Tobias; Faidt, Thomas; Martins, Berta M; Baumann, Tobias; Durkin, Patrick; Dobbek, Holger; Jacobs, Karin; Möglich, Andreas; Budisa, Nediljko

    2017-09-19

    Marine mussels exhibit potent underwater adhesion abilities under hostile conditions by employing 3,4-dihydroxyphenylalanine (DOPA)-rich mussel adhesive proteins (MAPs). However, their recombinant production is a major biotechnological challenge. Herein, a novel strategy based on genetic code expansion has been developed by engineering efficient aminoacyl-transfer RNA synthetases (aaRSs) for the photocaged noncanonical amino acid ortho-nitrobenzyl DOPA (ONB-DOPA). The engineered ONB-DOPARS enables in vivo production of MAP type 5 site-specifically equipped with multiple instances of ONB-DOPA to yield photocaged, spatiotemporally controlled underwater adhesives. Upon exposure to UV light, these proteins feature elevated wet adhesion properties. This concept offers new perspectives for the production of recombinant bioadhesives. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Development of a nuclear power plant system analysis code

    International Nuclear Information System (INIS)

    Sim, Suk K.; Jeong, J. J.; Ha, K. S.; Moon, S. K.; Park, J. W.; Yang, S. K.; Song, C. H.; Chun, S. Y.; Kim, H. C.; Chung, B. D.; Lee, W. J.; Kwon, T. S.

    1997-07-01

    During the period of this study, TASS 1.0 code has been prepared for the non-LOCA licensing and reload safety analyses of the Westinghouse and the Korean Standard Nuclear Power Plants (KSNPP) type reactors operating in Korea. TASS-NPA also has been developed for a real time simulation of the Kori-3/4 transients using on-line graphical interactions. TASS 2.0 code has been further developed to timely apply the TASS 2.0 code for the design certification of the KNGR. The COBRA/RELAP5 code, a multi-dimensional best estimate system code, has been developed by integrating the realistic three-dimensional reactor vessel model with the RELAP5 /MOD3.2 code, a one-dimensional system code. Also, a 3D turbulent two-phase flow analysis code, FEMOTH-TF, has been developed using finite element technique to analyze local thermal hydraulic phenomena in support of the detailed design analysis for the development of the advanced reactors. (author). 84 refs., 27 tabs., 83 figs

  12. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Full Text Available Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when increasing the number of processes. This paper reports on our work-in-progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.
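
    The case study mentioned above is a test-and-test-and-set lock; for readers unfamiliar with it, the sketch below shows the protocol's structure (spin on a plain read, then attempt an atomic test-and-set). This is only an executable illustration of the protocol (the record analyses a formal probabilistic model of it, not code like this), and the atomic primitive is emulated here with an ordinary mutex.

        import threading

        class TTASLock:
            """Test-and-test-and-set spin lock: spin on a plain read, then attempt the
            atomic test-and-set (emulated here with a helper mutex)."""

            def __init__(self):
                self._flag = 0
                self._atomic = threading.Lock()        # stands in for the hardware primitive

            def _test_and_set(self):
                with self._atomic:
                    old, self._flag = self._flag, 1
                    return old

            def acquire(self):
                while True:
                    while self._flag == 1:             # "test" phase: cheap read-only spin
                        pass
                    if self._test_and_set() == 0:      # "test-and-set" phase
                        return

            def release(self):
                self._flag = 0

        lock, counter = TTASLock(), 0

        def worker():
            global counter
            for _ in range(1000):
                lock.acquire()
                counter += 1
                lock.release()

        threads = [threading.Thread(target=worker) for _ in range(4)]
        for t in threads: t.start()
        for t in threads: t.join()
        print("counter =", counter)                     # expected 4000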

  13. Software coding for reliable data communication in a reactor safety system

    International Nuclear Information System (INIS)

    Maghsoodi, R.

    1978-01-01

    A software coding method is proposed to improve the communication reliability of a microprocessor-based fast-reactor safety system. This method, which replaces the conventional coding circuitry, applies a program to code the data which is communicated between the processors via their data memories. The system requirements are studied and suitable codes are suggested. The problems associated with hardware coders and the advantages of software coding methods are discussed. The product code, which provides a faster coding time than the cyclic code, is chosen as the final code. Then the improvement of the communication reliability is derived for a processor and its data memory. The result is used to calculate the reliability improvement of the processing channel as the basic unit for the safety system. (author)
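
    The record selects a product code for the data exchanged between processor memories. A minimal example of the idea is a two-dimensional parity product code, in which row and column parity checks locate and correct any single flipped data bit; the sketch below shows encoding and correction for an illustrative 4 x 4 block. The block size and layout are assumptions, not the scheme actually used in the safety system.

        import random

        K = 4  # data block is K x K bits (size is illustrative)

        def encode(data):
            """Append K row-parity bits and K column-parity bits to a K*K data block."""
            rows = [data[i * K:(i + 1) * K] for i in range(K)]
            row_parity = [sum(r) % 2 for r in rows]
            col_parity = [sum(rows[i][j] for i in range(K)) % 2 for j in range(K)]
            return data + row_parity + col_parity

        def correct(word):
            """Correct a single flipped data bit using the row/column parity checks."""
            data, row_p, col_p = word[:K * K], word[K * K:K * K + K], word[K * K + K:]
            rows = [data[i * K:(i + 1) * K] for i in range(K)]
            bad_rows = [i for i in range(K) if sum(rows[i]) % 2 != row_p[i]]
            bad_cols = [j for j in range(K) if sum(rows[i][j] for i in range(K)) % 2 != col_p[j]]
            if len(bad_rows) == 1 and len(bad_cols) == 1:      # single data-bit error located
                data = data[:]
                data[bad_rows[0] * K + bad_cols[0]] ^= 1
            return data

        random.seed(4)
        message = [random.randint(0, 1) for _ in range(K * K)]
        word = encode(message)
        word[5] ^= 1                                           # one bit error in transit
        assert correct(word) == message
        print("single data-bit error corrected")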

  14. A coupled systems code-CFD MHD solver for fusion blanket design

    Energy Technology Data Exchange (ETDEWEB)

    Wolfendale, Michael J., E-mail: m.wolfendale11@imperial.ac.uk; Bluck, Michael J.

    2015-10-15

    Highlights: • A coupled systems code-CFD MHD solver for fusion blanket applications is proposed. • Development of a thermal hydraulic systems code with MHD capabilities is detailed. • A code coupling methodology based on the use of TCP socket communications is detailed. • Validation cases are briefly discussed for the systems code and coupled solver. - Abstract: The network of flow channels in a fusion blanket can be modelled using a 1D thermal hydraulic systems code. For more complex components such as junctions and manifolds, the simplifications employed in such codes can become invalid, requiring more detailed analyses. For magnetic confinement reactor blanket designs using a conducting fluid as coolant/breeder, the difficulties in flow modelling are particularly severe due to MHD effects. Blanket analysis is an ideal candidate for the application of a code coupling methodology, with a thermal hydraulic systems code modelling portions of the blanket amenable to 1D analysis, and CFD providing detail where necessary. A systems code, MHD-SYS, has been developed and validated against existing analyses. The code shows good agreement in the prediction of MHD pressure loss and the temperature profile in the fluid and wall regions of the blanket breeding zone. MHD-SYS has been coupled to an MHD solver developed in OpenFOAM and the coupled solver validated for test geometries in preparation for modelling blanket systems.
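
    The coupling mechanism named in the highlights, TCP socket communication between the 1D systems code and the CFD solver, can be illustrated schematically as below. The message format, field names and port are invented, and the CFD side is a stub; the actual MHD-SYS/OpenFOAM interface is not described in this abstract.

        import json, socket, threading

        HOST, PORT = "127.0.0.1", 5007          # assumed local coupling endpoint

        def cfd_stub(server_sock):
            """Stand-in for the detailed CFD/MHD solver: receive boundary conditions,
            return a fabricated pressure drop and outlet temperature."""
            conn, _ = server_sock.accept()
            with conn:
                bc = json.loads(conn.recv(4096).decode())      # one recv suffices for this tiny message
                result = {"dp_Pa": 1.0e4 * bc["mass_flow_kg_s"], "T_out_K": bc["T_in_K"] + 15.0}
                conn.sendall(json.dumps(result).encode())

        def systems_code_step(bc):
            """One coupling iteration of the 1D systems code: send boundary conditions for
            the 3D component and wait for its result."""
            with socket.create_connection((HOST, PORT)) as s:
                s.sendall(json.dumps(bc).encode())
                return json.loads(s.recv(4096).decode())

        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.bind((HOST, PORT))
        server.listen(1)
        threading.Thread(target=cfd_stub, args=(server,), daemon=True).start()

        print("CFD region response:", systems_code_step({"mass_flow_kg_s": 0.35, "T_in_K": 600.0}))
        server.close()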

  15. Integrated burnup calculation code system SWAT

    International Nuclear Information System (INIS)

    Suyama, Kenya; Hirakawa, Naohiro; Iwasaki, Tomohiko.

    1997-11-01

    SWAT is an integrated burnup code system developed for analysis of post irradiation examination, transmutation of radioactive waste, and burnup credit problem. It enables us to analyze the burnup problem using neutron spectrum depending on environment of irradiation, combining SRAC which is Japanese standard thermal reactor analysis code system and ORIGEN2 which is burnup code widely used all over the world. SWAT makes effective cross section library based on results by SRAC, and performs the burnup analysis with ORIGEN2 using that library. SRAC and ORIGEN2 can be called as external module. SWAT has original cross section library on based JENDL-3.2 and libraries of fission yield and decay data prepared from JNDC FP Library second version. Using these libraries, user can use latest data in the calculation of SWAT besides the effective cross section prepared by SRAC. Also, User can make original ORIGEN2 library using the output file of SWAT. This report presents concept and user's manual of SWAT. (author)

  16. Development and verification of a coupled code system RETRAN-MASTER-TORC

    International Nuclear Information System (INIS)

    Cho, J.Y.; Song, J.S.; Joo, H.G.; Zee, S.Q.

    2004-01-01

    Recently, coupled thermal-hydraulics (T-H) and three-dimensional kinetics codes have been widely used for the best-estimate simulations such as the main steam line break (MSLB) and locked rotor problems. This work is to develop and verify one of such codes by coupling the system T-H code RETRAN, the 3-D kinetics code MASTER and sub-channel analysis code TORC. The MASTER code has already been applied to such simulations after coupling with the MARS or RETRAN-3D multi-dimensional system T-H codes. The MASTER code contains a sub-channel analysis code COBRA-III C/P, and the coupled systems MARSMASTER-COBRA and RETRAN-MASTER-COBRA had been already developed and verified. With these previous studies, a new coupled system of RETRAN-MASTER-TORC is to be developed and verified for the standard best-estimate simulation code package in Korea. The TORC code has already been applied to the thermal hydraulics design of the several ABB/CE type plants and Korean Standard Nuclear Power Plants (KSNP). This justifies the choice of TORC rather than COBRA. Because the coupling between RETRAN and MASTER codes are already established and verified, this work is simplified to couple the TORC sub-channel T-H code with the MASTER neutronics code. The TORC code is a standalone code that solves the T-H equations for a given core problem from reading the input file and finally printing the converged solutions. However, in the coupled system, because TORC receives the pin power distributions from the neutronics code MASTER and transfers the T-H results to MASTER iteratively, TORC needs to be controlled by the MASTER code and does not need to solve the given problem completely at each iteration step. By this reason, the coupling of the TORC code with the MASTER code requires several modifications in the I/O treatment, flow iteration and calculation logics. The next section of this paper describes the modifications in the TORC code. The TORC control logic of the MASTER code is then followed. The

  17. GRABGAM: A Gamma Analysis Code for Ultra-Low-Level HPGe SPECTRA

    Energy Technology Data Exchange (ETDEWEB)

    Winn, W.G.

    1999-07-28

    The GRABGAM code has been developed for analysis of ultra-low-level HPGe gamma spectra. The code employs three different size filters for the peak search, where the largest filter provides best sensitivity for identifying low-level peaks and the smallest filter has the best resolution for distinguishing peaks within a multiplet. GRABGAM basically generates an integral probability F-function for each singlet or multiplet peak analysis, bypassing the usual peak fitting analysis for a differential f-function probability model. Because F is defined by the peak data, statistical limitations for peak fitting are avoided; however, the F-function does provide generic values for peak centroid, full width at half maximum, and tail that are consistent with a Gaussian formalism. GRABGAM has successfully analyzed over 10,000 customer samples, and it interfaces with a variety of supplementary codes for deriving detector efficiencies, backgrounds, and quality checks.
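
    The F-function idea can be illustrated as follows: build the empirical integral probability over the net counts in a peak region and read the centroid and FWHM from its quantiles (for a Gaussian, F = 0.1193 and 0.8807 fall one half-FWHM either side of the centroid). The sketch below does only this; GRABGAM's search filters, background treatment and tail handling are not reproduced, and the simulated peak is an assumption.

        import numpy as np

        rng = np.random.default_rng(6)
        # Simulated background-subtracted counts: a Gaussian peak at channel 20, sigma 3
        # (true FWHM about 7.1 channels). Numbers are illustrative only.
        channels = np.arange(40)
        counts = rng.poisson(200 * np.exp(-((channels - 20.0) ** 2) / (2 * 3.0 ** 2)))

        # Empirical integral probability F(channel): fraction of peak counts at or below it.
        F = np.cumsum(counts) / counts.sum()

        def channel_at(prob):
            """First channel where the integral probability reaches `prob`."""
            return channels[np.searchsorted(F, prob)]

        centroid = channel_at(0.5)
        # For a Gaussian, F = 0.1193 and 0.8807 are reached at the centroid -/+ FWHM/2.
        fwhm = channel_at(0.8807) - channel_at(0.1193)
        print(f"centroid ~ channel {centroid}, FWHM ~ {fwhm} channels")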

  18. Low-level waste shallow burial assessment code

    International Nuclear Information System (INIS)

    Fields, D.E.; Little, C.A.; Emerson, C.J.

    1981-01-01

    PRESTO (Prediction of Radiation Exposures from Shallow Trench Operations) is a computer code developed under United States Environmental Protection Agency funding to evaluate possible health effects from radionuclide releases from shallow, radioactive-waste disposal trenches and from areas contaminated with operational spillage. The model is intended to predict radionuclide transport and the ensuing exposure and health impact to a stable, local population for a 1000-year period following closure of the burial grounds. Several classes of submodels are used in PRESTO to represent scheduled events, unit system responses, and risk evaluation processes. The code is modular to permit future expansion and refinement. Near-surface transport mechanisms considered in the PRESTO code are cap failure, cap erosion, farming or reclamation practices, human intrusion, chemical exchange within an active surface soil layer, contamination from trench overflow, and dilution by surface streams. Subsurface processes include infiltration and drainage into the trench, the ensuing solubilization of radionuclides, and chemical exchange between trench water and buried solids. Mechanisms leading to contaminated outflow include trench overflow and downward vertical percolation. If the latter outflow reaches an aquifer, radiological exposure from irrigation or domestic consumption is considered. Airborne exposure terms are evaluated using the Gaussian plume atmospheric transport formulation as implemented by Fields and Miller.
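
    The airborne pathway mentioned at the end of this record uses the Gaussian plume formulation; its standard ground-level form is easy to state, as sketched below. The dispersion coefficients used here are crude linear stand-ins rather than the Pasquill-Gifford relations an actual implementation such as PRESTO would use, and all example numbers are invented.

        import math

        def gaussian_plume_ground_conc(Q, u, x, y, H, a_y=0.08, a_z=0.06):
            """Ground-level air concentration downwind of a continuous point release,
            standard Gaussian plume with total ground reflection.
            Q [Bq/s], u [m/s], x and y [m], H [m]; a_y and a_z are crude linear dispersion
            coefficients (a real implementation would use Pasquill-Gifford curves)."""
            sigma_y, sigma_z = a_y * x, a_z * x
            return (Q / (math.pi * u * sigma_y * sigma_z)
                    * math.exp(-y ** 2 / (2 * sigma_y ** 2))
                    * math.exp(-H ** 2 / (2 * sigma_z ** 2)))

        # Example: 1e6 Bq/s release at 10 m height, 3 m/s wind, receptor 500 m downwind.
        print(f"{gaussian_plume_ground_conc(1e6, 3.0, 500.0, 0.0, 10.0):.2e} Bq/m^3")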

  19. Coupling Legacy and Contemporary Deterministic Codes to Goldsim for Probabilistic Assessments of Potential Low-Level Waste Repository Sites

    Science.gov (United States)

    Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.

    2006-12-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission and codes developed and maintained by United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, re-vitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a creditable and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in Goldsim and the techniques applied to facilitate this process will be presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. NRC sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility used in a cooperative technology transfer

  20. Genetic variants in CHI3L1 influencing YKL-40 levels

    DEFF Research Database (Denmark)

    Kjaergaard, Alisa D; Johansen, Julia S; Nordestgaard, Børge G

    2013-01-01

    Despite its important role in many serious diseases, the genetic background for plasma YKL-40 has still not been systematically catalogued. Therefore, we aimed at identifying genetic variants in CHI3L1 influencing plasma YKL-40 levels in the general population.

  1. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code

    Science.gov (United States)

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-01

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNALysUUU with hypermodified 5-methylaminomethyl-2-thiouridine (mnm5s2U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine-pyrimidine mismatches. We show that mnm5s2U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism.

  2. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2.

  3. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2) developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions such as: Calculation of component failure probabilities based on the response factor method, Extraction of minimal cut sets (MCSs), Calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP, Calculation of accident sequence frequencies and the core damage frequency (CDF) with use of the seismic hazard curve, Importance analysis using various indicators, Uncertainty analysis, Calculation of the CDF taking into account the effect of the correlations of responses and capacities of components, and Efficient sensitivity analysis by changing parameters on responses and capacities of components. These analyses require the fault tree (FT) representing the occurrence condition of the system failures and core damage, information about response and capacity of components and seismic hazard curve for the NPP site as inputs. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)
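
    The core damage frequency computation described above, combining conditional failure probabilities at given motion levels with the site hazard curve, amounts to a numerical convolution of a fragility curve with the hazard density. The sketch below carries out that convolution for a hypothetical lognormal fragility and power-law hazard curve; every parameter value is invented, and SECOM2's treatment of minimal cut sets, correlations and uncertainty is not reproduced.

        import math

        def fragility(a, median=0.6, beta=0.4):
            """Conditional failure probability at peak ground acceleration a [g]
            (lognormal fragility; parameters are invented)."""
            return 0.5 * (1 + math.erf(math.log(a / median) / (beta * math.sqrt(2))))

        def hazard(a, k0=1e-3, slope=2.5):
            """Annual frequency of exceeding acceleration a [g] (illustrative power law)."""
            return k0 * a ** (-slope)

        # Core damage frequency = integral of the fragility against the hazard density,
        # here evaluated with a simple midpoint rule over discretised acceleration bins.
        a_grid = [0.05 + 0.01 * i for i in range(300)]
        cdf = 0.0
        for lo, hi in zip(a_grid[:-1], a_grid[1:]):
            dH = hazard(lo) - hazard(hi)          # annual frequency of acceleration in [lo, hi]
            cdf += fragility(0.5 * (lo + hi)) * dH
        print(f"illustrative core damage frequency: {cdf:.2e} per year")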

  4. Study on a new meteorological sampling scheme developed for the OSCAAR code system

    International Nuclear Information System (INIS)

    Liu Xinhe; Tomita, Kenichi; Homma, Toshimitsu

    2002-03-01

    One important step in Level-3 Probabilistic Safety Assessment is meteorological sequence sampling; previous studies of this step mainly addressed code systems using the straight-line plume model, and more effort is needed for those using the trajectory puff model, such as the OSCAAR code system. This report describes the development of a new meteorological sampling scheme for the OSCAAR code system that explicitly considers population distribution. A group of principles set for the development of this new sampling scheme includes completeness, appropriate stratification, optimum allocation, practicability and so on. In this report, discussions are made about the procedures of the new sampling scheme and its application. The calculation results illustrate that, although it is quite difficult to optimize the stratification of meteorological sequences based on a few environmental parameters, the new scheme does gather the most adverse conditions in a single subset of meteorological sequences. The size of this subset may be as small as a few dozen, so that the tail of a complementary cumulative distribution function can remain relatively stable in different trials of the probabilistic consequence assessment code. (author)
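
    One of the principles listed above, optimum allocation, can be illustrated with the classical Neyman allocation rule, which assigns more of the sampled meteorological sequences to strata whose consequences vary more. The strata, their sizes and their standard deviations below are invented; the actual stratification variables of the OSCAAR scheme are not reproduced.

        # Neyman (optimum) allocation of a fixed number of sampled meteorological sequences:
        # n_h proportional to N_h * sigma_h. Strata and numbers are invented for illustration.
        strata = {                       # name: (number of sequences, std. dev. of consequence)
            "calm":       (5000, 0.2),
            "typical":    (3000, 0.6),
            "rain":       (600,  1.5),
            "low_mixing": (150,  3.0),
        }
        n_total = 144
        weights = {k: N * s for k, (N, s) in strata.items()}
        w_sum = sum(weights.values())
        allocation = {k: round(n_total * w / w_sum) for k, w in weights.items()}
        print(allocation)                # rounding may leave the total off by one or two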

  5. Evaluation of the analysis models in the ASTRA nuclear design code system

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Nam Jin; Park, Chang Jea; Kim, Do Sam; Lee, Kyeong Taek; Kim, Jong Woon [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2000-11-15

    In the field of nuclear reactor design, the main practice has been the application of improved design code systems. During this process, a great deal of experience and knowledge was accumulated in processing input data, nuclear fuel reload design, and the production and analysis of design data. However, less effort was devoted to the analysis of the methodology and to the development or improvement of those code systems. Recently, KEPCO Nuclear Fuel Company (KNFC) developed the ASTRA (Advanced Static and Transient Reactor Analyzer) code system for the purpose of nuclear reactor design and analysis. In the code system, two-group constants are generated from the CASMO-3 code system. The objective of this research is to analyze the analysis models used in the ASTRA/CASMO-3 code system. This evaluation requires in-depth comprehension of the models, which is as important as the development of the code system itself. Currently, most of the code systems used in domestic nuclear power plants were imported, so it is very difficult to maintain them and adapt them to changing situations. Therefore, the evaluation of the analysis models in the ASTRA nuclear reactor design code system is very important.

  6. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimension physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform. • The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimension studies using the physics and engineering platform for design, verification and validation. Effective data exchange between the zero dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study to those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters will be presented in this paper.

  7. A study on the nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Huh, Young Hwan; Lee, Jong Bok; Choi, Young Gil; Suh, Soong Hyok; Kang, Byong Heon; Kim, Hee Kyung; Kim, Ko Ryeo; Park, Soo Jin

    1990-12-01

    In line with current software development and quality assurance trends, it is necessary to develop a computer code management system for nuclear programs. For this reason, the project started in 1987. The main objectives of the project are to establish a nuclear computer code management system, to secure software reliability, and to develop nuclear computer code packages. The work performed this year was to operate and maintain the computer code information system for KAERI computer codes, to develop an application tool, AUTO-i, for computing the first and second moments of inertia of polygons and circles, and to investigate nuclear computer code conversion between different machines. To better support code availability and reliability, assistance from the users of the codes is required. Lastly, for easy reference, we present a list of code names and information on the codes that were introduced or developed during this year. (Author)

  8. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    Energy Technology Data Exchange (ETDEWEB)

    Binh, Do Quang [University of Technical Education Ho Chi Minh City (Viet Nam); Huy, Ngo Quang [University of Industry Ho Chi Minh City (Viet Nam); Hai, Nguyen Hoang [Centre for Research and Development of Radiation Technology, Ho Chi Minh City (Viet Nam)

    2014-12-15

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a TRIGA MARK II type research reactor loaded with Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.
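
    As a rough illustration of the encoding idea described above (not the authors' implementation), the sketch below combines a binary chromosome for the weighting factor with an integer chromosome for the loading pattern and evaluates a weighted-sum fitness; the reactor physics evaluation is replaced by placeholder functions.

```python
# Sketch of a mixed binary/integer-coded GA with a weighted-sum objective.
# All physics (k_eff, power peaking) is replaced by toy placeholder functions.
import random

N_POS = 10               # number of core positions (illustrative)
FUEL_TYPES = [0, 1, 2]   # hypothetical fuel/element identifiers
W_BITS = 8               # binary chromosome: encodes weighting factor w in [0, 1]

def decode_weight(bits):
    return int("".join(map(str, bits)), 2) / (2**W_BITS - 1)

def k_eff(pattern):           # placeholder for the neutronics evaluation
    return 1.0 + 0.01 * sum(pattern)

def power_peaking(pattern):   # placeholder for the power peaking factor
    return 1.2 + 0.05 * max(pattern)

def fitness(ind):
    w = decode_weight(ind["bits"])
    # Weighted sum: maximize k_eff, minimize peaking factor.
    return w * k_eff(ind["ints"]) - (1 - w) * power_peaking(ind["ints"])

def random_individual():
    return {"bits": [random.randint(0, 1) for _ in range(W_BITS)],
            "ints": [random.choice(FUEL_TYPES) for _ in range(N_POS)]}

def crossover(a, b):
    cb = random.randint(1, W_BITS - 1)    # one-point crossover on the binary part
    ci = random.randint(1, N_POS - 1)     # one-point crossover on the integer part
    return {"bits": a["bits"][:cb] + b["bits"][cb:],
            "ints": a["ints"][:ci] + b["ints"][ci:]}

def mutate(ind, p=0.05):
    ind["bits"] = [1 - g if random.random() < p else g for g in ind["bits"]]
    ind["ints"] = [random.choice(FUEL_TYPES) if random.random() < p else g
                   for g in ind["ints"]]
    return ind

pop = [random_individual() for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]
best = max(pop, key=fitness)
print(decode_weight(best["bits"]), best["ints"])
```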

  9. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    International Nuclear Information System (INIS)

    Binh, Do Quang; Huy, Ngo Quang; Hai, Nguyen Hoang

    2014-01-01

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a TRIGA MARK II type research reactor loaded with Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.

  10. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  11. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes is designed. The RC-LDPC codes are constructed by a non-greedy puncturing method, which shows good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel is proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves an increasingly large coding gain together with higher throughput.
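
    A minimal sketch of the adaptive rate selection idea behind an ACM scheme (the SNR thresholds and exact rate set are illustrative assumptions, not values from the paper):

```python
# Toy adaptive coded modulation controller: pick the highest RC-LDPC code rate
# whose (assumed) SNR threshold is satisfied by the estimated channel SNR.

RATE_TABLE = [          # (code rate, hypothetical required SNR in dB)
    (5 / 6, 6.0),
    (4 / 5, 5.0),
    (3 / 4, 4.0),
    (2 / 3, 3.0),
]

def select_rate(snr_db):
    for rate, required in RATE_TABLE:   # table sorted from highest rate down
        if snr_db >= required:
            return rate
    return None                         # channel too poor: fall back / request retransmission

for snr in (2.0, 3.5, 5.5, 7.0):
    print(snr, "dB ->", select_rate(snr))
```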

  12. High-level synthesis for reduction of WCET in real-time systems

    DEFF Research Database (Denmark)

    Kristensen, Andreas Toftegaard; Pezzarossa, Luca; Sparsø, Jens

    2017-01-01

    ... Compared to executing the high-level language code on a processor, HLS can be used to create hardware that accelerates critical parts of the code. When discussing performance in the context of real-time systems, it is the worst-case execution time (WCET) of a task that matters. WCET obviously benefits from ... hardware acceleration, but it may also benefit from a tighter bound on the WCET. This paper explores the use and integration of accelerators generated using HLS into a time-predictable processor intended for real-time systems. The high-level design tool, Vivado HLS, is used to generate hardware ...

  13. Implementing a modular system of computer codes

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.

    1983-07-01

    A modular computation system has been developed for nuclear reactor core analysis. The codes can be applied repeatedly in blocks without extensive user input data, as needed for reactor history calculations. The primary control options over the calculational paths and task assignments within the codes are blocked separately from other instructions, admitting ready access by user input instructions or directions from automated procedures and promoting flexible and diverse applications at minimum application cost. Data interfacing is done under formal specifications, with data files manipulated by an informed manager. This report emphasizes the system aspects and the development of useful capability, and should be informative and useful to anyone developing a modular code system of comparable sophistication. Overall, the report summarizes, in a general way, the many factors and difficulties faced in making reactor core calculations, based on the experience of the authors. It provides the background against which work on HTGR reactor physics is being carried out.

  14. Code-modulated interferometric imaging system using phased arrays

    Science.gov (United States)

    Chauhan, Vikas; Greene, Kevin; Floyd, Brian

    2016-05-01

    Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promises to be extremely low cost. In this work, we present techniques which allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power-combine incoming signals prior to digitization, orthogonal code modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
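
    The key property exploited here - that the element-wise product of two orthogonal codes is a third orthogonal code - holds, for example, for Walsh-Hadamard codes. The toy sketch below (an illustration, not the authors' system) modulates two constant signals with Walsh codes, sums them as a shared-hardware phased array would, and recovers their pairwise correlation by demultiplexing with the product code:

```python
# Toy illustration of code-modulated correlation recovery with Walsh codes.
import numpy as np

def walsh(n):
    """Return an n x n Hadamard/Walsh matrix (n must be a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 8
W = walsh(N)
c1, c2 = W[1], W[2]          # two orthogonal codes; their product is another (orthogonal) row
s1, s2 = 0.7, -0.3           # two (constant) antenna signals for simplicity

combined = s1 * c1 + s2 * c2          # code-modulated signals summed in the shared path
squared = combined ** 2               # square-law detection
# The cross term 2*s1*s2*c1*c2 rides on the product code c1*c2; demultiplex with it:
correlation = np.dot(squared, c1 * c2) / (2 * N)
print(correlation, "vs expected", s1 * s2)
```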

  15. Chromatin remodeling: the interface between extrinsic cues and the genetic code?

    Science.gov (United States)

    Ezzat, Shereen

    2008-10-01

    The successful completion of the human genome project ushered in a new era of hope and skepticism. However, the promise of finding the fundamental basis of human traits and diseases appears less than fulfilled. The original premise was that the DNA sequence of every gene would allow precise characterization of critical differences responsible for altered cellular functions. The characterization of intragenic mutations in cancers paved the way for early screening and the design of targeted therapies. However, it has also become evident that unmasking genetic codes alone cannot explain the diversity of disease phenotypes within a population. Further, classic genetics has not been able to explain the differences that have been observed among identical twins or even cloned animals. This new reality has re-ignited interest in the field of epigenetics. While epigenetics has traditionally been defined as heritable changes that can alter gene expression without affecting the corresponding DNA sequence, this definition has come into question. The extent to which epigenetic change can also be acquired in response to chemical stimuli represents an exciting dimension in the "nature vs nurture" debate. In this review I will describe a series of studies in my laboratory that illustrate the significance of epigenetics and its potential clinical implications.

  16. Confidence level in the calculations of HCDA consequences using large codes

    International Nuclear Information System (INIS)

    Nguyen, D.H.; Wilburn, N.P.

    1979-01-01

    The probabilistic approach to nuclear reactor safety is playing an increasingly significant role. For the liquid-metal fast breeder reactor (LMFBR) in particular, the ultimate application of this approach could be to determine the probability of achieving the goal of a specific line-of-assurance (LOA). Meanwhile, a more pressing problem is that of quantifying the uncertainty in a consequence calculated for a hypothetical core disruptive accident (HCDA) using large codes. Such uncertainty arises from imperfect modeling of phenomenology and/or from inaccuracy in input data. A method is presented to determine the confidence level in consequences calculated by a large computer code due to the known uncertainties in input variables. A particular application was made to the initial time of pin failure in a transient overpower HCDA calculated by the code MELT-IIIA in order to demonstrate the method. A probability distribution function (pdf) for the time of failure was first constructed, then the confidence level for predicting this failure parameter within a desired range was determined
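
    A minimal sketch of the general idea - propagating input uncertainties through a calculation, building an empirical distribution for the output, and reading off a confidence level for a desired range. The input distributions and the "model" below are placeholders, not the MELT-IIIA models:

```python
# Toy uncertainty propagation: empirical pdf of a calculated output and the
# confidence level that it falls in a target interval.
import random

def model(reactivity_insertion, gap_conductance):
    # Placeholder for the expensive code calculation (e.g., time of pin failure).
    return 10.0 / reactivity_insertion + 0.002 * gap_conductance

samples = []
for _ in range(10_000):
    # Assumed (illustrative) input uncertainty distributions.
    rho = random.gauss(2.0, 0.2)          # reactivity insertion rate
    h_gap = random.gauss(5000.0, 500.0)   # gap conductance
    samples.append(model(rho, h_gap))

lo, hi = 14.0, 16.0                        # desired range for the failure parameter
confidence = sum(lo <= t <= hi for t in samples) / len(samples)
print(f"P({lo} <= t_fail <= {hi}) ~= {confidence:.3f}")
```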

  17. Optical code division multiple access secure communications systems with rapid reconfigurable polarization shift key user code

    Science.gov (United States)

    Gao, Kaiqiang; Wu, Chongqing; Sheng, Xinzhi; Shang, Chao; Liu, Lanlan; Wang, Jian

    2015-09-01

    An optical code division multiple access (OCDMA) secure communications system scheme with a rapidly reconfigurable polarization shift key (Pol-SK) bipolar user code is proposed and demonstrated. Compared to fixed-code OCDMA, by constantly changing the user code, the anti-eavesdropping performance is greatly improved. A Pol-SK OCDMA experiment with a 10 Gchip/s user code and a 1.25 Gb/s user payload data rate has been realized, which indicates that this scheme is robust and can be readily realized.

  18. Near-Capacity Coding for Discrete Multitone Systems with Impulse Noise

    Directory of Open Access Journals (Sweden)

    Kschischang Frank R

    2006-01-01

    We consider the design of near-capacity-achieving error-correcting codes for a discrete multitone (DMT) system in the presence of both additive white Gaussian noise and impulse noise. Impulse noise is one of the main channel impairments for digital subscriber lines (DSL). One way to combat impulse noise is to detect the presence of the impulses and to declare an erasure when an impulse occurs. In this paper, we propose a coding system based on low-density parity-check (LDPC) codes and bit-interleaved coded modulation that is capable of taking advantage of the knowledge of erasures. We show that by carefully choosing the degree distribution of an irregular LDPC code, both the additive noise and the erasures can be handled by a single code, thus eliminating the need for an outer code. Such a system can perform close to the capacity of the channel and for the same redundancy is significantly more immune to the impulse noise than existing methods based on an outer Reed-Solomon (RS) code. The proposed method has a lower implementation complexity than the concatenated coding approach.

  19. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Science.gov (United States)

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    Clinical coding systems have been developed to translate real-world healthcare information such as prescriptions, diagnoses and procedures into standardized codes appropriate for use in large healthcare datasets. Due to the lack of information on coding system characteristics and insufficient uniformity in coding practices, there is a growing need for better understanding of coding systems and their use in pharmacoepidemiology and observational real world data research. To determine: 1) the number of available coding systems and their characteristics, 2) the pharmacoepidemiology databases in which they are adopted, 3) the outcomes and exposures that can be identified from each coding system, and 4) how robust they are with respect to consistency and validity in pharmacoepidemiology and observational database studies. Electronic literature database and unpublished literature searches, as well as hand searching of relevant journals, were conducted to identify eligible articles discussing characteristics and applications of coding systems in use and published in the English language between 1986 and 2016. Characteristics considered included the type of information captured by codes, clinical setting(s) of use, adoption by a pharmacoepidemiology database, region, and available mappings. Application articles describing the use and validity of specific codes, code lists, or algorithms were also included. Data extraction was performed independently by two reviewers and a narrative synthesis was performed. A total of 897 unique articles and 57 coding systems were identified, 17% of which included country-specific modifications or multiple versions. Procedures (55%), diagnoses (36%), drugs (38%), and site of disease (39%) were most commonly and directly captured by these coding systems. The systems were used to capture information from the following clinical settings: inpatient (63%), ambulatory (55%), emergency department (ED, 34%), and pharmacy (13%). More than half of all coding

  20. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which comprises mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  1. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes which are used in support of a license application for high-level waste disposal. The guidelines cover theoretical basis, programming, and instructions for use of the code

  2. Geometrical Approach to the Grid System in the KOPEC Pilot Code

    International Nuclear Information System (INIS)

    Lee, E. J.; Park, C. E.; Lee, S. Y.

    2008-01-01

    KOPEC has been developing a pilot code to analyze two phase flow. The earlier version of the pilot code adopts a one-dimensional structured mesh system. As the pilot code is required to handle more complex geometries, a systematic geometrical approach to the grid system has been introduced. Grid systems can be classified into two types: structured and unstructured. The structured grid system is simple to apply but is less flexible than the other. The unstructured grid system is more complicated than the structured grid system, but it is more flexible for modeling the geometry. Therefore, both types of grid systems are utilized to offer code users simplicity as well as flexibility
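
    A minimal illustration of the difference between the two grid representations (the data structures are chosen for illustration only; they are not the pilot code's actual layout):

```python
# Structured grid: connectivity is implicit in (i, j) indexing.
import numpy as np

nx, ny = 4, 3
structured_cells = np.zeros((nx, ny))            # neighbor of (i, j) is simply (i+1, j), etc.

def neighbors_structured(i, j):
    """Implicit neighbors of cell (i, j) inside the structured block."""
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < nx and 0 <= b < ny]

# Unstructured grid: connectivity must be stored explicitly.
nodes = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (2.0, 0.5)]
cells = [(0, 1, 2, 3), (1, 4, 2)]                # a quad and a triangle, by node indices
cell_neighbors = {0: [1], 1: [0]}                # explicit face-adjacency list

print(neighbors_structured(0, 0))   # [(1, 0), (0, 1)]
print(cell_neighbors[1])            # [0]
```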

  3. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    To achieve a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. These concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; the application of the System Based Code concepts to design will lead to an entire change of the practices that designers have long been accustomed to. On the other hand, experienced designers are expected to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed with respect to the possibility of improving structural design, in terms of both reliability and cost effectiveness, through the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  4. The standard genetic code and its relation to mutational pressure: robustness and equilibrium criteria

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Martinez Ortiz, Carlos; Sautie Castellanos, Miguel; Valdes, Kiria; Guevara Erra, Ramon

    2004-10-01

    Under the assumption of even point mutation pressure on the DNA strand, rates for transitions from one amino acid into another were assessed. Nearly 25% of all mutations were silent. About 48% of the mutations from a given amino acid lead either to the same amino acid or to an amino acid of the same class. These results suggest a great stability of the Standard Genetic Code with respect to mutation load. Concepts from chemical equilibrium theory are applicable to this case provided that mutation rate constants are given. It was found that unequal synonymous codon usage may lead to changes in the equilibrium concentrations. Data from real biological species showed that several amino acids are close to their respective equilibrium concentrations. However, in all cases the concentration of leucine nearly doubled its equilibrium concentration, whereas for the stop command (Term) it was about 10 times lower. The overall distance from equilibrium for a set of species suggests that eukaryotes are closer to equilibrium than prokaryotes, and the HIV virus was closest to equilibrium among 15 species. We found that contemporary species are closer to the equilibrium than the Last Universal Common Ancestor (LUCA) was. Similarly, non-conserved regions in proteins are closer to equilibrium than the conserved ones. We suggest that this approach can be useful for exploring some aspects of biological evolution in the framework of Standard Genetic Code properties. (author)
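
    The kind of counting behind the silent-mutation figure can be reproduced directly from the standard codon table. The sketch below (an independent illustration, not the authors' code) enumerates all single-nucleotide point mutations of all codons, assumes every substitution is equally likely, and reports the fraction that is silent:

```python
# Fraction of single-nucleotide point mutations that are silent under the
# standard genetic code, assuming even mutation pressure on all substitutions.
BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {a + b + c: AA[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

total = silent = 0
for codon, aa in CODON_TABLE.items():
    for pos in range(3):
        for base in BASES:
            if base == codon[pos]:
                continue                      # not a mutation
            mutant = codon[:pos] + base + codon[pos + 1:]
            total += 1
            if CODON_TABLE[mutant] == aa:     # same translation product: silent
                silent += 1

print(f"{silent}/{total} = {silent / total:.1%} of point mutations are silent")
```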

  5. An Order Coding Genetic Algorithm to Optimize Fuel Reloads in a Nuclear Boiling Water Reactor

    International Nuclear Information System (INIS)

    Ortiz, Juan Jose; Requena, Ignacio

    2004-01-01

    A genetic algorithm is used to optimize the nuclear fuel reload for a boiling water reactor; an order coding is proposed for the chromosomes, together with appropriate crossover and mutation operators. The fitness function was designed so that the genetic algorithm creates fuel reloads that satisfy the constraints on the radial power peaking factor, the minimum critical power ratio, and the maximum linear heat generation rate while optimizing the effective multiplication factor at the beginning and end of the cycle. To obtain the values of these variables, a neural network trained on the behavior of a reactor simulator is used to predict them. The computation time of the search process is therefore greatly reduced. We validated this method with data from five cycles of the Laguna Verde Nuclear Power Plant in Mexico
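
    To illustrate what an order coding means in this context (a sketch only; the fitness below is a stand-in for the neural-network prediction used by the authors), a chromosome is a permutation of fuel assembly identifiers, and the operators must preserve the permutation property:

```python
# Order-coded GA operators: order crossover (OX) and swap mutation on permutations.
import random

ASSEMBLIES = list(range(12))          # fuel assembly identifiers (illustrative)

def fitness(perm):
    # Placeholder for the neural-network surrogate (k_eff, peaking factors, ...).
    return -sum(abs(a - i) for i, a in enumerate(perm))

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]                                  # copy a slice from parent 1
    fill = [g for g in p2 if g not in child]              # remaining genes in parent-2 order
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def swap_mutation(perm, p=0.2):
    perm = perm[:]
    if random.random() < p:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

pop = [random.sample(ASSEMBLIES, len(ASSEMBLIES)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [swap_mutation(order_crossover(*random.sample(elite, 2)))
                   for _ in range(20)]
print(max(pop, key=fitness))
```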

  6. Summary description of the scale modular code system

    International Nuclear Information System (INIS)

    Parks, C.V.

    1987-12-01

    SCALE - a modular code system for Standardized Computer Analyses for Licensing Evaluation - has been developed at Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission staff. The SCALE system utilizes well-established computer codes and methods within standard analytic sequences that allow simplified free-form input, automate the data processing and coupling between codes, and provide accurate and reliable results. System development has been directed at criticality safety, shielding, and heat transfer analysis of spent fuel transport and/or storage casks. However, only a few of the sequences (and none of the individual functional modules) are restricted to cask applications. This report will provide a background on the history of the SCALE development and review the components and their function within the system. The available data libraries are also discussed, together with the automated features that standardize the data processing and systems analysis. 83 refs., 32 figs., 11 tabs

  7. Modular Modeling System (MMS) code: a versatile power plant analysis package

    International Nuclear Information System (INIS)

    Divakaruni, S.M.; Wong, F.K.L.

    1987-01-01

    The basic version of the Modular Modeling System (MMS-01), a power plant systems analysis computer code jointly developed by the Nuclear Power and the Coal Combustion Systems Divisions of the Electric Power Research Institute (EPRI), was released to the utility power industry in April 1983 at a code release workshop held in Charlotte, North Carolina. Since then, additional modules have been developed to analyze Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs) when the safety systems are activated. Also, a selected number of modules in the MMS-01 library have been modified to allow code users more flexibility in constructing plant-specific systems for analysis. These new PWR and BWR modules, together with the modifications to the MMS-01 library, constitute the new MMS library. A year-and-a-half-long extensive code qualification program for this new version of the MMS code at EPRI and contractor sites, backed by further code testing in a user group environment, is culminating in the MMS-02 code release announcement seminar. At this seminar, the results of user group efforts and the code qualification program will be presented in a series of technical sessions. A total of forty-nine papers will be presented to describe the new code features and the code qualification efforts. For the sake of completeness, an overview of the code is presented, including the history of the code development, a description of the MMS code and its structure, utility engineers' involvement in the MMS-01 and MMS-02 validations, the enhancements made to the code in the last 18 months, and finally a perspective on the code's future in the fossil and nuclear industry

  8. Development of computing code system for level 3 PSA

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Yu, Dong Han; Kim, Seung Hwan.

    1997-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated through a wind tunnel experiment. These results will give physical insight into the development of a new dispersion model. Because there are some discrepancies between the results from the Gaussian plume model and those from field tests, the effect of terrain on atmospheric dispersion was investigated using the CTDMPLUS code. Through this study we find that a model which can treat terrain effects is essential for the atmospheric dispersion of radioactive materials and that the CTDMPLUS model can be used as a useful tool. It is suggested that model modification and experimental studies should be pursued through continuous effort. A health effect assessment near the Yonggwang site was performed using IPE (Individual Plant Examination) results and site data. The health effect assessment is an important part of the consequence analysis of a nuclear power plant site. The MACCS code was used in the assessment. Based on the calculation of the CCDF for each risk measure, it is shown that the CCDF has a shallow slope, and thus a wide probability distribution, for early fatality, early injury, total early fatality risk, and total weighted early fatality risk. For cancer fatality and population dose within 48 km and 80 km, the CCDF curve has a steep slope and thus a narrow probability distribution. Methodologies for the models necessary for consequence analysis of a severe accident in a nuclear power plant were established, and a program for consequence analysis was developed. The models include atmospheric transport and diffusion, calculation of exposure doses for various pathways, and assessment of health effects and associated risks. Finally, the economic impact resulting from an accident in a nuclear power plant was investigated. In this study, estimation models for each of the cost terms considered in the economic
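
    As a small illustration of the CCDF (complementary cumulative distribution function) used to present such risk measures (synthetic consequence samples, not MACCS output):

```python
# Build a CCDF, P(X >= x), from synthetic consequence samples.
import bisect
import random

samples = sorted(random.lognormvariate(0.0, 1.0) for _ in range(5000))
n = len(samples)

def ccdf(x):
    """Fraction of samples greater than or equal to x."""
    return (n - bisect.bisect_left(samples, x)) / n

for x in (0.5, 1.0, 2.0, 5.0):
    print(f"P(consequence >= {x}) = {ccdf(x):.3f}")
```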

  9. Development of computing code system for level 3 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Tae; Yu, Dong Han; Kim, Seung Hwan

    1997-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated through a wind tunnel experiment. These results will give physical insight into the development of a new dispersion model. Because there are some discrepancies between the results from the Gaussian plume model and those from field tests, the effect of terrain on atmospheric dispersion was investigated using the CTDMPLUS code. Through this study we find that a model which can treat terrain effects is essential for the atmospheric dispersion of radioactive materials and that the CTDMPLUS model can be used as a useful tool. It is suggested that model modification and experimental studies should be pursued through continuous effort. A health effect assessment near the Yonggwang site was performed using IPE (Individual Plant Examination) results and site data. The health effect assessment is an important part of the consequence analysis of a nuclear power plant site. The MACCS code was used in the assessment. Based on the calculation of the CCDF for each risk measure, it is shown that the CCDF has a shallow slope, and thus a wide probability distribution, for early fatality, early injury, total early fatality risk, and total weighted early fatality risk. For cancer fatality and population dose within 48 km and 80 km, the CCDF curve has a steep slope and thus a narrow probability distribution. Methodologies for the models necessary for consequence analysis of a severe accident in a nuclear power plant were established, and a program for consequence analysis was developed. The models include atmospheric transport and diffusion, calculation of exposure doses for various pathways, and assessment of health effects and associated risks. Finally, the economic impact resulting from an accident in a nuclear power plant was investigated. In this study, estimation models for each of the cost terms considered in the economic

  10. Mapping genetic influences on the corticospinal motor system in humans

    DEFF Research Database (Denmark)

    Cheeran, B J; Ritter, C; Rothwell, J C

    2009-01-01

    It is becoming increasingly clear that genetic variations account for a certain amount of variance in the acquisition and maintenance of different skills. Until now, several levels of genetic influences were examined, ranging from global heritability estimates down to the analysis of the contribution of single nucleotide polymorphisms (SNP) and variable number tandem repeats. In humans, the corticospinal motor system is essential to the acquisition of fine manual motor skills which require a finely tuned coordination of activity in distal forelimb muscles. Here we review recent brain mapping studies that have begun to explore the influence of functional genetic variation as well as mutations on function and structure of the human corticospinal motor system, and also the clinical implications of these studies. Transcranial magnetic stimulation of the primary motor hand area revealed...

  11. Nonterminals and codings in defining variations of OL-systems

    DEFF Research Database (Denmark)

    Skyum, Sven

    1974-01-01

    The use of nonterminals versus the use of codings in variations of OL-systems is studied. It is shown that the use of nonterminals produces a comparatively low generative capacity in deterministic systems while it produces a comparatively high generative capacity in nondeterministic systems. Finally it is proved that the family of context-free languages is contained in the family generated by codings on propagating OL-systems with a finite set of axioms, which was one of the open problems in [10]. All the results in this paper can be found in [71] and [72].

  12. Health effects estimation code development for accident consequence analysis

    International Nuclear Information System (INIS)

    Togawa, O.; Homma, T.

    1992-01-01

    As part of a computer code system for nuclear reactor accident consequence analysis, two computer codes have been developed for estimating health effects expected to occur following an accident. Health effects models used in the codes are based on the models of NUREG/CR-4214 and are revised for the Japanese population on the basis of the data from the reassessment of the radiation dosimetry and information derived from epidemiological studies on atomic bomb survivors of Hiroshima and Nagasaki. The health effects models include early and continuing effects, late somatic effects and genetic effects. The values of some model parameters are revised for early mortality. The models are modified for predicting late somatic effects such as leukemia and various kinds of cancers. The models for genetic effects are the same as those of NUREG. In order to test the performance of one of these codes, it is applied to the U.S. and Japanese populations. This paper provides descriptions of health effects models used in the two codes and gives comparisons of the mortality risks from each type of cancer for the two populations. (author)

  13. Recent developments in the Los Alamos radiation transport code system

    International Nuclear Information System (INIS)

    Forster, R.A.; Parsons, K.

    1997-01-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results

  14. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Velkov, K. [GRS, Garching (Germany)]; Lizorkin, M. [Kurchatov-Institute, Moscow (Russian Federation)] [and others]

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER and LWR reactors is presented. After describing the basic features of the 3D neutronics codes BIPR-8 from Kurchatov-Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of the coupled codes to different transient and accident scenarios are presented. The need for further investigations is discussed.

  15. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    Science.gov (United States)

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  16. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Stefan Bosse

    2015-02-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.
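
    A very small sketch of the activity-transition-graph idea described in the two records above - an agent whose behavior is a set of activities with guarded transitions, carried in a self-contained unit together with its data (names and structure are illustrative, not the platform's actual code format):

```python
# Minimal activity-transition-graph (ATG) style agent: activities are functions,
# transitions are selected by the returned signal. The whole dict is the "container".
def make_sensor_agent(threshold):
    def sense(state):
        state["reading"] = state["inbox"].pop(0) if state["inbox"] else 0
        return "high" if state["reading"] > threshold else "low"

    def report(state):
        print("event detected:", state["reading"])
        return "done"

    def idle(state):
        return "done"

    return {
        "data": {"inbox": [1, 9, 3], "reading": None},   # agent-local data state
        "activities": {"sense": sense, "report": report, "idle": idle},
        "transitions": {("sense", "high"): "report",     # ATG edges: (activity, signal) -> next
                        ("sense", "low"): "idle",
                        ("report", "done"): "sense",
                        ("idle", "done"): "sense"},
        "current": "sense",
    }

agent = make_sensor_agent(threshold=5)
for _ in range(6):                                        # run a few ATG steps
    activity = agent["activities"][agent["current"]]
    signal = activity(agent["data"])
    agent["current"] = agent["transitions"][(agent["current"], signal)]
```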

  17. HTR core physics and transient analyses by the Panthermix code system

    International Nuclear Information System (INIS)

    Haas, J.B.M. de; Kuijper, J.C.; Oppe, J.

    2005-01-01

    At NRG Petten, core physics analyses of High Temperature gas-cooled Reactors (HTRs) are mainly performed by means of the PANTHERMIX code system. For some years NRG has been developing the HTR reactor physics code system WIMS/PANTHERMIX, based on the lattice code WIMS (Serco Assurance, UK), the 3-dimensional steady-state and transient core physics code PANTHER (British Energy, UK) and the 2-dimensional R-Z HTR thermal hydraulics code THERMIX-DIREKT (Research Centre FZJ Juelich, Germany). By means of the WIMS code, nuclear data are generated to suit the PANTHER code's neutronics. At NRG the PANTHER code has been interfaced with THERMIX-DIREKT to form PANTHERMIX, to enable core-follow/fuel management and transient analyses in a consistent manner on pebble bed type HTR systems. Provisions have also been made to simulate the flow of pebbles through the core of a pebble bed HTR, according to a given (R-Z) flow pattern. As examples of the versatility of the PANTHERMIX code system, calculations are presented on the PBMR, the South African pebble bed reactor design, to show the transient capabilities, and on a plutonium-burning MEDUL reactor, to demonstrate the core-follow/fuel management capabilities. For the investigated cases, good agreement is observed with the results of other HTR core physics codes

  18. Free-space optical code-division multiple-access system design

    Science.gov (United States)

    Jeromin, Lori L.; Kaufmann, John E.; Bucher, Edward A.

    1993-08-01

    This paper describes an optical direct-detection multiple access communications system for free-space satellite networks utilizing code-division multiple-access (CDMA) and forward error correction (FEC) coding. System performance is characterized by how many simultaneous users operating at data rate R can be accommodated in a signaling bandwidth W. The performance of two CDMA schemes, optical orthogonal codes (OOC) with FEC and orthogonal convolutional codes (OCC), is calculated and compared to information-theoretic capacity bounds. The calculations include the effects of background and detector noise as well as nonzero transmitter extinction ratio and power imbalance among users. A system design for 10 kbps multiple-access communications between low-earth orbit satellites is given. With near-term receiver technology and representative system losses, a 15 W peak-power transmitter provides 10^-6 BER performance with seven interfering users and full moon background in the receiver FOV. The receiver employs an array of discrete wide-area avalanche photodiodes (APD) for wide field of view coverage. Issues of user acquisition and synchronization, implementation technology, and system scalability are also discussed.

  19. A comparison of thermal algorithms of fuel rod performance code systems

    International Nuclear Information System (INIS)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C.

    2003-11-01

    The goal of fuel rod performance analysis is to verify the robustness of a fuel rod with its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. As preliminary work toward constructing a computing code system for fuel rod performance, several algorithms of existing fuel rod performance code systems are compared and summarized. Among the several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of the above codes are investigated, including methodologies and subroutines. This work will be utilized to construct a computing code system for dry process fuel rod performance

  20. A comparison of thermal algorithms of fuel rod performance code systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C

    2003-11-01

    The goal of fuel rod performance analysis is to verify the robustness of a fuel rod with its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. As preliminary work toward constructing a computing code system for fuel rod performance, several algorithms of existing fuel rod performance code systems are compared and summarized. Among the several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of the above codes are investigated, including methodologies and subroutines. This work will be utilized to construct a computing code system for dry process fuel rod performance.

  1. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
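
    A toy version of the kind of two-parameter cost sweep the record describes (the cost model below is entirely made up for illustration; the real code evaluates beam, transport, and pulsed-power constraints):

```python
# Sweep two design choices and tabulate a toy total-system cost for each pair.
import itertools

rise_times_ns = [20, 40, 60, 80]          # injector voltage rise time (illustrative)
aspect_ratios = [1.5, 2.0, 2.5, 3.0]      # ferrite core aspect ratio (illustrative)

def total_cost(rise_ns, aspect):
    # Made-up cost model: fast rise times drive pulsed-power cost up,
    # large aspect ratios drive core material cost up.
    pulsed_power = 5.0e6 / rise_ns
    core_material = 2.0e4 * aspect ** 2
    mechanical = 1.0e5
    return pulsed_power + core_material + mechanical

costs = {(r, a): total_cost(r, a)
         for r, a in itertools.product(rise_times_ns, aspect_ratios)}
best = min(costs, key=costs.get)
print("cheapest design point:", best, "cost:", round(costs[best]))
```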

  2. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

    Even though there are various source code plagiarism detection approaches, only a few works focus on low-level representation for detecting similarity. Most of them focus only on lexical token sequences extracted from the source code. From our point of view, a low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself. It only considers semantic-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach which relies on low-level representation. As a case study, we focus our work on .NET programming languages with Common Intermediate Language as their low-level representation. In addition, we also incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e., Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient when compared with the standard lexical-token approach.
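
    A compact sketch of local alignment over instruction-token sequences, in the spirit of the Smith-Waterman family that local-alignment similarity detectors build on (the scoring values and tokens are illustrative; this is not the paper's Adaptive Local Alignment algorithm itself):

```python
# Smith-Waterman-style local alignment score between two token sequences.
def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(0, diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
            best = max(best, score[i][j])
    return best

# Hypothetical CIL-like instruction streams from two submissions.
s1 = ["ldarg.0", "ldarg.1", "add", "stloc.0", "ldloc.0", "ret"]
s2 = ["nop", "ldarg.0", "ldarg.1", "add", "stloc.0", "ret"]
print(local_alignment_score(s1, s2))
```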

  3. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    Science.gov (United States)

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
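
    As a small illustration of the tag-generation side of such a system (assuming the third-party Python `qrcode` package; the URL scheme and identifier are made up and not the study's actual design):

```python
# Generate a QR Code Identity Tag image that points to a (hypothetical) record URL.
import qrcode  # third-party package: pip install qrcode[pil]

def make_identity_tag(person_id, out_path):
    # The URL format is an assumption for illustration only.
    url = f"https://example-qr-identity.example/records/{person_id}"
    img = qrcode.make(url)          # returns a PIL image of the QR code
    img.save(out_path)
    return url

print(make_identity_tag("patient-0001", "tag_patient_0001.png"))
```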

  4. An approach based on genetic algorithms with real coding for the solution of a DC OPF for hydrothermal systems; Uma abordagem baseada em algoritmos geneticos com codificacao em real para a solucao de um FPO DC para sistemas hidrotermicos

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, Diego R.; Silva, Alessandro L. da; Luciano, Edson Jose Rezende; Nepomuceno, Leonardo [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. de Engenharia Eletrica], Emails: diego_eng.eletricista@hotmail.com, alessandrolopessilva@uol.com.br, edson.joserl@uol.com.br, leo@feb.unesp.br

    2009-07-01

    DC Optimal Power Flow (OPF) problems have been solved by various conventional optimization methods. When the DC OPF model involves discontinuous or non-differentiable functions, solution methods based on conventional optimization are often not applicable because of the difficulty of calculating gradient vectors at points of discontinuity or non-differentiability of these functions. This paper proposes a method for solving the DC OPF based on Genetic Algorithms (GA) with real coding. The proposed GA has specific genetic operators to improve the quality and feasibility of the solution. The results are analyzed for an IEEE test system, and its solutions are compared, when possible, with those obtained by a primal-dual logarithmic barrier interior point method. The results highlight the robustness of the method and the feasibility of obtaining solutions for real systems.

  5. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.

  6. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  7. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for the optical code-division multiple-access (OCDMA) system is proposed. Based on the ability of complete complementary (CC) code, the multiple-access interference (MAI) can be suppressed and eliminated via spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide grating (AWG) and fiber Bragg grating (FBG). This system has a superior performance as compared to previous bipolar-bipolar coding OCDMA systems.

  8. [Assisted reproduction and artificial insemination and genetic manipulation in the Criminal Code of the Federal District, Mexico].

    Science.gov (United States)

    Brena Sesma, Ingrid

    2004-01-01

    The purpose of this article is to outline and comment on the recent modifications to the Penal Code for the Federal District of Mexico which establish, for the first time, crimes related to artificial procreation and genetic manipulation. It also addresses the interaction of the new legal texts with the country's health legislation. As will be shown, in some cases there are conflicts between the penal and the health regulations, and some points related to the legality or unlawfulness of a given conduct remain insufficiently developed. These gaps will complicate the application of the new rules of the Penal Code of the Federal District.

  9. Validation of the VTT's reactor physics code system

    International Nuclear Information System (INIS)

    Tanskanen, A.

    1998-01-01

    At VTT Energy several international reactor physics codes and nuclear data libraries are used in a variety of applications. The codes and libraries are under constant development, and every now and then new updated versions are released, which are taken into use as soon as they have been validated at VTT Energy. The primary aim of the validation is to ensure that the code works properly and that it can be used correctly. Moreover, the applicability of the codes and libraries is studied in order to establish their advantages and weak points. The capability of generating program-specific nuclear data for different reactor physics codes starting from the same evaluated data is sometimes of great benefit. VTT Energy has acquired a nuclear data processing system based on the NJOY-94.105 and TRANSX-2.15 processing codes. The validity of the processing system has been demonstrated by generating pointwise (MCNP) and groupwise (ANISN) temperature-dependent cross section sets for the benchmark calculations of the Doppler coefficient of reactivity. At VTT Energy the KENO-VI three-dimensional Monte Carlo code is used in criticality safety analyses. The KENO-VI code and the 44GROUPNDF5 data library have been validated at VTT Energy against the ZR-6 and LR-0 critical experiments. Burnup Credit refers to the reduction in reactivity of burned nuclear fuel due to the change in composition during irradiation. VTT Energy has participated in the calculational VVER-440 burnup credit benchmark in order to validate criticality safety calculation tools. (orig.)

  10. V.S.O.P. (99/05) computer code system

    International Nuclear Information System (INIS)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Scherer, W.

    2005-11-01

    V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to HTRs and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. V.S.O.P.(99/05) represents the further development of V.S.O.P. (99). Compared to its precursor, the code system has been improved in many details. Major improvements and extensions have been included concerning the neutron spectrum calculation, the 3-d neutron diffusion options, and the thermal hydraulic section with respect to 'multi-pass'-fuelled pebble bed cores. This latest code version was developed and tested under the WINDOWS-XP operating system. The storage requirement for the executables and the basic libraries associated with the code amounts to about 15 MB. Another 5 MB are required - if desired - for storage of the source code (∼65000 Fortran statements). (orig.)

  11. V.S.O.P. (99/05) computer code system

    Energy Technology Data Exchange (ETDEWEB)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Scherer, W.

    2005-11-01

    V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to HTRs and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. V.S.O.P.(99/05) represents the further development of V.S.O.P. (99). Compared to its precursor, the code system has been improved in many details. Major improvements and extensions have been included concerning the neutron spectrum calculation, the 3-d neutron diffusion options, and the thermal hydraulic section with respect to 'multi-pass'-fuelled pebble bed cores. This latest code version was developed and tested under the WINDOWS-XP operating system. The storage requirement for the executables and the basic libraries associated with the code amounts to about 15 MB. Another 5 MB are required - if desired - for storage of the source code (∼65000 Fortran statements). (orig.)

  12. Development of FBR integrity system code. Basic concept

    International Nuclear Information System (INIS)

    Asayama, Tai

    2001-05-01

    For fast breeder reactors to be commercialized, they must be more reliable, safer, and at the same time, economically competitive with future light water reactors. Innovation of the elevated temperature structural design standard is necessary to achieve this goal. The most powerful way is to enlarge the scope of the structural integrity code to cover items other than the design evaluation addressed in existing codes. Items that must be newly covered are the prerequisites of design, fabrication, examination, operation and maintenance, etc. This allows designers to choose the most economical combination of design variations to achieve the specific reliability that is needed for a particular component. By designing components according to this concept, a cost-minimum design of the whole plant can be realized. By determining the reliability that must be achieved for a component with risk technologies, further economic improvement can be expected by avoiding excessive quality. Recognizing the necessity for codes based on the new concept, the development of the 'FBR integrity system code' began in 2000. Research and development will last 10 years. For this development, the basic logistics and system as well as technologies that materialize the concept are necessary. Original logistics and system must be developed, because no existing research is available inside or outside Japan. This report presents the results of the work done in the first year regarding the basic idea, methodology, and structure of the code. (author)

  13. Design and Analysis of Self-Healing Tree-Based Hybrid Spectral Amplitude Coding OCDMA System

    Directory of Open Access Journals (Sweden)

    Waqas A. Imtiaz

    2017-01-01

    This paper presents an efficient tree-based hybrid spectral amplitude coding optical code division multiple access (SAC-OCDMA) system that is able to provide high capacity transmission along with fault detection and restoration throughout the passive optical network (PON). An enhanced multidiagonal (EMD) code is adapted to elevate the system's performance; it negates multiple access interference and the associated phase-induced intensity noise through an efficient two-matrix structure. Moreover, system connection availability is enhanced through an efficient protection architecture with tree and star-ring topology at the feeder and distribution level, respectively. The proposed hybrid architecture aims to provide seamless transmission of information at minimum cost. A mathematical model based on the Gaussian approximation is developed to analyze the performance of the proposed setup, followed by simulation analysis for validation. It is observed that the proposed system supports 64 subscribers, operating at data rates of 2.5 Gbps and above. Moreover, survivability and cost analysis in comparison with existing schemes show that the proposed tree-based hybrid SAC-OCDMA system provides the required redundancy at minimum cost of infrastructure and operation.
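
    The Gaussian-approximation step of such an analysis reduces, once a signal-to-noise ratio has been computed from the shot, PIIN and thermal noise variances, to evaluating a Gaussian tail probability. The sketch below shows only that step; the SNR expression of the paper is not reproduced.

```python
# Minimal sketch of the Gaussian-approximation step only: given an SNR value,
# the bit error rate follows from the Gaussian tail. The SNR model itself is
# system-specific and is not reproduced here.
import numpy as np
from scipy.special import erfc

def ber_gaussian(snr):
    """BER = Q(sqrt(SNR)) = 0.5*erfc(sqrt(SNR/2)) under the Gaussian approximation."""
    return 0.5 * erfc(np.sqrt(snr / 2.0))

for snr_db in (10, 15, 20):
    snr = 10 ** (snr_db / 10.0)
    print(f"SNR = {snr_db:2d} dB  ->  BER ~ {ber_gaussian(snr):.3e}")
```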

  14. 14 CFR Sec. 1-4 - System of accounts coding.

    Science.gov (United States)

    2010-01-01

    Sec. 1-4 System of accounts coding. (a) A four digit control number is assigned for each balance sheet and profit and loss account. Each balance sheet account is numbered...

  15. SWAT2: The improved SWAT code system by incorporating the continuous energy Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Okuno, Hiroshi

    2003-01-01

    SWAT is a code system which performs burnup calculations by combining the neutronics calculation code SRAC95 and the one-group burnup calculation code ORIGEN2.1. The SWAT code system can deal with the cell geometries available in SRAC95. However, a precise treatment of resonance absorption by the SRAC95 code using the ultra-fine group cross section library is not directly applicable to two- or three-dimensional geometry models, because of restrictions in SRAC95. To overcome this problem, SWAT2 was developed, which newly introduces the continuous energy Monte Carlo code MVP into SWAT. Thereby, continuous-energy burnup calculations in arbitrary geometry became possible. Moreover, using the 147-group cross section library called the SWAT library, reactions which are not dealt with by SRAC95 and MVP can be treated. The OECD/NEA burnup credit criticality safety benchmark problems Phase-IB (PWR, a single pin cell model) and Phase-IIIB (BWR, fuel assembly model) were calculated as a verification of SWAT2, and the results were compared with the averages of the results obtained with the burnup calculation codes of the participating organizations. Through the two benchmark problems, it was confirmed that SWAT2 is applicable to burnup calculations in complicated geometries. (author)
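
    The burnup part of such a coupled calculation advances nuclide densities with one-group reaction rates over each step. The toy depletion step below, with an invented three-nuclide chain and made-up cross sections and flux, illustrates the principle; it does not reproduce SRAC95/MVP/ORIGEN2.1 data.

```python
# Toy depletion step: nuclide densities evolve as dN/dt = A N, where A collects
# one-group transmutation (sigma*phi) and decay terms, advanced with a matrix
# exponential. All numbers below are placeholders, not evaluated data.
import numpy as np
from scipy.linalg import expm

phi = 3.0e14                            # one-group flux [n/cm^2/s] (assumed)
sig_c = np.array([1.0e-24, 50.0e-24])   # capture cross sections of nuclides 0, 1 [cm^2]
lam2 = 1.0e-5                           # decay constant of nuclide 2 [1/s]

# chain: 0 --(capture)--> 1 --(capture)--> 2 --(decay)--> removed
A = np.array([[-sig_c[0] * phi, 0.0,             0.0],
              [ sig_c[0] * phi, -sig_c[1] * phi, 0.0],
              [ 0.0,             sig_c[1] * phi, -lam2]])

N0 = np.array([1.0e22, 0.0, 0.0])       # initial number densities [1/cm^3]
dt = 30 * 24 * 3600.0                   # one 30-day burnup step [s]
N1 = expm(A * dt) @ N0
print("densities after one step:", N1)
```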

  16. Context adaptive coding of bi-level images

    DEFF Research Database (Denmark)

    Forchhammer, Søren

    2008-01-01

    With the advent of sequential arithmetic coding, the focus of highly efficient lossless data compression is placed on modelling the data. Rissanen's Algorithm Context provided an elegant solution to universal coding with optimal convergence rate. Context based arithmetic coding laid the grounds f...
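
    A minimal sketch of the modelling side of such context-based coding is given below: a small causal template indexes per-context adaptive counts, and the ideal adaptive code length is accumulated as -log2 of the estimated probabilities. A real coder would drive an arithmetic coder with these probabilities, and the 4-pixel template is a simplification of the larger templates used in practice.

```python
# Context modelling sketch for a bi-level image: per-context Laplace-smoothed counts
# are updated adaptively while the ideal code length in bits is accumulated.
import numpy as np

def adaptive_code_length(img):
    h, w = img.shape
    counts = np.ones((16, 2))            # counts per 4-bit context, Laplace-smoothed
    bits = 0.0
    for y in range(h):
        for x in range(w):
            # causal template: W, NW, N, NE (out-of-image pixels read as 0)
            nb = [img[y, x - 1] if x > 0 else 0,
                  img[y - 1, x - 1] if x > 0 and y > 0 else 0,
                  img[y - 1, x] if y > 0 else 0,
                  img[y - 1, x + 1] if y > 0 and x + 1 < w else 0]
            ctx = nb[0] | (nb[1] << 1) | (nb[2] << 2) | (nb[3] << 3)
            s = img[y, x]
            p = counts[ctx, s] / counts[ctx].sum()
            bits += -np.log2(p)
            counts[ctx, s] += 1          # adaptive update
    return bits

rng = np.random.default_rng(0)
img = (rng.random((64, 64)) < 0.1).astype(int)   # sparse synthetic bi-level image
print(f"{adaptive_code_length(img):.0f} bits for {img.size} pixels")
```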

  17. HTR core physics and transient analyses by the Panthermix code system

    Energy Technology Data Exchange (ETDEWEB)

    Haas, J.B.M. de; Kuijper, J.C.; Oppe, J. [NRG - Fuels, Actinides and Isotopes group, Petten (Netherlands)

    2005-07-01

    At NRG Petten, core physics analyses on High Temperature gas-cooled Reactors (HTRs) are mainly performed by means of the PANTHERMIX code system. For some years NRG has been developing the HTR reactor physics code system WIMS/PANTHERMIX, based on the lattice code WIMS (Serco Assurance, UK), the 3-dimensional steady-state and transient core physics code PANTHER (British Energy, UK) and the 2-dimensional R-Z HTR thermal hydraulics code THERMIX-DIREKT (Research Centre FZJ Juelich, Germany). By means of the WIMS code, nuclear data are generated to suit the PANTHER code's neutronics. At NRG the PANTHER code has been interfaced with THERMIX-DIREKT to form PANTHERMIX, to enable core-follow/fuel management and transient analyses in a consistent manner for pebble bed type HTR systems. Provisions have also been made to simulate the flow of pebbles through the core of a pebble bed HTR, according to a given (R-Z) flow pattern. As examples of the versatility of the PANTHERMIX code system, calculations are presented on the PBMR, the South African pebble bed reactor design, to show the transient capabilities, and on a plutonium burning MEDUL-reactor, to demonstrate the core-follow/fuel management capabilities. For the investigated cases good agreement is observed with the results of other HTR core physics codes.

  18. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    The fuel reload of a Pressurized Water Reactor is performed whenever the burnup of the fuel assemblies in the reactor core reaches a value such that it is no longer possible to maintain a critical reactor producing energy at nominal power. The fuel reload optimization problem consists of determining the positioning of the fuel assemblies within the reactor core in an optimized way that minimizes the cost-benefit relationship of fuel assembly cost per maximum burnup, while also satisfying symmetry and safety constraints. The difficulty of the fuel reload optimization problem grows exponentially with the number of fuel assemblies in the core. For decades the fuel reload optimization problem was solved manually by experts who used their knowledge and experience to build configurations of the reactor core and tested them to verify whether the safety restrictions of the plant were satisfied. To reduce this burden, several optimization techniques have been used, including the binary-coded genetic algorithm. In this work we show the use of a real-valued coded approach to the genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys, which transforms the real values of the genes of each chromosome into a combination of discrete fuel assemblies for evaluation of the reload. Four different recombination methods were tested: discrete recombination, intermediate recombination, linear recombination and extended linear recombination. For each of the four recombination methods, 10 tests using different seeds for the random number generator were conducted, totaling 40 tests. The results of applying the real-coded genetic algorithm are shown for the fuel reload problem of the Angra 1 PWR plant. Since the best results in the literature for this problem were obtained with the parallel PSO, it is used here for comparison
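
    The random-keys transformation mentioned above can be illustrated as follows: each chromosome is a vector of real genes in [0,1), and sorting the genes yields a permutation that is read as a discrete fuel-assembly loading, so that real-valued recombination always produces feasible loadings. The sketch below uses an invented problem size and illustrates intermediate recombination only.

```python
# Random-keys sketch: real genes are sorted to obtain a permutation of assemblies;
# recombination acts on the real keys, never on the discrete permutation.
import numpy as np

rng = np.random.default_rng(42)
n_positions = 8                                   # toy problem size (assumed)

def decode(keys):
    """Random-keys decoding: argsort of the real genes gives the loading order."""
    return np.argsort(keys)                       # fuel assembly indices per position

parent_a = rng.random(n_positions)
parent_b = rng.random(n_positions)

alpha = rng.random(n_positions)                   # intermediate recombination weights
child = alpha * parent_a + (1 - alpha) * parent_b # still a valid real-coded chromosome

print("parent A loading:", decode(parent_a))
print("parent B loading:", decode(parent_b))
print("child loading   :", decode(child))
```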

  19. Coding training for medical students: How good is diagnoses coding with ICD-10 by novices?

    Directory of Open Access Journals (Sweden)

    Stausberg, Jürgen

    2005-04-01

    Teaching of knowledge and competence in documentation and coding is an essential part of medical education. Therefore, coding training was placed within the course of epidemiology, medical biometry, and medical informatics. From this, we can draw conclusions about the quality of coding by novices. One hundred and eighteen students coded diagnoses from 15 nephrological cases as homework. In addition to interrater reliability, validity was calculated by comparison with a reference coding. On the level of terminal codes, 59.3% of the students' results were correct. The completeness was calculated as 58.0%. The results on the chapter level increased to 91.5% and 87.7%, respectively. For the calculation of reliability a new, simple measure was developed that leads to values of 0.46 on the level of terminal codes and 0.87 on the chapter level for interrater reliability. The figures for concordance with the reference coding are quite similar. In contrast, routine data show considerably lower results, with 0.34 and 0.63 respectively. Interrater reliability and validity of coding by novices is as good as coding by experts. The lack of an advantage for experts could be explained by the workload of documentation and a negative attitude to coding on the one hand. On the other hand, coding in a DRG system is handicapped by a large number of detailed coding rules, which do not lead to uniform results but rather to wrong and random codes. In any case, students left the course well prepared for coding.
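
    The two evaluation levels described above can be illustrated with a small sketch: agreement with a reference coding is computed on the full (terminal) ICD-10 code and, more coarsely, at the chapter level, here approximated by the first character of the code. The example codes are made up.

```python
# Agreement of a coder's ICD-10 codes with a reference coding, at two levels.
def agreement(student, reference):
    exact = sum(s == r for s, r in zip(student, reference)) / len(reference)
    # chapter level approximated here by the first character of the code
    chapter = sum(s[0] == r[0] for s, r in zip(student, reference)) / len(reference)
    return exact, chapter

reference = ["N18.5", "I10", "E11.9", "N04.1"]    # hypothetical reference coding
student   = ["N18.4", "I10", "E11.9", "N05.1"]    # hypothetical student coding

exact, chapter = agreement(student, reference)
print(f"terminal-code agreement: {exact:.2f}, chapter-level agreement: {chapter:.2f}")
```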

  20. Particle and heavy ion transport code system; PHITS

    International Nuclear Information System (INIS)

    Niita, Koji

    2004-01-01

    Intermediate and high energy nuclear data are strongly required in design studies of many facilities such as accelerator-driven systems and intense pulsed spallation neutron sources, and also in medical and space technology. There are, however, few evaluated nuclear data for intermediate and high energy nuclear reactions. Therefore, we have to use models or systematics for the cross sections, which are essential ingredients of a high energy particle and heavy ion transport code used to estimate neutron yield, heat deposition and many other quantities of the transport phenomena in materials. We have developed the general purpose particle and heavy ion transport Monte Carlo code system PHITS (Particle and Heavy Ion Transport code System), based on the NMTC/JAM code, through the collaboration of Tohoku University, JAERI and RIST. PHITS has three important ingredients which enable us to calculate (1) high energy nuclear reactions up to 200 GeV, (2) heavy ion collisions and their transport in material, and (3) low energy neutron transport based on evaluated nuclear data. In PHITS, the cross sections of high energy nuclear reactions are obtained with the JAM model. JAM (Jet AA Microscopic Transport Model) is a hadronic cascade model, which explicitly treats all established hadronic states including resonances, with all hadron-hadron cross sections parametrized based on the resonance model and string model by fitting the available experimental data. PHITS can describe the transport of heavy ions and their collisions by making use of the JQMD and SPAR codes. JQMD (JAERI Quantum Molecular Dynamics) is a simulation code for nucleus-nucleus collisions based on molecular dynamics. The SPAR code is widely used to calculate the stopping powers and ranges of charged particles and heavy ions. PHITS also includes part of the MCNP4C code, by which the transport of low energy neutrons, photons and electrons based on evaluated nuclear data can be described. Furthermore, the high energy nuclear
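
    At the core of any such Monte Carlo transport system is the sampling of free-flight distances from an exponential distribution. The minimal sketch below compares the sampled uncollided fraction through a slab with the analytic attenuation, using arbitrary illustration values.

```python
# Minimal Monte Carlo transport kernel: free-flight lengths are sampled with the
# total macroscopic cross section, and the uncollided fraction through a slab is
# compared with exp(-Sigma_t * d). Cross section and thickness are illustrative.
import numpy as np

rng = np.random.default_rng(1)
sigma_t = 0.5          # total macroscopic cross section [1/cm] (assumed)
thickness = 4.0        # slab thickness [cm]
n = 1_000_000

flight = -np.log(1.0 - rng.random(n)) / sigma_t  # sampled free-flight lengths
uncollided = np.mean(flight > thickness)

print(f"Monte Carlo : {uncollided:.5f}")
print(f"Analytic    : {np.exp(-sigma_t * thickness):.5f}")
```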

  1. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific work-stations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As the personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines

  2. Application of startup/core management code system to YGN 3 startup testing

    International Nuclear Information System (INIS)

    Chi, Sung Goo; Hah, Yung Joon; Doo, Jin Yong; Kim, Dae Kyum

    1995-01-01

    YGN 3 is the first nuclear power plant in Korea to use the fixed incore detector system for startup testing and core management. The startup/core management code system was developed from existing ABB-C-E codes and applied for YGN 3 startup testing, especially for physics and CPC(Core Protection Calculator)/COLSS (Core Operating Limit Supervisory System) related testing. The startup/core management code system consists of startup codes which include the CEBASE, CECOR, CEFAST and CEDOPS, and startup data reduction codes which include FLOWRATE, COREPERF, CALMET, and VARTAV. These codes were implemented on an HP/Apollo model 9000 series 400 workstation at the YGN 3 site and successfully applied to startup testing and core management. The startup codes made a great contribution in upgrading the reliability of test results and reducing the test period by taking and analyzing core data automatically. The data reduction code saved the manpower and time for test data reduction and decreased the chance for error in the analysis. It is expected that this code system will make similar contributions for reducing the startup testing duration of YGN 4 and UCN3,4

  3. Discovery of Proteomic Code with mRNA Assisted Protein Folding

    Directory of Open Access Journals (Sweden)

    Jan C. Biro

    2008-12-01

    The 3x redundancy of the Genetic Code is usually explained as a necessity to increase the mutation resistance of the genetic information. However, recent bioinformatical observations indicate that the redundant Genetic Code contains more biological information than previously known, additional to the 64/20 definition of amino acids. It might define the physico-chemical and structural properties of amino acids, the codon boundaries, the amino acid co-locations (interactions) in the coded proteins, and the free folding energy of mRNAs. This additional information, which seems to be necessary to determine the 3D structure of coding nucleic acids as well as of the coded proteins, is known as the Proteomic Code and mRNA Assisted Protein Folding.

  4. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  5. Performance analysis of 2D asynchronous hard-limiting optical code-division multiple access system through atmospheric scattering channel

    Science.gov (United States)

    Zhao, Yaqin; Zhong, Xin; Wu, Di; Zhang, Ye; Ren, Guanghui; Wu, Zhilu

    2013-09-01

    Optical code-division multiple access (OCDMA) systems usually allocate orthogonal or quasi-orthogonal codes to the active users. When transmitting through an atmospheric scattering channel, the coding pulses are broadened and the orthogonality of the codes is degraded. In the truly asynchronous case, namely when both the chips and the bits are asynchronous among the active users, the pulse broadening significantly affects the system performance. In this paper, we evaluate the performance of a 2D asynchronous hard-limiting wireless OCDMA system through an atmospheric scattering channel. The probability density function of the multiple access interference in the truly asynchronous case is given. The bit error rate decreases as the ratio of the chip period to the root mean square delay spread increases, and the channel limits the bit rate to different levels as the chip period varies.
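
    The quantity identified above as decisive, the ratio of the chip period to the root-mean-square delay spread, can be computed as sketched below for an assumed exponential power-delay profile, which stands in for the actual atmospheric-scattering channel model of the paper.

```python
# RMS delay spread of an assumed exponential power-delay profile, and its ratio
# to an example chip period. All values are illustrative.
import numpy as np

tau = np.linspace(0.0, 50e-9, 500)          # delay axis [s]
pdp = np.exp(-tau / 10e-9)                  # assumed exponential power-delay profile

mean_delay = np.sum(tau * pdp) / np.sum(pdp)
rms_spread = np.sqrt(np.sum((tau - mean_delay) ** 2 * pdp) / np.sum(pdp))

chip_period = 1e-9                          # example chip period [s]
print(f"RMS delay spread: {rms_spread * 1e9:.2f} ns, "
      f"chip period / spread = {chip_period / rms_spread:.3f}")
```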

  6. Marburg Virus Reverse Genetics Systems

    Directory of Open Access Journals (Sweden)

    Kristina Maria Schmidt

    2016-06-01

    The highly pathogenic Marburg virus (MARV) is a member of the Filoviridae family and belongs to the group of nonsegmented negative-strand RNA viruses. Reverse genetics systems established for MARV have been used to study various aspects of the viral replication cycle, analyze host responses, image viral infection, and screen for antivirals. This article provides an overview of the currently established MARV reverse genetic systems based on minigenomes, infectious virus-like particles and full-length clones, and the research that has been conducted using these systems.

  7. Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)

    Science.gov (United States)

    2001-04-01

    This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...

  8. Sub-channel/system coupled code development and its application to SCWR-FQT loop

    International Nuclear Information System (INIS)

    Liu, X.J.; Cheng, X.

    2015-01-01

    Highlights: • A coupled code is developed for SCWR accident simulation. • The feasibility of the code is shown by application to SCWR-FQT loop. • Some measures are selected by sensitivity analysis. • The peak cladding temperature can be reduced effectively by the proposed measures. - Abstract: In the frame of the Super-Critical Reactor In Pipe Test Preparation (SCRIPT) project in China, one of the challenging tasks is to predict the transient performance of the SuperCritical Water Reactor-Fuel Qualification Test (SCWR-FQT) loop under some accident conditions. Several thermal–hydraulic codes (system code, sub-channel code) are selected to perform the safety analysis. However, the system code cannot simulate the local behavior of the test bundle, and the sub-channel code is incapable of calculating the whole system behavior of the test loop. Therefore, to combine the merits of both codes and minimize their shortcomings, a coupled sub-channel and system code system is developed in this paper. Both the sub-channel code COBRA-SC and the system code ATHLET-SC are adapted to transient analysis of the SCWR. The two codes are coupled by data transfer and data adaptation at the interface. In the newly developed coupled code, the whole system behavior including the safety system characteristics is analyzed by the system code ATHLET-SC, whereas the local thermal–hydraulic parameters are predicted by the sub-channel code COBRA-SC. The codes are used to obtain the local thermal–hydraulic parameters in the SCWR-FQT fuel bundle under selected accident cases (e.g. a flow blockage during LOCA). Some measures to mitigate the accident consequences are proposed based on the sensitivity study and tested to demonstrate their effectiveness in the coupled simulation. The results indicate that the newly developed code is well suited to transient analysis of the supercritical water-cooled test, and that the peak cladding temperature caused by blockage in the fuel bundle can be reduced effectively by the safety measures
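
    The explicit coupling strategy described above can be pictured with the schematic below, in which stand-in functions (not ATHLET-SC or COBRA-SC) exchange boundary conditions and feedback once per time step across the interface.

```python
# Schematic explicit coupling loop: a "system" solver hands boundary conditions
# (inlet flow, pressure) to a "sub-channel" solver, which returns local bundle
# feedback used by the system side at the next step. Both functions are placeholders.
def system_step(t, dt, bundle_feedback):
    # placeholder for the system-code step (whole-loop thermal hydraulics)
    inlet_flow = 1.0 - 0.5 * min(t, 1.0)          # e.g., a flow reduction transient
    inlet_pressure = 25.0e6 - 1.0e5 * bundle_feedback
    return {"flow": inlet_flow, "pressure": inlet_pressure}

def subchannel_step(t, dt, boundary):
    # placeholder for the sub-channel step (local bundle behaviour)
    outlet_enthalpy_rise = 0.2 / max(boundary["flow"], 1e-3)
    return outlet_enthalpy_rise

t, dt, feedback = 0.0, 0.05, 0.0
while t < 2.0:
    boundary = system_step(t, dt, feedback)        # system -> sub-channel data transfer
    feedback = subchannel_step(t, dt, boundary)    # sub-channel -> system data transfer
    t += dt
print(f"final bundle feedback: {feedback:.3f}")
```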

  9. Sub-channel/system coupled code development and its application to SCWR-FQT loop

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.J., E-mail: xiaojingliu@sjtu.edu.cn [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dong Chuan Road, Shanghai 200240 (China); Cheng, X. [Institute of Fusion and Reactor Technology, Karlsruhe Institute of Technology, Vincenz-Prießnitz-Str. 3, 76131 Karlsruhe (Germany)

    2015-04-15

    Highlights: • A coupled code is developed for SCWR accident simulation. • The feasibility of the code is shown by application to SCWR-FQT loop. • Some measures are selected by sensitivity analysis. • The peak cladding temperature can be reduced effectively by the proposed measures. - Abstract: In the frame of the Super-Critical Reactor In Pipe Test Preparation (SCRIPT) project in China, one of the challenging tasks is to predict the transient performance of the SuperCritical Water Reactor-Fuel Qualification Test (SCWR-FQT) loop under some accident conditions. Several thermal–hydraulic codes (system code, sub-channel code) are selected to perform the safety analysis. However, the system code cannot simulate the local behavior of the test bundle, and the sub-channel code is incapable of calculating the whole system behavior of the test loop. Therefore, to combine the merits of both codes and minimize their shortcomings, a coupled sub-channel and system code system is developed in this paper. Both the sub-channel code COBRA-SC and the system code ATHLET-SC are adapted to transient analysis of the SCWR. The two codes are coupled by data transfer and data adaptation at the interface. In the newly developed coupled code, the whole system behavior including the safety system characteristics is analyzed by the system code ATHLET-SC, whereas the local thermal–hydraulic parameters are predicted by the sub-channel code COBRA-SC. The codes are used to obtain the local thermal–hydraulic parameters in the SCWR-FQT fuel bundle under selected accident cases (e.g. a flow blockage during LOCA). Some measures to mitigate the accident consequences are proposed based on the sensitivity study and tested to demonstrate their effectiveness in the coupled simulation. The results indicate that the newly developed code is well suited to transient analysis of the supercritical water-cooled test, and that the peak cladding temperature caused by blockage in the fuel bundle can be reduced effectively by the safety measures

  10. Development of the vacuum system pressure response analysis code PRAC

    International Nuclear Information System (INIS)

    Horie, Tomoyoshi; Kawasaki, Kouzou; Noshiroya, Shyoji; Koizumi, Jun-ichi.

    1985-03-01

    In this report, we present the method and numerical results of the vacuum system pressure response analysis code. Since a fusion apparatus is made up of many vacuum components, it is required to analyze the pressure response at any point of the system when the vacuum system is designed or evaluated. For that purpose, evaluation by theoretical solutions alone is insufficient, and a numerical analysis procedure such as the finite difference method is useful. In the PRAC code (Pressure Response Analysis Code), the pressure response is obtained by solving differential equations which are derived from the equilibrium relation of throughputs and contain the time derivative of pressure. As the code considers both molecular and viscous flows, the coefficients of the equations depend on the pressure and the equations become non-linear. This non-linearity is treated as piece-wise linear within each time step. Verification of the code is performed for simple problems. The agreement between numerical and theoretical solutions is good. To compare with measured results, a complicated model of a gas puffing system is analyzed. The agreement is good enough for practical use. This code will be a useful analytical tool for designing and evaluating vacuum systems such as fusion apparatus. (author)
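
    A toy version of such a pressure-response calculation is sketched below: throughput balances give dP/dt for two volumes, the interconnecting conductance depends on pressure (molecular vs. viscous regimes), and the nonlinearity is treated as piecewise linear by freezing the conductance within each time step. All numbers are illustrative, not PRAC data.

```python
# Two-volume pressure response from throughput balances, explicit time stepping
# with the pressure-dependent conductance frozen over each step.
import numpy as np

V1, V2 = 100.0, 50.0         # volumes [l]
Q1 = 1.0e-3                  # gas load into volume 1 [Pa*l/s]
S2 = 500.0                   # pumping speed on volume 2 [l/s]
C_mol, k_visc = 20.0, 2.0e3  # conductance parameters (assumed)

def conductance(p_mean):
    # crude regime blend: constant molecular term plus pressure-proportional viscous term
    return C_mol + k_visc * p_mean

P = np.array([1.0e-4, 1.0e-5])       # initial pressures [Pa]
dt, t_end = 1.0e-3, 20.0
for _ in range(int(t_end / dt)):
    C = conductance(P.mean())         # frozen over the step (piecewise-linear treatment)
    dP1 = (Q1 - C * (P[0] - P[1])) / V1
    dP2 = (C * (P[0] - P[1]) - S2 * P[1]) / V2
    P += dt * np.array([dP1, dP2])
print(f"pressures after {t_end:.0f} s: P1 = {P[0]:.3e} Pa, P2 = {P[1]:.3e} Pa")
```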

  11. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey

    OpenAIRE

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospi...
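
    Generating such a tag is straightforward with standard tooling; the sketch below assumes the third-party Python package "qrcode", and the payload format (an ID plus an emergency-information URL) is entirely hypothetical, not taken from the cited system.

```python
# Minimal QR tag generation sketch using the third-party "qrcode" package.
# The payload layout below is hypothetical.
import qrcode

payload = "PATIENT-ID:TR-000123;INFO:https://example.org/medical-alert/TR-000123"
img = qrcode.make(payload)          # returns an image of the QR symbol
img.save("identity_tag.png")        # e.g., printed onto a bracelet or ID card
```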

  12. Impact of the Level of State Tax Code Progressivity on Children's Health Outcomes

    Science.gov (United States)

    Granruth, Laura Brierton; Shields, Joseph J.

    2011-01-01

    This research study examines the impact of the level of state tax code progressivity on selected children's health outcomes. Specifically, it examines the degree to which a state's tax code ranking along the progressive-regressive continuum relates to percentage of low birthweight babies, infant and child mortality rates, and percentage of…

  13. Genetic variability in five species of Anostomidae (Ostariophysi - Characiformes

    Directory of Open Access Journals (Sweden)

    Chiari Lucimara

    1999-01-01

    Genetic variability was studied in five fish species (Anostomidae: Schizodon intermedius, S. nasutus, Leporinus friderici, L. elongatus and L. obtusidens) collected at one location on the Tibagi River (Paraná, Brazil). The protein data from seven systems coded collectively for 19 loci in the liver, muscle and heart. Nine of these loci were polymorphic. The estimated proportion of polymorphic loci varied from 16.7% in S. intermedius to 36.9% in L. friderici; the mean observed heterozygosity was 0.027 ± 0.015 and 0.109 ± 0.042, respectively. The estimated values of the genetic identity between L. friderici and S. intermedius (0.749) and S. nasutus (0.787) suggested that these are "congeneric" species. Morphological characteristics indicate that these species belong to distinct genera, while isoenzymatic data show that they are very similar at the genetic/biochemical level.
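
    The two quantities reported above, observed heterozygosity and Nei's genetic identity, can be computed as in the sketch below from made-up allele data for a single locus; the study itself averaged such values over the 19 isozyme loci.

```python
# Observed heterozygosity from genotypes, and Nei's genetic identity from allele
# frequencies of two populations, for one locus with invented data.
import numpy as np

# genotypes at one locus for a sample of one species (hypothetical)
genotypes = [("A", "A"), ("A", "B"), ("B", "B"), ("A", "B"), ("A", "A")]
Ho = sum(a != b for a, b in genotypes) / len(genotypes)   # observed heterozygosity

# allele frequencies of two populations at the same locus (hypothetical)
x = np.array([0.7, 0.3])          # population X
y = np.array([0.4, 0.6])          # population Y
I = np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2))  # Nei's genetic identity

print(f"observed heterozygosity: {Ho:.2f}, Nei identity: {I:.3f}")
```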

  14. Thermal-Hydraulic System Codes in Nulcear Reactor Safety and Qualification Procedures

    Directory of Open Access Journals (Sweden)

    Alessandro Petruzzi

    2008-01-01

    In the last four decades, large efforts have been undertaken to provide reliable thermal-hydraulic system codes for the analyses of transients and accidents in nuclear power plants. Whereas the first system codes, developed at the beginning of the 1970s, utilized the homogeneous equilibrium model with three balance equations to describe the two-phase flow, nowadays the more advanced system codes are based on the so-called “two-fluid model” with separation of the water and vapor phases, resulting in systems with at least six balance equations. The wide experimental campaign, constituted by the integral and separate effect tests conducted under the umbrella of the OECD/CSNI, formed the basis of the development and validation of the thermal-hydraulic system codes, by which they have reached their present high degree of maturity. However, notwithstanding the huge amounts of financial and human resources invested, the results predicted by the codes are still affected by errors whose origins can be attributed to several causes, such as model deficiencies, approximations in the numerical solution, nodalization effects, and imperfect knowledge of boundary and initial conditions. In this context, the existence of qualified procedures for a consistent application of a qualified thermal-hydraulic system code is necessary and implies the drawing up of specific criteria through which the code user, the nodalization, and finally the transient results are qualified.

  15. SRAC95; general purpose neutronics code system

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Tsuchihashi, Keichiro; Kaneko, Kunio.

    1996-03-01

    SRAC is a general purpose neutronics code system applicable to core analyses of various types of reactors. Since the publication of JAERI-1302 for the revised SRAC in 1986, a number of additions and modifications have been made for nuclear data libraries and programs. Thus, the new version SRAC95 has been completed. The system consists of six kinds of nuclear data libraries(ENDF/B-IV, -V, -VI, JENDL-2, -3.1, -3.2), five modular codes integrated into SRAC95; collision probability calculation module (PIJ) for 16 types of lattice geometries, Sn transport calculation modules(ANISN, TWOTRAN), diffusion calculation modules(TUD, CITATION) and two optional codes for fuel assembly and core burn-up calculations(newly developed ASMBURN, revised COREBN). In this version, many new functions and data are implemented to support nuclear design studies of advanced reactors, especially for burn-up calculations. SRAC95 is available not only on conventional IBM-compatible computers but also on scalar or vector computers with the UNIX operating system. This report is the SRAC95 users manual which contains general description, contents of revisions, input data requirements, detail information on usage, sample input data and list of available libraries. (author)

  16. SRAC95; general purpose neutronics code system

    Energy Technology Data Exchange (ETDEWEB)

    Okumura, Keisuke; Tsuchihashi, Keichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-03-01

    SRAC is a general purpose neutronics code system applicable to core analyses of various types of reactors. Since the publication of JAERI-1302 for the revised SRAC in 1986, a number of additions and modifications have been made for nuclear data libraries and programs. Thus, the new version SRAC95 has been completed. The system consists of six kinds of nuclear data libraries(ENDF/B-IV, -V, -VI, JENDL-2, -3.1, -3.2), five modular codes integrated into SRAC95; collision probability calculation module (PIJ) for 16 types of lattice geometries, Sn transport calculation modules(ANISN, TWOTRAN), diffusion calculation modules(TUD, CITATION) and two optional codes for fuel assembly and core burn-up calculations(newly developed ASMBURN, revised COREBN). In this version, many new functions and data are implemented to support nuclear design studies of advanced reactors, especially for burn-up calculations. SRAC95 is available not only on conventional IBM-compatible computers but also on scalar or vector computers with the UNIX operating system. This report is the SRAC95 users manual which contains general description, contents of revisions, input data requirements, detail information on usage, sample input data and list of available libraries. (author).

  17. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes. For example, SETS and FTAP are used to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. There have been problems when these codes are run together, since their inputs and outputs are not linked, which could result in errors when preparing the input for each code. The code MODULE was developed to carry out the above calculations simultaneously without linking inputs and outputs to other codes. MODULE can also prepare input for SETS for the case of a large fault tree that cannot be handled by MODULE. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected and the results and computation times were compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater system (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over other codes, and that it can be used on personal computers
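
    The first task listed above, cut set determination, is illustrated by the toy Boolean expansion below for a small fault tree; production codes such as SETS or MODULE use far more efficient algorithms on much larger trees.

```python
# Minimal cut sets of a small fault tree by direct Boolean expansion.
def cut_sets(node):
    kind = node[0]
    if kind == "event":
        return [frozenset([node[1]])]
    children = [cut_sets(c) for c in node[1]]
    if kind == "or":                       # union of the children's cut sets
        sets = [s for c in children for s in c]
    else:                                  # "and": cross-product of the children's cut sets
        sets = [frozenset()]
        for c in children:
            sets = [s | t for s in sets for t in c]
    sets = list(set(sets))                 # drop duplicates
    return [s for s in sets if not any(t < s for t in sets)]   # keep only minimal sets

E = lambda name: ("event", name)
tree = ("or", [("and", [E("pump_fails"), E("valve_fails")]),
               ("and", [E("pump_fails"), E("power_lost")]),
               E("tank_ruptures")])
print(cut_sets(tree))
```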

  18. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  19. Genetic and environmental influences of surfactant protein D serum levels

    DEFF Research Database (Denmark)

    Sorensen, G.L.; Hjelmborg, J.V.; Kyvik, K.O.

    2006-01-01

    defining the constitutional serum level of SP-D and determine the magnitude of the genetic contribution to serum SP-D in the adult population. Recent studies have demonstrated that serum SP-D concentrations in children are genetically determined and that a single nucleotide polymorphism (SNP) located...... in the NH(2)-terminal region (Met11Thr) of the mature protein is significantly associated with the serum SP-D levels. A classic twin study was performed on a twin population including 1,476 self-reported healthy adults. The serum SP-D levels increased with male sex, age, and smoking status. The intraclass...
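
    The classic twin design mentioned above partitions variance using the intraclass correlations of monozygotic (MZ) and dizygotic (DZ) pairs; the sketch below applies Falconer's formulas to made-up correlations (the study's own estimates are not reproduced here).

```python
# Falconer's variance decomposition from twin intraclass correlations (made-up values).
r_mz, r_dz = 0.72, 0.40          # hypothetical MZ and DZ intraclass correlations
h2 = 2 * (r_mz - r_dz)           # additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz             # shared-environment variance
e2 = 1 - r_mz                    # unique-environment variance (incl. measurement error)
print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
```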

  20. Genetic and environmental influences of surfactant protein D serum levels

    DEFF Research Database (Denmark)

    Sørensen, Grith Lykke; Hjelmborg, Jacob v. B.; Kyvik, Kirsten Ohm

    2006-01-01

    defining the constitutional serum level of SP-D and determine the magnitude of the genetic contribution to serum SP-D in the adult population. Recent studies have demonstrated that serum SP-D concentrations in children are genetically determined and that a single nucleotide polymorphism (SNP) located...... in the NH(2)-terminal region (Met11Thr) of the mature protein is significantly associated with the serum SP-D levels. A classic twin study was performed on a twin population including 1,476 self-reported healthy adults. The serum SP-D levels increased with male sex, age, and smoking status. The intraclass...

  1. Coupled CFD - system-code simulation of a gas cooled reactor

    International Nuclear Information System (INIS)

    Yan, Yizhou; Rizwan-uddin

    2011-01-01

    A generic coupled CFD - system-code thermal hydraulic simulation approach was developed based on FLUENT and RELAP-3D, and applied to LWRs. The flexibility of the coupling methodology enables its application to advanced nuclear energy systems. Gas Turbine - Modular Helium Reactor (GT-MHR) is a Gen IV reactor design which can benefit from this innovative coupled simulation approach. Mixing in the lower plenum of the GT-MHR is investigated here using the CFD - system-code coupled simulation tool. Results of coupled simulations are presented and discussed. The potential of the coupled CFD - system-code approach for next generation of nuclear power plants is demonstrated. (author)

  2. A modified carrier-to-code leveling method for retrieving ionospheric observables and detecting short-term temporal variability of receiver differential code biases

    Science.gov (United States)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min

    2018-03-01

    Sensing the ionosphere with the global positioning system involves two sequential tasks, namely the ionospheric observable retrieval and the ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL), a method commonly used to accomplish the first task, through assuming rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among others, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by rDCB of a single receiver.
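
    The leveling step that CCL and MCCL share can be sketched as below with synthetic series: within a continuous arc, the precise but ambiguous geometry-free carrier series is shifted onto the noisy but unambiguous geometry-free code series by the arc-averaged difference. MCCL, as described above, instead estimates that offset epoch-wise relative to a reference epoch, so that rDCB variations are not absorbed into the ionospheric observable.

```python
# Carrier-to-code leveling sketch with synthetic geometry-free series: the phase
# arc is leveled by the arc-averaged (code + phase) offset. Signs and magnitudes
# are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 120
iono = 5.0 + 2.0 * np.sin(np.linspace(0, np.pi, n))   # "true" ionospheric delay

gf_code = iono + rng.normal(0.0, 0.5, n)               # geometry-free code: unbiased, noisy
gf_phase = -iono + 42.7 + rng.normal(0.0, 0.02, n)     # geometry-free phase: precise, ambiguous

offset = np.mean(gf_code + gf_phase)                   # arc-averaged leveling offset
leveled = -(gf_phase - offset)                         # leveled ionospheric observable

print(f"leveling offset: {offset:.2f}, RMS error: {np.std(leveled - iono):.3f}")
```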

  3. Project of decree relative to the licensing and statement system of nuclear activities and to their control and bearing various modifications of the public health code and working code

    International Nuclear Information System (INIS)

    2005-01-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objective is to introduce administrative simplification, especially in the licensing and declaration system for radiation sources, to reinforce the control measures provided for by the public health code and by the labour code, and to bring precision and complements to the wording of several already existing provisions. (N.C.)

  4. Genetic Algorithm-Based Identification of Fractional-Order Systems

    Directory of Open Access Journals (Sweden)

    Shengxi Zhou

    2013-05-01

    Fractional calculus has become an increasingly popular tool for modeling the complex behaviors of physical systems from diverse domains. One of the key issues in applying fractional calculus to engineering problems is the parameter identification of fractional-order systems. A time-domain identification algorithm based on a genetic algorithm (GA) is proposed in this paper. The multi-variable parameter identification is converted into a parameter optimization problem by applying the GA to the identification of fractional-order systems. To evaluate the identification accuracy and stability, the time-domain output error considering the condition variation is designed as the fitness function for parameter optimization. The identification process is established under various noise levels and excitation levels. The effects of external excitation and the noise level on the identification accuracy are analyzed in detail. The simulation results show that the proposed method can identify the parameters of both commensurate-rate and non-commensurate-rate fractional-order systems from noisy data. It is also observed that the excitation signal is an important factor influencing the identification accuracy of fractional-order systems.
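
    The two ingredients described above can be sketched, under simplifying assumptions, as follows: a single-term fractional-order system D^alpha y + a y = b u is simulated with the Grünwald-Letnikov approximation, and the time-domain output error serves as the fitness a GA would maximize (the GA loop itself is omitted).

```python
# Gruenwald-Letnikov simulation of D^alpha y + a y = b u and an output-error fitness.
import numpy as np

def simulate(alpha, a, b, u, h):
    """Simulate the fractional system on a grid of step h (zero initial conditions)."""
    n = len(u)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):                        # GL binomial weights, recursive form
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    y = np.zeros(n)
    ha = h ** (-alpha)
    for k in range(1, n):
        hist = np.dot(w[1:k + 1], y[k - 1::-1])  # sum_{j=1..k} w_j * y_{k-j}
        y[k] = (b * u[k] - ha * hist) / (ha + a)
    return y

def fitness(params, u, y_meas, h):
    """Negative sum of squared output errors (to be maximized by the GA)."""
    alpha, a, b = params
    return -np.sum((simulate(alpha, a, b, u, h) - y_meas) ** 2)

h = 0.01
t = np.arange(0.0, 5.0, h)
u = np.ones_like(t)                              # step excitation
y_meas = simulate(0.7, 1.0, 2.0, u, h)           # "measured" data from true parameters
print(fitness((0.7, 1.0, 2.0), u, y_meas, h))    # ~0 at the true parameters
print(fitness((0.9, 1.0, 2.0), u, y_meas, h))    # worse for a wrong order
```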

  5. Differential Space-Time Block Code Modulation for DS-CDMA Systems

    Directory of Open Access Journals (Sweden)

    Liu Jianhua

    2002-01-01

    A differential space-time block code (DSTBC) modulation scheme is used to improve the performance of DS-CDMA systems in fast time-dispersive fading channels. The resulting scheme is referred to as differential space-time block code modulation for DS-CDMA (DSTBC-CDMA) systems. The new modulation and demodulation schemes are especially studied for the down-link transmission of DS-CDMA systems. We present three demodulation schemes, referred to as the differential space-time block code Rake (D-Rake) receiver, the differential space-time block code deterministic (D-Det) receiver, and the differential space-time block code deterministic de-prefix (D-Det-DP) receiver, respectively. The D-Det receiver exploits the known information of the spreading sequences and their delayed paths deterministically besides the Rake type combination; consequently, it can outperform the D-Rake receiver, which employs the Rake type combination only. The D-Det-DP receiver avoids the effect of intersymbol interference and hence can offer better performance than the D-Det receiver.
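
    The differential encoding rule underlying such schemes can be sketched as below: pairs of QPSK symbols are packed into unitary Alamouti matrices C_k, and the transmitted 2x2 block follows the recursion S_k = S_{k-1} C_k, so that consecutive blocks carry the information without channel estimates. Spreading, chip-level processing and noise are omitted; only the algebraic identity exploited by differential detection is verified.

```python
# Differential Alamouti-type encoding and its noise-free detection identity.
import numpy as np

def alamouti(s1, s2):
    """Unitary Alamouti matrix built from two unit-energy symbols."""
    return np.array([[s1, s2], [-np.conj(s2), np.conj(s1)]]) / np.sqrt(2)

qpsk = np.exp(1j * np.pi / 4 * np.array([1, 3, 5, 7]))   # unit-energy QPSK alphabet
rng = np.random.default_rng(3)
symbols = rng.choice(qpsk, size=(5, 2))                   # 5 blocks of 2 symbols each

S = np.eye(2, dtype=complex)                              # reference (first) block
for s1, s2 in symbols:
    C = alamouti(s1, s2)
    S_next = S @ C                                        # differential encoding
    C_hat = S.conj().T @ S_next                           # what differential detection recovers
    assert np.allclose(C_hat, C)
    S = S_next
print("all information matrices recovered from consecutive blocks")
```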

  6. A study on the nuclear computer codes installation and management system

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Huh, Young Hwan; Kim, Hee Kyung; Kang, Byung Heon; Kim, Ko Ryeo; Suh, Soong Hyok; Choi, Young Gil; Lee, Jong Bok

    1990-12-01

    From 1987, a number of technology transfers related to nuclear power plants were performed from C-E for the YGN 3 and 4 construction. Among them, the installation and management of the computer codes for the YGN 3 and 4 fuel and nuclear steam supply system was one of the most important projects. The main objectives of this project are to establish the nuclear computer code management system, to develop QA procedures for nuclear codes, to secure the reliability of the nuclear codes, and to extend their technical applicability, including user-oriented utility programs for the nuclear codes. The work performed in this project during the year consisted of producing 215 transmittal packages for nuclear code installation, including backup magnetic tapes and microfiche for software quality assurance. Lastly, for easy reference to the nuclear code information, a list of code names and information on the codes introduced from C-E is presented. (Author)

  7. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests
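
    The inversion strategy described above can be pictured with the sketch below, using invented numbers: benchmark responses of two detectors, tabulated over (anomaly size, moisture content), are interpolated, and the combination whose predicted responses best match the measured responses in a least-squares sense is reported.

```python
# Library interpolation and closest-match search over two independent variables.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

size = np.array([0.0, 1.0, 2.0, 3.0])            # anomaly size [arbitrary units]
moisture = np.array([0.0, 5.0, 10.0, 20.0])      # moisture content [wt%]

# benchmark responses for two detectors over the (size, moisture) grid (made up)
resp_a = 1.0 + 0.10 * size[:, None] + 0.04 * moisture[None, :]
resp_b = 0.5 + 0.25 * size[:, None] + 0.01 * moisture[None, :]

interp_a = RegularGridInterpolator((size, moisture), resp_a)
interp_b = RegularGridInterpolator((size, moisture), resp_b)

measured = np.array([1.52, 1.08])                # measured responses of detectors A, B

# search a fine grid of candidate (size, moisture) pairs for the closest match
ss, mm = np.meshgrid(np.linspace(0, 3, 61), np.linspace(0, 20, 81), indexing="ij")
pts = np.column_stack([ss.ravel(), mm.ravel()])
err = (interp_a(pts) - measured[0]) ** 2 + (interp_b(pts) - measured[1]) ** 2
best = pts[np.argmin(err)]
print(f"best match: size = {best[0]:.2f}, moisture = {best[1]:.1f} wt%")
```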

  8. International outage coding system for nuclear power plants. Results of a co-ordinated research project

    International Nuclear Information System (INIS)

    2004-05-01

    The experience obtained in each individual plant constitutes the most relevant source of information for improving its performance. However, experience at the level of the utility, the country and worldwide is also extremely valuable, because there are limitations to what can be learned from in-house experience. But learning from the experience of others is admittedly difficult if the information is not harmonized. Therefore, reporting systems should be standardized and applicable to all types of reactors, satisfying the needs of the broad set of nuclear power plant operators worldwide and allowing experience to be shared internationally. To cope with the considerable amount of information gathered from nuclear power plants worldwide, it is necessary to codify the information, facilitating the identification of the causes of outages and of system or component failures. Therefore, the IAEA established a sponsored Co-ordinated Research Project (CRP) on the International Outage Coding System to develop a general, internationally applicable system of coding nuclear power plant outages, providing worldwide nuclear utilities with a standardized tool for reporting outage information. This TECDOC summarizes the results of this CRP and provides information for transformation of the historical outage data into the new coding system, taking into consideration the existing systems for coding nuclear power plant events (WANO, IAEA-IRS and IAEA PRIS) but avoiding duplication of efforts to the maximum possible extent

  9. Perspectives on the development of next generation reactor systems safety analysis codes

    International Nuclear Information System (INIS)

    Zhang, H.

    2015-01-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s', and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know how) in two phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  10. Perspectives on the development of next generation reactor systems safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-07-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s', and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know how) in two phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  11. Discovery of coding genetic variants influencing diabetes-related serum biomarkers and their impact on risk of type 2 diabetes

    DEFF Research Database (Denmark)

    Ahluwalia, Tarun Veer Singh; Allin, Kristine Højgaard; Sandholt, Camilla Helene

    2015-01-01

    CONTEXT: Type 2 diabetes (T2D) prevalence is spiraling globally, and knowledge of its pathophysiological signatures is crucial for a better understanding and treatment of the disease. OBJECTIVE: We aimed to discover underlying coding genetic variants influencing fasting serum levels of nine......-nucleotide polymorphisms and were tested for association with each biomarker. Identified loci were tested for association with T2D through a large-scale meta-analysis involving up to 17 024 T2D cases and up to 64 186 controls. RESULTS: We discovered 11 associations between single-nucleotide polymorphisms and five distinct......, of which the association with the CELSR2 locus has not been shown previously. CONCLUSION: The identified loci influence processes related to insulin signaling, cell communication, immune function, apoptosis, DNA repair, and oxidative stress, all of which could provide a rationale for novel diabetes...

  12. Invited commentary: genetic variants and individual- and societal-level risk factors.

    Science.gov (United States)

    Coughlin, Steven S

    2010-01-01

    Over the past decade, leading epidemiologists have noted the importance of social factors in studying and understanding the distribution and determinants of disease in human populations; but to what extent are epidemiologic studies integrating genetic information and other biologic variables with information about individual-level risk factors and group-level or societal factors related to the broader residential, behavioral, or cultural context? There remains a need to consider ways to integrate genetic information with social and contextual information in epidemiologic studies, partly to combat the overemphasis on the importance of genetic factors as determinants of disease in human populations. Even in genome-wide association studies of coronary heart disease and other common complex diseases, only a small proportion of heritability is explained by the genetic variants identified to date. It is possible that familial clustering due to genetic factors has been overestimated and that important environmental or social influences (acting alone or in combination with genetic variants) have been overlooked. The accompanying article by Bressler et al. (Am J Epidemiol. 2010;171(1):14-23) highlights some of these important issues.

  13. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model had been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements had also been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are observed to be in general agreement with the experimental data, within the acceptable limits of uncertainty

  14. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Del Valle G, E.

    2015-09-01

    The models developed for the Parcs and Trace codes for cycle 15 of Unit 1 of the Laguna Verde nuclear power plant are described. The first is focused on the neutronic simulation and the second on the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below the core. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is through two maps that allow their intercommunication. Both codes are used in coupled form, performing a dynamic simulation that allows a stable state to be obtained acceptably, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and the reactivities introduced by the moderator density, the fuel temperature and the total temperature are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results are explained for the power, the pressure in the steam dome and the water level in the downcomer, which show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)
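
    The exchange described above (nodal power passed to the thermal-hydraulic model; fuel temperature and moderator density mapped back to the neutronics model) can be illustrated with a toy fixed-point coupling loop. The sketch below is not TRACE/PARCS: the mapping matrices, feedback coefficients and solver models are invented purely for illustration.

    ```python
    import numpy as np

    # Toy mapping tables: 4 neutronic nodes <-> 2 thermal-hydraulic channels.
    N_TO_TH = np.array([[0.5, 0.5, 0.0, 0.0],      # channel power = average of its nodes
                        [0.0, 0.0, 0.5, 0.5]])
    TH_TO_N = np.array([[1.0, 0.0],                # each node reads back its channel state
                        [1.0, 0.0],
                        [0.0, 1.0],
                        [0.0, 1.0]])

    def neutronics(fuel_T, mod_rho):
        """Toy nodal power with made-up Doppler and moderator-density feedback."""
        base = np.array([1.1, 1.0, 0.9, 1.0])     # unperturbed power shape (a.u.)
        return base * (1.0 - 2e-4 * (fuel_T - 900.0) + 0.5 * (mod_rho - 0.74))

    def thermal_hydraulics(channel_power):
        """Toy channel response: fuel temperature (K) and moderator density (g/cm^3)."""
        fuel_T = 900.0 + 150.0 * (channel_power - 1.0)
        mod_rho = 0.74 - 0.02 * (channel_power - 1.0)
        return fuel_T, mod_rho

    # Fixed-point (Picard) iteration: exchange fields through the two maps until converged.
    fuel_T_n, mod_rho_n = np.full(4, 900.0), np.full(4, 0.74)
    for it in range(100):
        power_n = neutronics(fuel_T_n, mod_rho_n)
        fuel_T_ch, mod_rho_ch = thermal_hydraulics(N_TO_TH @ power_n)
        new_T, new_rho = TH_TO_N @ fuel_T_ch, TH_TO_N @ mod_rho_ch
        if np.max(np.abs(new_T - fuel_T_n)) < 1e-6:
            break
        fuel_T_n, mod_rho_n = new_T, new_rho

    print(f"converged in {it} iterations, nodal power: {np.round(power_n, 4)}")
    ```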

  15. A multi-level code for metallurgical effects in metal-forming processes

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, P.A.; Silling, S.A. [Sandia National Labs., Albuquerque, NM (United States). Computational Physics and Mechanics Dept.; Hughes, D.A.; Bammann, D.J.; Chiesa, M.L. [Sandia National Labs., Livermore, CA (United States)

    1997-08-01

    The authors present the final report on a Laboratory-Directed Research and Development (LDRD) project, A Multi-level Code for Metallurgical Effects in metal-Forming Processes, performed during the fiscal years 1995 and 1996. The project focused on the development of new modeling capabilities for simulating forging and extrusion processes that typically display phenomenology occurring on two different length scales. In support of model fitting and code validation, ring compression and extrusion experiments were performed on 304L stainless steel, a material of interest in DOE nuclear weapons applications.

  16. Coded aperture imaging system for nuclear fuel motion detection

    International Nuclear Information System (INIS)

    Stalker, K.T.; Kelly, J.G.

    1980-01-01

    A Coded Aperture Imaging System (CAIS) has been developed at Sandia National Laboratories to image the motion of nuclear fuel rods undergoing tests simulating accident conditions within a liquid metal fast breeder reactor. The tests require that the motion of the test fuel be monitored while it is immersed in a liquid sodium coolant, precluding the use of normal optical means of imaging. However, using the fission gamma rays emitted by the fuel itself and coded aperture techniques, images with 1.5 mm radial and 5 mm axial resolution have been attained. Using an electro-optical detection system coupled to a high-speed motion picture camera, a time resolution of one millisecond can be achieved. This paper will discuss the application of coded aperture imaging to the problem, including the design of the one-dimensional Fresnel zone plate apertures used and the special problems arising from the reactor environment and the use of high-energy gamma-ray photons to form the coded image. Also to be discussed are the reconstruction techniques employed and the effect of various noise sources on system performance. Finally, some experimental results obtained using the system will be presented
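
    The principle behind correlation-based reconstruction of a coded image can be shown in one dimension. The sketch below is only a schematic stand-in for the Sandia system: it uses a random binary mask instead of a Fresnel zone plate, and all sizes and noise levels are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # 1-D source distribution: two bright "fuel" segments on a dark background.
    n = 64
    source = np.zeros(n)
    source[20:24] = 1.0
    source[40:42] = 2.0

    # Random binary aperture (open fraction ~0.5); a real system would use an
    # optimized pattern such as a Fresnel zone plate or a uniformly redundant array.
    aperture = (rng.random(n) < 0.5).astype(float)

    def circ_conv(a, b):
        """Circular convolution via FFT."""
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    def circ_corr(a, b):
        """Circular cross-correlation via FFT."""
        return np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))

    # Detector record: every source point casts a shifted copy of the aperture shadow.
    shadow = circ_conv(source, aperture) + rng.normal(scale=0.05, size=n)

    # Correlation decoding with a balanced (+1/-1) mask: the mask/decoder correlation
    # is approximately delta-like, so the source is recovered up to small sidelobes.
    decoder = 2.0 * aperture - 1.0
    reconstruction = circ_corr(shadow, decoder)

    print("true peaks at", np.argsort(source)[-3:],
          "reconstructed peaks at", np.argsort(reconstruction)[-3:])
    ```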

  17. Genetic variants in long non-coding RNA MIAT contribute to risk of paranoid schizophrenia in a Chinese Han population.

    Science.gov (United States)

    Rao, Shu-Quan; Hu, Hui-Ling; Ye, Ning; Shen, Yan; Xu, Qi

    2015-08-01

    The heritability of schizophrenia has been reported to be as high as ~80%, but the contribution of identified genetic variants to this heritability remains to be estimated. Long non-coding RNAs (lncRNAs) are involved in multiple processes critical to normal cellular function, and dysfunction of the lncRNA MIAT may contribute to the pathophysiology of schizophrenia. However, genetic evidence for the involvement of lncRNAs in schizophrenia has not been documented. Here, we conducted a two-stage association analysis on 8 tag SNPs that cover the whole MIAT locus in two independent Han Chinese schizophrenia case-control cohorts (discovery sample from Shanxi Province: 1093 patients with paranoid schizophrenia and 1180 control subjects; replication cohort from Jilin Province: 1255 cases and 1209 healthy controls). In the discovery stage, a significant genetic association with paranoid schizophrenia was observed for rs1894720 (χ² = 74.20, P = 7.1E-18), whose minor allele (T) had an OR of 1.70 (95% CI = 1.50-1.91). This association was confirmed in the replication cohort (χ² = 22.66, P = 1.9E-06, OR = 1.32, 95% CI 1.18-1.49). In addition, a weak genotypic association was detected for rs4274 (χ² = 4.96, df = 2, P = 0.03); the AA carriers showed increased disease risk (OR = 1.30, 95% CI = 1.03-1.64). No significant association was found between any haplotype and paranoid schizophrenia. The present study showed that the lncRNA MIAT is a novel susceptibility gene for paranoid schizophrenia in the Chinese Han population. Considering that most lncRNAs are located in non-coding regions, our result may explain why most susceptibility loci for schizophrenia identified by genome-wide association studies lie outside coding regions. Copyright © 2015 Elsevier B.V. All rights reserved.
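
    The allelic association statistics quoted above (a chi-square on a 2 x 2 table, an odds ratio and its 95% confidence interval) can be reproduced with a few lines of code. The allele counts below are hypothetical placeholders, not the counts of the study.

    ```python
    import math
    from scipy.stats import chi2_contingency

    # Hypothetical minor-allele (T) and major-allele (G) counts in cases and controls;
    # the real per-allele counts are not given in the abstract.
    table = [[900, 1286],    # cases:    T, G   (2 * 1093 alleles)
             [700, 1660]]    # controls: T, G   (2 * 1180 alleles)

    chi2, p, dof, _ = chi2_contingency(table, correction=False)

    # Allelic odds ratio with a 95% confidence interval on the log scale.
    a, b = table[0]
    c, d = table[1]
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci = (math.exp(math.log(or_) - 1.96 * se_log_or),
          math.exp(math.log(or_) + 1.96 * se_log_or))

    print(f"chi2 = {chi2:.2f}, p = {p:.2e}, OR = {or_:.2f}, "
          f"95% CI = {ci[0]:.2f}-{ci[1]:.2f}")
    ```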

  18. Integrating bar-code devices with computerized MC and A systems

    International Nuclear Information System (INIS)

    Anderson, L.K.; Boor, M.G.; Hurford, J.M.

    1998-01-01

    Over the past seven years, Los Alamos National Laboratory developed several generations of computerized nuclear materials control and accountability (MC and A) systems for tracking and reporting the storage, movement, and management of nuclear materials at domestic and international facilities. During the same period, Oak Ridge National Laboratory was involved with automated data acquisition (ADA) equipment, including installation of numerous bar-code scanning stations at various facilities to serve as input devices to computerized systems. Bar-code readers, as well as other ADA devices, reduce input errors, provide faster input, and allow the capture of data in remote areas where workstations do not exist. Los Alamos National Laboratory and Oak Ridge National Laboratory teamed together to implement the integration of bar-code hardware technology with computerized MC and A systems. With the expertise of both sites, the two technologies were successfully merged with little difficulty. Bar-code input is now available with several functions of the MC and A systems: material movements within material balance areas (MBAs), material movements between MBAs, and physical inventory verification. This paper describes the various components required for the integration of these MC and A systems with the installed bar-code reader devices and the future directions for these technologies

  19. The CMS High Level Trigger System

    CERN Document Server

    Afaq, A; Bauer, G; Biery, K; Boyer, V; Branson, J; Brett, A; Cano, E; Carboni, A; Cheung, H; Ciganek, M; Cittolin, S; Dagenhart, W; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Kowalkowski, J; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sexton-Kennedy, E; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition (DAQ) System relies on a purely software driven High Level Trigger (HLT) to reduce the full Level-1 accept rate of 100 kHz to approximately 100 Hz for archiving and later offline analysis. The HLT operates on the full information of events assembled by an event builder collecting detector data from the CMS front-end systems. The HLT software consists of a sequence of reconstruction and filtering modules executed on a farm of O(1000) CPUs built from commodity hardware. This paper presents the architecture of the CMS HLT, which integrates the CMS reconstruction framework in the online environment. The mechanisms to configure, control, and monitor the Filter Farm and the procedures to validate the filtering code within the DAQ environment are described.

  20. Low-Frequency Synonymous Coding Variation in CYP2R1 Has Large Effects on Vitamin D Levels and Risk of Multiple Sclerosis.

    Science.gov (United States)

    Manousaki, Despoina; Dudding, Tom; Haworth, Simon; Hsu, Yi-Hsiang; Liu, Ching-Ti; Medina-Gómez, Carolina; Voortman, Trudy; van der Velde, Nathalie; Melhus, Håkan; Robinson-Cohen, Cassianne; Cousminer, Diana L; Nethander, Maria; Vandenput, Liesbeth; Noordam, Raymond; Forgetta, Vincenzo; Greenwood, Celia M T; Biggs, Mary L; Psaty, Bruce M; Rotter, Jerome I; Zemel, Babette S; Mitchell, Jonathan A; Taylor, Bruce; Lorentzon, Mattias; Karlsson, Magnus; Jaddoe, Vincent V W; Tiemeier, Henning; Campos-Obando, Natalia; Franco, Oscar H; Utterlinden, Andre G; Broer, Linda; van Schoor, Natasja M; Ham, Annelies C; Ikram, M Arfan; Karasik, David; de Mutsert, Renée; Rosendaal, Frits R; den Heijer, Martin; Wang, Thomas J; Lind, Lars; Orwoll, Eric S; Mook-Kanamori, Dennis O; Michaëlsson, Karl; Kestenbaum, Bryan; Ohlsson, Claes; Mellström, Dan; de Groot, Lisette C P G M; Grant, Struan F A; Kiel, Douglas P; Zillikens, M Carola; Rivadeneira, Fernando; Sawcer, Stephen; Timpson, Nicholas J; Richards, J Brent

    2017-08-03

    Vitamin D insufficiency is common, correctable, and influenced by genetic factors, and it has been associated with risk of several diseases. We sought to identify low-frequency genetic variants that strongly increase the risk of vitamin D insufficiency and tested their effect on risk of multiple sclerosis, a disease influenced by low vitamin D concentrations. We used whole-genome sequencing data from 2,619 individuals through the UK10K program and deep-imputation data from 39,655 individuals genotyped genome-wide. Meta-analysis of the summary statistics from 19 cohorts identified in CYP2R1 the low-frequency (minor allele frequency = 2.5%) synonymous coding variant g.14900931G>A (p.Asp120Asp) (rs117913124[A]), which conferred a large effect on 25-hydroxyvitamin D (25OHD) levels (-0.43 SD of standardized natural log-transformed 25OHD per A allele; p value = 1.5 × 10 -88 ). The effect on 25OHD was four times larger and independent of the effect of a previously described common variant near CYP2R1. By analyzing 8,711 individuals, we showed that heterozygote carriers of this low-frequency variant have an increased risk of vitamin D insufficiency (odds ratio [OR] = 2.2, 95% confidence interval [CI] = 1.78-2.78, p = 1.26 × 10 -12 ). Individuals carrying one copy of this variant also had increased odds of multiple sclerosis (OR = 1.4, 95% CI = 1.19-1.64, p = 2.63 × 10 -5 ) in a sample of 5,927 case and 5,599 control subjects. In conclusion, we describe a low-frequency CYP2R1 coding variant that exerts the largest effect upon 25OHD levels identified to date in the general European population and implicates vitamin D in the etiology of multiple sclerosis. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  1. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods
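
    Monte Carlo integration of a point kernel, the second technique mentioned above, can be sketched as follows. The geometry, attenuation coefficient and source strength are illustrative only and are unrelated to the phantom code system described in the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative problem: uniform cylindrical source (radius R, height H) in a
    # homogeneous medium with attenuation coefficient mu; detector point on the axis
    # at distance D above the cylinder top. All values are made up.
    R, H, D = 10.0, 20.0, 30.0
    mu = 0.07          # linear attenuation coefficient, 1/cm
    S_v = 1.0e6        # volumetric source strength, photons / (cm^3 s)

    def sample_points(n):
        """Uniform sampling inside the cylinder (r ~ sqrt(u) gives uniform area)."""
        r = R * np.sqrt(rng.random(n))
        phi = 2 * np.pi * rng.random(n)
        z = H * rng.random(n)
        return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

    n = 200_000
    pts = sample_points(n)
    det = np.array([0.0, 0.0, H + D])

    # Point kernel exp(-mu r) / (4 pi r^2), integrated over the source volume.
    dist = np.linalg.norm(pts - det, axis=1)
    kernel = np.exp(-mu * dist) / (4 * np.pi * dist**2)
    volume = np.pi * R**2 * H
    flux = S_v * volume * kernel.mean()                     # Monte Carlo estimate
    stderr = S_v * volume * kernel.std(ddof=1) / np.sqrt(n)

    print(f"uncollided flux at detector: {flux:.3e} +/- {stderr:.3e} photons/cm^2/s")
    ```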

  2. LOLA SYSTEM: A code block for nodal PWR simulation. Part. II - MELON-3, CONCON and CONAXI Codes

    International Nuclear Information System (INIS)

    Aragones, J. M.; Ahnert, C.; Gomez Santamaria, J.; Rodriguez Olabarria, I.

    1985-01-01

    Description of the theory and users manual of the MELON-3, CONCON and CONAXI codes, which are part of the core calculation system based on nodal theory in one group, called LOLA SYSTEM. These auxiliary codes provide some of the input data for the main module SIMULA-3, namely the reactivity correlation constants, the albedos and the transport factors. (Author) 7 refs

  3. Measurements of stimulated-Raman-scattering-induced tilt in spectral-amplitude-coding optical code-division multiple-access systems

    Science.gov (United States)

    Al-Qazwini, Zaineb A. T.; Abdullah, Mohamad K.; Mokhtar, Makhfudzah B.

    2009-01-01

    We measure the stimulated Raman scattering (SRS)-induced tilt in spectral-amplitude-coding optical code-division multiple-access (SAC-OCDMA) systems as a function of the main system parameters (transmission distance, power per chip, and number of users) via computer simulations. The results show that the SRS-induced tilt increases significantly as the transmission distance, power per chip, or number of users grows.

  4. LOLA SYSTEM: A code block for nodal PWR simulation. Part. II - MELON-3, CONCON and CONAXI Codes

    Energy Technology Data Exchange (ETDEWEB)

    Aragones, J M; Ahnert, C; Gomez Santamaria, J; Rodriguez Olabarria, I

    1985-07-01

    Description of the theory and users manual of the MELON-3, CONCON and CONAXI codes, which are part of the core calculation system based on nodal theory in one group, called LOLA SYSTEM. These auxiliary codes provide some of the input data for the main module SIMULA-3, namely the reactivity correlation constants, the albedos and the transport factors. (Author) 7 refs.

  5. RCC-E a Design Code for I and C and Electrical Systems

    International Nuclear Information System (INIS)

    Haure, J.M.

    2015-01-01

    The paper deals with the stakes and strengths of the RCC-E code applicable to electrical and instrumentation and control (I and C) systems and components performing safety class functions. The document interlaces specifications among owners, safety authorities, designers and suppliers, together with IAEA safety guides and IEC standards. The code is periodically updated and published by the French Society for Design and Construction Rules for Nuclear Island Components (AFCEN). The code is compliant with third-generation PWR nuclear islands and aims to suit national regulations, as needed, through a companion document. The feedback experience from Fukushima and the licensing of the UK EPR in the framework of the Generic Design Assessment are lessons learnt that should be considered in upgrading the code. The code gathers a set of requirements and relevant good practices from several PWR design and construction projects related to electrical and I and C systems and components, together with electrical engineering documents dealing with systems, equipment and layout designs. A comprehensive statement, including some recent developments, is provided about: - offsite and onsite source requirements, including sources dealing with the total loss of offsite and main onsite sources; - highlights of a relevant protection level against high-frequency disturbances emitted by lightning strokes; - interface data used by any supplier or designer, such as site data, room temperatures, equipment maximum design temperature, alternating-current and direct-current electrical network voltage and frequency variation ranges, and environmental conditions decoupling data; - the Environmental Qualification process, including normal, mild (earthquake-resistant), harsh and severe accident ambient conditions. A tailor-made approach based on families, which are defined as a combination of mission time, duration and abnormal conditions (pressure, temperature, radiation), enables better coping with Environmental Qualifications

  6. Genetically determined angiotensin converting enzyme level and myocardial tolerance to ischemia

    OpenAIRE

    Messadi, Erij; Vincent, Marie-Pascale; Griol-Charhbili, Violaine; Mandet, Chantal; Colucci, Juliana; Krege, John H.; Bruneval, Patrick; Bouby, Nadine; Smithies, Oliver; Alhenc-Gelas, François; Richer, Christine

    2010-01-01

    Angiotensin I-converting enzyme (ACE; kininase II) levels in humans are genetically determined. ACE levels have been linked to risk of myocardial infarction, but the association has been inconsistent, and the causality underlying it remains undocumented. We tested the hypothesis that genetic variation in ACE levels influences myocardial tolerance to ischemia. We studied ischemia-reperfusion injury in mice bearing 1 (ACE1c), 2 (ACE2c, wild type), or 3 (ACE3c) functional copies of the ACE gene ...

  7. FAST: An advanced code system for fast reactor transient analysis

    International Nuclear Information System (INIS)

    Mikityuk, Konstantin; Pelloni, Sandro; Coddington, Paul; Bubelis, Evaldas; Chawla, Rakesh

    2005-01-01

    One of the main goals of the FAST project at PSI is to establish a unique analytical code capability for the core and safety analysis of advanced critical (and sub-critical) fast-spectrum systems for a wide range of different coolants. Both static and transient core physics, as well as the behaviour and safety of the power plant as a whole, are studied. The paper discusses the structure of the code system, including the organisation of the interfaces and data exchange. Examples of validation and application of the individual programs, as well as of the complete code system, are provided using studies carried out within the context of designs for experimental accelerator-driven, fast-spectrum systems

  8. MatLab script to C code converter for embedded processors of FLASH LLRF control system

    Science.gov (United States)

    Bujnowski, K.; Siemionczyk, A.; Pucyk, P.; Szewiński, J.; Pożniak, K. T.; Romaniuk, R. S.

    2008-01-01

    The low-level RF control system (LLRF) of the FEL serves to stabilize the electromagnetic (EM) field in the superconducting niobium resonant microwave cavities and to control the high-power (MW) klystron. The LLRF system of the FLASH accelerator is based on FPGA technology and embedded microprocessors. The basic and auxiliary functions of the system are listed, as well as the algorithms used for superconducting cavity parameter identification. These algorithms were prepared originally in Matlab. The main part of the paper presents the implementation of the cavity parameter identification algorithm in a PowerPC processor embedded in the FPGA circuit VirtexIIPro. The construction of a very compact Matlab-script-to-C-code converter, referred to as M2C, is presented. The application is designed specifically for embedded systems with very confined resources. The generated code is optimized for size. The code should be transferable between different hardware platforms. The converter generates code for Linux and for stand-alone applications. The functional structure of the program and the way it operates are described. The FLEX and BISON tools were used for the construction of the converter. The paper concludes with an example of the M2C application to convert a complex identification algorithm for superconducting cavities in the FLASH laser.
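
    As a flavour of what a MATLAB-script-to-C translation involves, the toy sketch below maps a tiny subset of MATLAB (scalar assignments, comments, the power operator, pi) onto C statements. It is a regex-based illustration only, not the FLEX/BISON-based M2C converter described in the record.

    ```python
    import re

    def matlab_stmt_to_c(line: str) -> str:
        """Translate a tiny subset of MATLAB (scalar assignments, % comments, the ^
        power operator, pi) into C. Purely illustrative -- not the real M2C tool."""
        line = line.strip()
        if not line:
            return ""
        if line.startswith("%"):                        # MATLAB comment -> C comment
            return "// " + line.lstrip("% ")
        line = line.rstrip(";")
        line = re.sub(r"\bpi\b", "M_PI", line)          # MATLAB pi -> math.h constant
        # a ^ b  ->  pow(a, b)   (simple, non-nested operands only)
        line = re.sub(r"(\w+(?:\.\w+)?)\s*\^\s*(\w+(?:\.\w+)?)", r"pow(\1, \2)", line)
        m = re.match(r"(\w+)\s*=\s*(.+)", line)         # assume assigned scalars are doubles
        if m:
            return f"double {m.group(1)} = {m.group(2)};"
        return line + ";"

    matlab_src = [
        "% cavity detuning from two consecutive samples",
        "tau = 2*pi*f_half;",
        "dw = (y2 - y1) / dt;",
        "g = A ^ 2;",
    ]
    for stmt in matlab_src:
        print(matlab_stmt_to_c(stmt))
    ```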

  9. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance of documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting. We implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before intervention (FY2012) to prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management codes (E&M) as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05 (p coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher complexity E&M coding in a stable patient population based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Pictorial AR Tag with Hidden Multi-Level Bar-Code and Its Potential Applications

    Directory of Open Access Journals (Sweden)

    Huy Le

    2017-09-01

    Full Text Available For decades, researchers have been trying to create intuitive virtual environments by blending reality and virtual reality, thus enabling general users to interact with the digital domain as easily as with the real world. The result is “augmented reality” (AR). AR seamlessly superimposes virtual objects onto a real environment in three dimensions (3D) and in real time. One of the most important parts that helps close the gap between virtuality and reality is the marker used in the AR system. While the pictorial marker and the bar-code marker are the two most commonly used marker types in the market, they have some disadvantages in visual and processing performance. In this paper, we present a novel method that combines the bar-code with the original features of a colour picture (e.g., photos, trading cards, advertisement figures). Our method decorates the original pictorial image with an additional single stereogram image that optically conceals a multi-level (3D) bar-code. Thus, it has a larger capacity for storing data compared to the general 1D barcode. This new type of marker has the potential to address the issues that the current types of marker are facing. It not only keeps the original information of the picture but also contains encoded numeric information. In our limited evaluation, this pictorial bar-code shows relatively robust performance under various conditions and scaling; thus, it provides a promising AR approach to be used in many applications such as trading card games, education, and advertisements.

  11. A fuzzy genetic approach for network reconfiguration to enhance voltage stability in radial distribution systems

    International Nuclear Information System (INIS)

    Sahoo, N.C.; Prasad, K.

    2006-01-01

    This paper presents a fuzzy genetic approach for reconfiguration of radial distribution systems (RDS) so as to maximize the voltage stability of the network for a specific set of loads. The network reconfiguration involves a mechanism for selection of the best set of branches to be opened, one from each loop, such that the reconfigured RDS possesses desired performance characteristics. This discrete solution space is better handled by the proposed scheme, which maximizes a suitable optimizing function (computed using two different approaches). In the first approach, this function is chosen as the average of a voltage stability index of all the buses in the RDS, while in the second approach, the complete RDS is reduced to a two bus equivalent system and the optimizing function is the voltage stability index of this reduced two bus system. The fuzzy genetic algorithm uses a suitable coding and decoding scheme for maintaining the radial nature of the network at every stage of genetic evolution, and it also uses a fuzzy rule based mutation controller for efficient search of the solution space. This method, tested on 69 bus and 33 bus RDSs, shows promising results for the both approaches. It is also observed that the network losses are reduced when the voltage stability is enhanced by the network reconfiguration

  12. Learning of spatio-temporal codes in a coupled oscillator system.

    Science.gov (United States)

    Orosz, Gábor; Ashwin, Peter; Townley, Stuart

    2009-07-01

    In this paper, we consider a learning strategy that allows one to transmit information between two coupled phase oscillator systems (called teaching and learning systems) via frequency adaptation. The dynamics of these systems can be modeled with reference to a number of partially synchronized cluster states and transitions between them. Forcing the teaching system by steady but spatially nonhomogeneous inputs produces cyclic sequences of transitions between the cluster states, that is, information about inputs is encoded via a "winnerless competition" process into spatio-temporal codes. The large variety of codes can be learned by the learning system that adapts its frequencies to those of the teaching system. We visualize the dynamics using "weighted order parameters (WOPs)" that are analogous to "local field potentials" in neural systems. Since spatio-temporal coding is a mechanism that appears in olfactory systems, the developed learning rules may help to extract information from these neural ensembles.
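
    A minimal sketch of the teaching/learning idea, with Kuramoto-type phase oscillators, frequency adaptation of the learner towards the teacher, and a weighted order parameter as readout, is given below. All equations, rates and sizes are schematic assumptions, not the model of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, dt, steps = 5, 0.01, 20_000
    k_couple, eps = 1.5, 0.05                      # coupling and adaptation rates (made up)

    omega_teach = np.array([1.0, 1.2, 0.8, 1.5, 0.9])   # "inputs" encoded in frequencies
    omega_learn = rng.uniform(0.5, 1.5, n)               # learner starts with random frequencies
    theta_t = rng.uniform(0, 2 * np.pi, n)
    theta_l = rng.uniform(0, 2 * np.pi, n)

    def kuramoto_step(theta, omega):
        """One Euler step of globally coupled phase oscillators."""
        coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
        return theta + dt * (omega + k_couple * coupling)

    for _ in range(steps):
        theta_t = kuramoto_step(theta_t, omega_teach)
        theta_l = kuramoto_step(theta_l, omega_learn)
        # frequency adaptation: each learner oscillator tracks its teacher counterpart
        omega_learn += dt * eps * np.sin(theta_t - theta_l)

    # Weighted order parameter: magnitude of a weighted sum of unit phasors.
    w = np.linspace(1.0, 2.0, n)
    wop = np.abs(np.sum(w * np.exp(1j * theta_l))) / w.sum()
    print("learned frequencies:", np.round(omega_learn, 3), "WOP:", round(float(wop), 3))
    ```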

  13. Genetic Programming for Sea Level Predictions in an Island Environment

    Directory of Open Access Journals (Sweden)

    M.A. Ghorbani

    2010-03-01

    Full Text Available Accurate predictions of sea level are important for geodetic applications, navigation, and coastal, industrial and tourist activities. In the current work, Genetic Programming (GP) and artificial neural networks (ANNs) were applied to forecast half-daily and daily sea-level variations from 12 hours to 5 days ahead. The measurements at the Cocos (Keeling) Islands in the Indian Ocean were used for training and testing of the employed artificial intelligence techniques. A comparison was performed of the predictions from the GP model and the ANN simulations. Based on the comparison outcomes, it was found that the Genetic Programming approach can be successfully employed in forecasting sea-level variations.
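
    A GP forecast of this kind can be sketched with the gplearn library (an assumption; the paper does not name its implementation), fitting a symbolic regressor on lagged values to predict one half-daily step ahead. The tidal-like series below is synthetic and merely stands in for the Cocos (Keeling) Islands record.

    ```python
    import numpy as np
    from gplearn.genetic import SymbolicRegressor   # assumes gplearn is installed

    # Synthetic stand-in for a half-daily sea-level record (semi-diurnal tide + noise).
    rng = np.random.default_rng(3)
    t = np.arange(2000)                              # half-daily samples
    level = (0.6 * np.sin(2 * np.pi * t / 2.03)
             + 0.2 * np.sin(2 * np.pi * t / 59)
             + 0.05 * rng.normal(size=t.size))

    # Lagged inputs: predict the value one step (12 hours) ahead from the last 4 samples.
    lags = 4
    X = np.column_stack([level[i:len(level) - lags + i] for i in range(lags)])
    y = level[lags:]
    X_train, X_test, y_train, y_test = X[:1500], X[1500:], y[:1500], y[1500:]

    gp = SymbolicRegressor(population_size=1000, generations=20,
                           function_set=('add', 'sub', 'mul', 'div', 'sin'),
                           parsimony_coefficient=0.001, random_state=0)
    gp.fit(X_train, y_train)

    rmse = np.sqrt(np.mean((gp.predict(X_test) - y_test) ** 2))
    print("evolved expression:", gp._program)
    print("12-hour-ahead RMSE:", round(float(rmse), 4))
    ```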

  14. Control code for laboratory adaptive optics teaching system

    Science.gov (United States)

    Jin, Moonseob; Luder, Ryan; Sanchez, Lucas; Hart, Michael

    2017-09-01

    By sensing and compensating wavefront aberration, adaptive optics (AO) systems have proven themselves crucial in large astronomical telescopes, retinal imaging, and holographic coherent imaging. Commercial AO systems for laboratory use are now available in the market. One such is the ThorLabs AO kit built around a Boston Micromachines deformable mirror. However, there are limitations in applying these systems to research and pedagogical projects since the software is written with limited flexibility. In this paper, we describe a MATLAB-based software suite to interface with the ThorLabs AO kit by using the MATLAB Engine API and Visual Studio. The software is designed to offer complete access to the wavefront sensor data, through the various levels of processing, to the command signals to the deformable mirror and fast steering mirror. In this way, through a MATLAB GUI, an operator can experiment with every aspect of the AO system's functioning. This is particularly valuable for tests of new control algorithms as well as to support student engagement in an academic environment. We plan to make the code freely available to the community.

  15. Management of genetic resources in the nursery system of wild cherry (Prunus avium L.

    Directory of Open Access Journals (Sweden)

    Proietti R

    2006-01-01

    Full Text Available Knowledge of the genetic and adaptive traits of the reproductive materials used in the nursery system of wild cherry could be a useful instrument to improve the ecological and economic sustainability of plantation ecosystems. This work reports results from a research project whose objectives were: (1) to study the genetic variation of a Prunus avium L. population used for seed harvesting, through its multi-locus genotypes detected by starch gel electrophoresis; (2) to analyze the level of genetic variation within and among different steps of a commercial nursery system (basic population and sub-populations, seedlings aged S1T1 and S1T2, plantation). Results showed low genetic variation levels in the basic population, similar to a reference system of 12 other Italian wild cherry populations and to other French and Caucasian materials. The genetic distances between Monte Baldo and some closer Lombardy provenances (Area Garda, Bosco Fontana, Valtellina) were smaller than those to the Venice Region populations (Monti Lessini and Asiago). The number of alleles and the percentage of polymorphic loci within the Monte Baldo provenance and the multiplication materials were similar, whilst a variable value of Fis was noted. Indeed, along the nursery system until the plantation, heterozygosity initially (S1T1) increased, then decreased proceeding to the plantation. This fluctuation of Fis values could be determined by seed lots initially characterized by higher levels of variation, due to self-incompatibility. In the following steps, a possible selection pressure can randomly affect the genotypic structure of wild cherry by increasing homozygosity. There is no well-defined geographic characterization among populations, as suggested by the genetic distances; therefore, homogeneous seed harvest could be established over an area larger than geographic and administrative borders. In this way we could obtain reproductive material with a wide genetic base and environmental adaptability. To
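
    The heterozygosity fluctuation reported above is usually summarized with Wright's fixation index, Fis = 1 - Ho/He. The sketch below computes it from genotype counts at a single isozyme locus; the counts are made up for illustration.

    ```python
    # Hypothetical genotype counts at one isozyme locus with alleles A and B.
    counts = {"AA": 34, "AB": 51, "BB": 15}

    n = sum(counts.values())
    p = (2 * counts["AA"] + counts["AB"]) / (2 * n)      # frequency of allele A
    q = 1.0 - p

    ho = counts["AB"] / n                # observed heterozygosity
    he = 2 * p * q                       # expected (Hardy-Weinberg) heterozygosity
    fis = 1.0 - ho / he                  # Wright's inbreeding coefficient

    print(f"p = {p:.3f}, Ho = {ho:.3f}, He = {he:.3f}, Fis = {fis:+.3f}")
    ```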

  16. Mercure IV code application to the external dose computation from low and medium level wastes

    International Nuclear Information System (INIS)

    Tomassini, T.

    1985-01-01

    In the present work the external dose from low- and medium-level wastes is calculated using the MERCURE IV code. The code uses the Monte Carlo method for integrating multigroup line-of-sight attenuation kernels

  17. Food Image Recognition via Superpixel Based Low-Level and Mid-Level Distance Coding for Smart Home Applications

    Directory of Open Access Journals (Sweden)

    Jiannan Zheng

    2017-05-01

    Full Text Available Food image recognition is a key enabler for many smart home applications such as the smart kitchen and the smart personal nutrition log. In order to improve living experience and life quality, smart home systems collect valuable insights into users’ preferences, nutrition intake and health conditions via accurate and robust food image recognition. In addition, efficiency is also a major concern, since many smart home applications are deployed on mobile devices where high-end GPUs are not available. In this paper, we investigate compact and efficient food image recognition methods, namely low-level and mid-level approaches. Considering the real application scenario where only limited and noisy data are available, we first propose a superpixel-based Linear Distance Coding (LDC) framework in which distinctive low-level food image features are extracted to improve performance. On a challenging small food image dataset where only 12 training images are available per category, our framework has shown superior performance in both accuracy and robustness. In addition, to better model deformable food part distributions, we extend LDC’s feature-to-class distance idea and propose a mid-level superpixel food-parts-to-class distance mining framework. The proposed framework shows superior performance on benchmark food image datasets compared to other low-level and mid-level approaches in the literature.

  18. Identity recognition in response to different levels of genetic relatedness in commercial soya bean

    Science.gov (United States)

    Van Acker, Rene; Rajcan, Istvan; Swanton, Clarence J.

    2017-01-01

    Identity recognition systems allow plants to tailor competitive phenotypes in response to the genetic relatedness of neighbours. There is limited evidence for the existence of recognition systems in crop species and whether they operate at a level that would allow for identification of different degrees of relatedness. Here, we test the responses of commercial soya bean cultivars to neighbours of varying genetic relatedness consisting of other commercial cultivars (intraspecific), its wild progenitor Glycine soja, and another leguminous species Phaseolus vulgaris (interspecific). We found, for the first time to our knowledge, that a commercial soya bean cultivar, OAC Wallace, showed identity recognition responses to neighbours at different levels of genetic relatedness. OAC Wallace showed no response when grown with other commercial soya bean cultivars (intra-specific neighbours), showed increased allocation to leaves compared with stems with wild soya beans (highly related wild progenitor species), and increased allocation to leaves compared with stems and roots with white beans (interspecific neighbours). Wild soya bean also responded to identity recognition but these responses involved changes in biomass allocation towards stems instead of leaves suggesting that identity recognition responses are species-specific and consistent with the ecology of the species. In conclusion, elucidating identity recognition in crops may provide further knowledge into mechanisms of crop competition and the relationship between crop density and yield. PMID:28280587

  19. Analysis of the KUCA MEU experiments using the ANL code system

    Energy Technology Data Exchange (ETDEWEB)

    Shiroya, S.; Hayashi, M.; Kanda, K.; Shibata, T.; Woodruff, W.L.; Matos, J.E.

    1982-01-01

    This paper provides some preliminary results on the analysis of the KUCA critical experiments using the ANL code system. Since this system was employed in the earlier neutronics calculations for the KUHFR, it is important to assess its capabilities for the KUHFR. The KUHFR has a unique core configuration which is difficult to model precisely with current diffusion theory codes. This paper also provides some results from a finite-element diffusion code (2D-FEM-KUR), which was developed in a cooperative research program between KURRI and JAERI. This code provides the capability to mock up a complex core configuration such as that of the KUHFR. Using the same group constants generated by the EPRI-CELL code, the results of the 2D-FEM-KUR code are compared with those of the finite-difference diffusion code DIF3D (2D), which is mainly employed in this analysis.

  20. Analysis of the KUCA MEU experiments using the ANL code system

    International Nuclear Information System (INIS)

    Shiroya, S.; Hayashi, M.; Kanda, K.; Shibata, T.; Woodruff, W.L.; Matos, J.E.

    1982-01-01

    This paper provides some preliminary results on the analysis of the KUCA critical experiments using the ANL code system. Since this system was employed in the earlier neutronics calculations for the KUHFR, it is important to assess its capabilities for the KUHFR. The KUHFR has a unique core configuration which is difficult to model precisely with current diffusion theory codes. This paper also provides some results from a finite-element diffusion code (2D-FEM-KUR), which was developed in a cooperative research program between KURRI and JAERI. This code provides the capability to mock up a complex core configuration such as that of the KUHFR. Using the same group constants generated by the EPRI-CELL code, the results of the 2D-FEM-KUR code are compared with those of the finite-difference diffusion code DIF3D (2D), which is mainly employed in this analysis

  1. Development of the versatile reactor analysis code system, MARBLE2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Jin, Tomoyuki; Hazama, Taira; Hirai, Yasushi

    2015-07-01

    The second version of the versatile reactor analysis code system, MARBLE2, has been developed. Many new functions have been added in MARBLE2 using the base technology developed in the first version (MARBLE1). By incorporating the remaining functions of the conventional code system (JOINT-FR and SAGEP-FR), MARBLE2 enables one to execute almost all analysis functions of the conventional code system with the unified user interfaces of its subsystem, SCHEME. In particular, the sensitivity analysis functionality is available in MARBLE2. On the other hand, new built-in solvers have been developed, and existing ones have been upgraded. Furthermore, some other analysis codes and libraries developed at JAEA have been consolidated and prepared in SCHEME. In addition, several analysis codes developed at other institutes have been introduced as plug-in solvers. Consequently, gamma-ray transport calculation and heating evaluation become available. As for the other subsystem, ORPHEUS, various functionality updates and speed-up techniques have been applied based on user experience with MARBLE1 to enhance its usability. (author)

  2. Genetic Local Search for Optimum Multiuser Detection Problem in DS-CDMA Systems

    Science.gov (United States)

    Wang, Shaowei; Ji, Xiaoyong

    Optimum multiuser detection (OMD) in direct-sequence code-division multiple-access (DS-CDMA) systems is an NP-complete problem. In this paper, we present a genetic local search algorithm, which consists of an evolution strategy framework and a local improvement procedure. The evolution strategy searches only the space of feasible, locally optimal solutions. A fast iterated local search algorithm, which exploits the specific characteristics of the OMD problem, produces local optima with great efficiency. Computer simulations show that the bit error rate (BER) performance of the GLS outperforms other multiuser detectors in all cases discussed. The computation time is of polynomial complexity in the number of users.
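
    The local improvement step can be illustrated on the standard OMD objective, maximizing Omega(b) = 2 b'y - b'Rb over b in {-1, +1}^K, where y is the matched-filter output vector and R the signature correlation matrix. The bit-flip search with random restarts below is a simplified stand-in for the evolution-strategy framework of the paper; signatures and noise are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    K = 8                                            # number of users

    # Toy synchronous CDMA model: random signatures, correlation matrix R, noisy outputs y.
    S = rng.choice([-1.0, 1.0], size=(31, K)) / np.sqrt(31)   # length-31 signatures
    R = S.T @ S
    b_true = rng.choice([-1.0, 1.0], size=K)
    y = R @ b_true + 0.3 * rng.normal(size=K)        # matched-filter outputs with noise

    def omega(b):
        """Optimum-multiuser-detection log-likelihood metric (to be maximized)."""
        return 2.0 * b @ y - b @ R @ b

    def local_search(b):
        """Greedy bit-flip improvement until no single flip increases the metric."""
        improved = True
        while improved:
            improved = False
            for k in range(K):
                b_try = b.copy()
                b_try[k] *= -1.0
                if omega(b_try) > omega(b):
                    b, improved = b_try, True
        return b

    # Multi-start restarts stand in for the evolution-strategy framework of the paper.
    best = max((local_search(rng.choice([-1.0, 1.0], size=K)) for _ in range(20)),
               key=omega)
    print("bit errors vs. transmitted bits:", int(np.sum(best != b_true)))
    ```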

  3. High-Speed Turbo-TCM-Coded Orthogonal Frequency-Division Multiplexing Ultra-Wideband Systems

    Directory of Open Access Journals (Sweden)

    Wang Yanxia

    2006-01-01

    Full Text Available One of the UWB proposals in the IEEE P802.15 WPAN project is to use a multiband orthogonal frequency-division multiplexing (OFDM) system and punctured convolutional codes for UWB channels supporting a data rate up to 480 Mbps. In this paper, we improve the proposed system using turbo TCM with a QAM constellation for higher data rate transmission. We construct a punctured parity-concatenated trellis code, in which a TCM code is used as the inner code and a simple parity-check code is employed as the outer code. The result shows that the system can offer a much higher spectral efficiency, for example, 1.2 Gbps, which is 2.5 times higher than that of the proposed system. We identify several essential requirements to achieve the high rate transmission, for example, frequency and time diversity and multilevel error protection. Results are confirmed by density evolution.

  4. Levels of Evidence: Cancer Genetics Studies (PDQ®)—Health Professional Version

    Science.gov (United States)

    Levels of Evidence for Cancer Genetics Studies addresses the process and challenges of developing evidence-based summaries. Get information about how to weigh the strength of the evidence from cancer genetics studies in this summary for clinicians.

  5. The research of atmospheric 2D optical PPM CDMA system with turbo coding

    Science.gov (United States)

    Zhou, Xiuli; Li, Zaoxia

    2007-11-01

    Atmospheric two-dimensional optical code-division multiple-access (CDMA) systems using pulse-position modulation (PPM) and turbo coding are presented. We analyzed the bit-error rate (BER) of the proposed system, considering the effects of scintillation, avalanche photodiode noise, thermal noise, and multi-user interference. We showed that the atmospheric two-dimensional (2D) optical PPM CDMA system can realize high-speed communications when the logarithmic variance of the scintillation is less than 0.1, and that the turbo-coded atmospheric optical CDMA system has better BER performance than the atmospheric optical PPM CDMA system without turbo coding. We also showed that the turbo-coded system has better performance than the multi-user detection system.

  6. HYDROCOIN [HYDROlogic COde INtercomparison] Level 1: Benchmarking and verification test results with CFEST [Coupled Fluid, Energy, and Solute Transport] code: Draft report

    International Nuclear Information System (INIS)

    Yabusaki, S.; Cole, C.; Monti, A.M.; Gupta, S.K.

    1987-04-01

    Part of the safety analysis is evaluating groundwater flow through the repository and the host rock to the accessible environment by developing mathematical or analytical models and numerical computer codes describing the flow mechanisms. This need led to the establishment of an international project called HYDROCOIN (HYDROlogic COde INtercomparison) organized by the Swedish Nuclear Power Inspectorate, a forum for discussing techniques and strategies in subsurface hydrologic modeling. The major objective of the present effort, HYDROCOIN Level 1, is determining the numerical accuracy of the computer codes. The definition of each case includes the input parameters, the governing equations, the output specifications, and the format. The Coupled Fluid, Energy, and Solute Transport (CFEST) code was applied to solve cases 1, 2, 4, 5, and 7; the Finite Element Three-Dimensional Groundwater (FE3DGW) Flow Model was used to solve case 6. Case 3 has been ignored because unsaturated flow is not pertinent to SRP. This report presents the Level 1 results furnished by the project teams. The numerical accuracy of the codes is determined by (1) comparing the computational results with analytical solutions for cases that have analytical solutions (namely cases 1 and 4), and (2) intercomparing results from codes for cases which do not have analytical solutions (cases 2, 5, 6, and 7). Cases 1, 2, 6, and 7 relate to flow analyses, whereas cases 4 and 5 require nonlinear solutions. 7 refs., 71 figs., 9 tabs

  7. An RNA Phage Lab: MS2 in Walter Fiers' laboratory of molecular biology in Ghent, from genetic code to gene and genome, 1963-1976.

    Science.gov (United States)

    Pierrel, Jérôme

    2012-01-01

    The importance of viruses as model organisms is well-established in molecular biology and Max Delbrück's phage group set standards in the DNA phage field. In this paper, I argue that RNA phages, discovered in the 1960s, were also instrumental in the making of molecular biology. As part of experimental systems, RNA phages stood for messenger RNA (mRNA), genes and genome. RNA was thought to mediate information transfers between DNA and proteins. Furthermore, RNA was more manageable at the bench than DNA due to the availability of specific RNases, enzymes used as chemical tools to analyse RNA. Finally, RNA phages provided scientists with a pure source of mRNA to investigate the genetic code, genes and even a genome sequence. This paper focuses on Walter Fiers' laboratory at Ghent University (Belgium) and their work on the RNA phage MS2. When setting up his Laboratory of Molecular Biology, Fiers planned a comprehensive study of the virus with a strong emphasis on the issue of structure. In his lab, RNA sequencing, now a little-known technique, evolved gradually from a means to solve the genetic code, to a tool for completing the first genome sequence. Thus, I follow the research pathway of Fiers and his 'RNA phage lab' with their evolving experimental system from 1960 to the late 1970s. This study illuminates two decisive shifts in post-war biology: the emergence of molecular biology as a discipline in the 1960s in Europe and of genomics in the 1990s.

  8. Systems Level Dissection of Anaerobic Methane Cycling: Quantitative Measurements of Single Cell Ecophysiology, Genetic Mechanisms, and Microbial Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Orphan, Victoria [California Inst. of Technology (CalTech), Pasadena, CA (United States); Tyson, Gene [University of Queensland, Brisbane Australia; Meile, Christof [University of Georgia, Athens, Georgia; McGlynn, Shawn [California Inst. of Technology (CalTech), Pasadena, CA (United States); Yu, Hang [California Inst. of Technology (CalTech), Pasadena, CA (United States); Chadwick, Grayson [California Inst. of Technology (CalTech), Pasadena, CA (United States); Marlow, Jeffrey [California Inst. of Technology (CalTech), Pasadena, CA (United States); Trembath-Reichert, Elizabeth [California Inst. of Technology (CalTech), Pasadena, CA (United States); Dekas, Anne [California Inst. of Technology (CalTech), Pasadena, CA (United States); Hettich, Robert [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pan, Chongle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellisman, Mark [University of California San Diego; Hatzenpichler, Roland [California Inst. of Technology (CalTech), Pasadena, CA (United States); Skennerton, Connor [California Inst. of Technology (CalTech), Pasadena, CA (United States); Scheller, Silvan [California Inst. of Technology (CalTech), Pasadena, CA (United States)

    2017-12-25

    The global biological CH4 cycle is largely controlled through coordinated and often intimate microbial interactions between archaea and bacteria, the majority of which are still unknown or have been only cursorily identified. Members of the methanotrophic archaea, aka ‘ANME’, are believed to play a major role in the cycling of methane in anoxic environments coupled to sulfate, nitrate, and possibly iron and manganese oxides, frequently forming diverse physical and metabolic partnerships with a range of bacteria. The thermodynamic challenges overcome by the ANME and their bacterial partners and corresponding slow rates of growth are common characteristics in anaerobic ecosystems, and, in stark contrast to most cultured microorganisms, this type of energy and resource limited microbial lifestyle is likely the norm in the environment. While we have gained an in-depth systems level understanding of fast-growing, energy-replete microorganisms, comparatively little is known about the dynamics of cell respiration, growth, protein turnover, gene expression, and energy storage in the slow-growing microbial majority. These fundamental properties, combined with the observed metabolic and symbiotic versatility of methanotrophic ANME, make these cooperative microbial systems a relevant (albeit challenging) system to study and for which to develop and optimize culture-independent methodologies, which enable a systems-level understanding of microbial interactions and metabolic networks. We used an integrative systems biology approach to study anaerobic sediment microcosms and methane-oxidizing bioreactors and expanded our understanding of the methanotrophic ANME archaea, their interactions with physically-associated bacteria, ecophysiological characteristics, and underlying genetic basis for cooperative microbial methane-oxidation linked with different terminal electron acceptors. Our approach is inherently multi-disciplinary and multi-scaled, combining transcriptional and

  9. A new two dimensional spectral/spatial multi-diagonal code for noncoherent optical code division multiple access (OCDMA) systems

    Science.gov (United States)

    Kadhim, Rasim Azeez; Fadhil, Hilal Adnan; Aljunid, S. A.; Razalli, Mohamad Shahrazel

    2014-10-01

    A new two-dimensional code family, namely two-dimensional multi-diagonal (2D-MD) codes, is proposed for spectral/spatial non-coherent OCDMA systems based on the one-dimensional MD code. Since the MD code has the property of zero cross-correlation, the proposed 2D-MD code also has this property, so that the multiple-access interference (MAI) is fully eliminated and the phase-induced intensity noise (PIIN) is suppressed with the proposed code. Code performance is analyzed in terms of bit error rate (BER) while considering the effects of shot noise, PIIN, and thermal noise. The performance of the proposed code is compared with the related MD, modified quadratic congruence (MQC), two-dimensional perfect difference (2D-PD) and two-dimensional diluted perfect difference (2D-DPD) codes. The analytical and simulation results reveal that the proposed 2D-MD code outperforms the other codes. Moreover, a large number of simultaneous users can be accommodated at low BER and high data rate.
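
    The zero-cross-correlation property can be demonstrated with a simplified construction in the spirit of MD codes: each user's one-dimensional codeword occupies its own block of chips, and a two-dimensional spectral/spatial codeword is the outer product of a spectral and a spatial sequence. This is an illustrative stand-in, not the exact 2D-MD construction of the paper.

    ```python
    import numpy as np

    def zcc_codes(num_users, weight):
        """Simplified zero-cross-correlation family: user k gets `weight` chips of its own
        (block placement), so any two distinct codewords never overlap."""
        length = num_users * weight
        codes = np.zeros((num_users, length), dtype=int)
        for k in range(num_users):
            codes[k, k * weight:(k + 1) * weight] = 1
        return codes

    spectral = zcc_codes(num_users=3, weight=2)      # wavelength-domain sequences
    spatial = zcc_codes(num_users=2, weight=1)       # fiber/space-domain sequences

    # 2D codeword of user (i, j): outer product of spectral code i and spatial code j.
    users = [(i, j) for i in range(3) for j in range(2)]
    codewords = {u: np.outer(spectral[u[0]], spatial[u[1]]) for u in users}

    # Cross-correlation (sum of element-wise products) is zero for every distinct pair.
    for a in users:
        for b in users:
            if a != b:
                assert np.sum(codewords[a] * codewords[b]) == 0
    print("all", len(users), "users mutually orthogonal with zero cross-correlation")
    ```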

  10. Noise genetics: inferring protein function by correlating phenotype with protein levels and localization in individual human cells.

    Directory of Open Access Journals (Sweden)

    Shlomit Farkash-Amar

    2014-03-01

    Full Text Available To understand gene function, genetic analysis uses large perturbations such as gene deletion, knockdown or over-expression. Large perturbations have drawbacks: they move the cell far from its normal working point, and can thus be masked by off-target effects or compensation by other genes. Here, we offer a complementary approach, called noise genetics. We use natural cell-cell variations in protein level and localization, and correlate them to the natural variations of the phenotype of the same cells. Observing these variations is made possible by recent advances in dynamic proteomics that allow measuring proteins over time in individual living cells. Using motility of human cancer cells as a model system, and time-lapse microscopy on 566 fluorescently tagged proteins, we found 74 candidate motility genes whose level or localization strongly correlate with motility in individual cells. We recovered 30 known motility genes, and validated several novel ones by mild knockdown experiments. Noise genetics can complement standard genetics for a variety of phenotypes.
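
    The core computation, correlating per-cell protein level with per-cell phenotype across hundreds of cells and proteins, can be sketched as below. The data are synthetic, and the choice of Spearman correlation with Benjamini-Hochberg correction is an assumption rather than the paper's exact statistics.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(5)
    n_cells, n_proteins = 300, 566

    # Synthetic single-cell data: protein levels and a motility phenotype per cell,
    # with two "true" motility proteins planted at indices 0 and 1.
    levels = rng.lognormal(mean=0.0, sigma=0.5, size=(n_cells, n_proteins))
    motility = 0.8 * levels[:, 0] - 0.6 * levels[:, 1] + rng.normal(size=n_cells)

    # Per-protein correlation between level and phenotype across individual cells.
    results = [(p, *spearmanr(levels[:, p], motility)) for p in range(n_proteins)]

    # Benjamini-Hochberg FDR control at 5% over all proteins.
    results.sort(key=lambda r: r[2])                       # sort by p-value
    m, q = n_proteins, 0.05
    k_max = 0
    for rank, r in enumerate(results, start=1):
        if r[2] <= rank * q / m:
            k_max = rank
    candidates = results[:k_max]

    print(f"{len(candidates)} candidate motility proteins; top hits:",
          [(p, round(rho, 2)) for p, rho, _ in candidates[:5]])
    ```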

  11. LORD: a phenotype-genotype semantically integrated biomedical data tool to support rare disease diagnosis coding in health information systems.

    Science.gov (United States)

    Choquet, Remy; Maaroufi, Meriem; Fonjallaz, Yannick; de Carrara, Albane; Vandenbussche, Pierre-Yves; Dhombres, Ferdinand; Landais, Paul

    Characterizing a rare disease diagnosis for a given patient is often done through experts' networks. It is a complex task that can evolve over time depending on the natural history of the disease and the evolution of scientific knowledge. Most rare diseases have genetic causes, and recent improvements in sequencing techniques contribute to the discovery of many new diseases every year. Diagnosis coding in the rare disease field requires data from multiple knowledge bases to be aggregated in order to offer the clinician a global information space from possible diagnoses to clinical signs (phenotypes) and known genetic mutations (genotype). Nowadays, the major barrier to the coding activity is the lack of consolidation of such information, which is scattered across different thesauri such as Orphanet, OMIM or HPO. The Linking Open data for Rare Diseases (LORD) web portal we developed stands as the first attempt to fill this gap by offering an integrated view of 8,400 rare diseases linked to more than 14,500 signs and 3,270 genes. The application provides a browsing feature to navigate through the relationships between diseases, signs and genes, and some Application Programming Interfaces to help its integration into health information systems in routine use.

  12. QR-codes as a tool to increase physical activity level among school children during class hours

    DEFF Research Database (Denmark)

    Christensen, Jeanette Reffstrup; Kristensen, Allan; Bredahl, Thomas Viskum Gjelstrup

    the students' physical activity level during class hours. Methods: A before-after study was used to examine 12 students' physical activity levels, measured with pedometers for six lessons: three lessons of traditional teaching and three lessons where QR-codes were used for orienteering in the school area...... as old fashioned. The students also felt positive about being physically active during teaching. Discussion and conclusion: QR-codes as a tool for teaching are usable for making students more physically active during teaching. The students were excited about using QR-codes and they experienced good motivation......QR-codes as a tool to increase physical activity level among school children during class hours Introduction: Danish students are no longer fulfilling recommendations for everyday physical activity. Since August 2014, Danish students in public schools are therefore required to be physically active...

  13. Cardinality enhancement utilizing Sequential Algorithm (SeQ) code in OCDMA system

    Directory of Open Access Journals (Sweden)

    Fazlina C. A. S.

    2017-01-01

    Full Text Available Optical Code Division Multiple Access (OCDMA) has become important with the increasing demand for high capacity and speed in optical communication networks, because of the high efficiency that the OCDMA technique can achieve, whereby the fibre bandwidth is fully used. In this paper we focus on the Sequential Algorithm (SeQ) code with the AND detection technique, using the Optisystem design tool. The results revealed that the SeQ code is capable of eliminating Multiple Access Interference (MAI) and improving the Bit Error Rate (BER), the Phase Induced Intensity Noise (PIIN) and the orthogonality between users in the system. From the results, SeQ shows good BER performance and is capable of accommodating 190 simultaneous users, in contrast with existing codes. Thus, the SeQ code enhances the system by about 36% and 111% with respect to the FCC and DCS codes. In addition, SeQ has a good BER performance of 10^-25 at 155 Mbps in comparison with 622 Mbps, 1 Gbps and 2 Gbps bit rates. From the plotted graph, the 155 Mbps bit rate is a suitable speed for FTTH and LAN networks. A conclusion can be drawn based on the superior performance of the SeQ code. Thus, these codes will give an opportunity in OCDMA systems for a better quality of service in optical access networks for future generations' usage

  14. Cardinality enhancement utilizing Sequential Algorithm (SeQ) code in OCDMA system

    Science.gov (United States)

    Fazlina, C. A. S.; Rashidi, C. B. M.; Rahman, A. K.; Aljunid, S. A.

    2017-11-01

    Optical Code Division Multiple Access (OCDMA) has become important with the increasing demand for high capacity and speed in optical communication networks, because of the high efficiency that the OCDMA technique can achieve, so that fibre bandwidth is fully used. In this paper we focus on the Sequential Algorithm (SeQ) code with the AND detection technique using the Optisystem design tool. The results reveal that the SeQ code is capable of eliminating Multiple Access Interference (MAI) and improving Bit Error Rate (BER), Phase Induced Intensity Noise (PIIN) and orthogonality between users in the system. From the results, SeQ shows good BER performance and is capable of accommodating 190 simultaneous users, in contrast with existing codes. Thus, the SeQ code enhances the system by about 36% and 111% relative to the FCC and DCS codes. In addition, SeQ shows good BER performance of 10^-25 at 155 Mbps in comparison with 622 Mbps, 1 Gbps and 2 Gbps bit rates. From the plotted graph, a 155 Mbps bit rate is a suitable speed for FTTH and LAN networks. A conclusion can be drawn based on the superior performance of the SeQ code. Thus, these codes offer an opportunity for OCDMA systems to provide better quality of service in optical access networks for future generations' usage

  15. [Reverse genetics system of rotaviruses: development and application for analysis of VP4 spike protein].

    Science.gov (United States)

    Komoto, Satoshi

    2013-01-01

    The rotavirus genome is composed of 11 gene segments of double-stranded (ds)RNA. Reverse genetics is a powerful and ideal methodology for the molecular analysis of virus biology, as it enables the virus genome to be artificially manipulated. Although reverse genetics systems exist for nearly all major groups of RNA viruses, development of such a system for rotaviruses has been more challenging, owing in part to the technical complexity of manipulating their multi-segmented genome. A breakthrough in the field of rotavirus reverse genetics came in 2006, when we established the first reverse genetics system for rotaviruses, a partially plasmid-based system that permits replacement of a viral gene segment with the aid of a helper virus. Although this helper virus-driven system is technically limited and yields low levels of recombinant viruses, it allows alteration of the rotavirus genome, thus contributing to our understanding of these medically important viruses. In this review, I describe the development and application of our rotavirus reverse genetics system, and its future perspectives.

  16. Code for calculation of spreading of radioactivity in reactor containment systems

    International Nuclear Information System (INIS)

    Vertes, P.

    1992-09-01

    A detailed description of the new version of TIBSO code is given, with applications for accident analysis in a reactor containment system. The TIBSO code can follow the nuclear transition and the spatial migration of radioactive materials. The modelling of such processes is established in a very flexible way enabling the user to investigate a wide range of problems. The TIBSO code system is described in detail, taking into account the new developments since 1983. Most changes improve the capabilities of the code. The new version of TIBSO system is written in FORTRAN-77 and can be operated both under VAX VMS and PC DOS. (author) 5 refs.; 3 figs.; 21 tabs

  17. The computer code system for reactor radiation shielding in design of nuclear power plant

    International Nuclear Information System (INIS)

    Li Chunhuai; Fu Shouxin; Liu Guilian

    1995-01-01

    The computer code system used in reactor radiation shielding design of nuclear power plants includes source term codes, discrete ordinates transport codes, Monte Carlo and Albedo Monte Carlo codes, kernel integration codes, an optimization code, a temperature field code, a skyshine code, coupling calculation codes and some processing codes for data libraries. This computer code system offers a satisfactory variety of codes and complete sets of data libraries. It is widely used in reactor radiation shielding design and safety analysis of nuclear power plants and other nuclear facilities

  18. LOLA SYSTEM: A code block for nodal PWR simulation. Part. I - Simula-3 Code

    Energy Technology Data Exchange (ETDEWEB)

    Aragones, J M; Ahnert, C; Gomez Santamaria, J; Rodriguez Olabarria, I

    1985-07-01

    Description of the theory and users manual of the SIMULA-3 code, which is part of the core calculation system by nodal theory in one group, called LOLA SYSTEM. SIMULA-3 is the main module of the system, it uses a modified nodal theory, with interface leakages equivalent to the diffusion theory. (Author) 4 refs.

  19. LOLA SYSTEM: A code block for nodal PWR simulation. Part. I - Simula-3 Code

    International Nuclear Information System (INIS)

    Aragones, J. M.; Ahnert, C.; Gomez Santamaria, J.; Rodriguez Olabarria, I.

    1985-01-01

    Description of the theory and users manual of the SIMULA-3 code, which is part of the core calculation system by nodal theory in one group, called LOLA SYSTEM. SIMULA-3 is the main module of the system, it uses a modified nodal theory, with interface leakages equivalent to the diffusion theory. (Author) 4 refs

  20. Dynamic detection technology of malicious code for Android system

    Directory of Open Access Journals (Sweden)

    Li Boya

    2017-02-01

    Full Text Available With the increasing popularization of mobile phones, people's dependence on them is rising and security problems are becoming more and more prominent. Based on the calling of APK file permissions and API functions in the Android system, this paper proposes a dynamic detection method based on API interception technology to detect malicious code. The experimental results show that this method can effectively detect malicious code in the Android system.

  1. Promoter Analysis Reveals Globally Differential Regulation of Human Long Non-Coding RNA and Protein-Coding Genes

    KAUST Repository

    Alam, Tanvir

    2014-10-02

    Transcriptional regulation of protein-coding genes is increasingly well-understood on a global scale, yet no comparable information exists for long non-coding RNA (lncRNA) genes, which were recently recognized to be as numerous as protein-coding genes in mammalian genomes. We performed a genome-wide comparative analysis of the promoters of human lncRNA and protein-coding genes, finding global differences in specific genetic and epigenetic features relevant to transcriptional regulation. These two groups of genes are hence subject to separate transcriptional regulatory programs, including distinct transcription factor (TF) proteins that significantly favor lncRNA, rather than coding-gene, promoters. We report a specific signature of promoter-proximal transcriptional regulation of lncRNA genes, including several distinct transcription factor binding sites (TFBS). Experimental DNase I hypersensitive site profiles are consistent with active configurations of these lncRNA TFBS sets in diverse human cell types. TFBS ChIP-seq datasets confirm the binding events that we predicted using computational approaches for a subset of factors. For several TFs known to be directly regulated by lncRNAs, we find that their putative TFBSs are enriched at lncRNA promoters, suggesting that the TFs and the lncRNAs may participate in a bidirectional feedback loop regulatory network. Accordingly, cells may be able to modulate lncRNA expression levels independently of mRNA levels via distinct regulatory pathways. Our results also raise the possibility that, given the historical reliance on protein-coding gene catalogs to define the chromatin states of active promoters, a revision of these chromatin signature profiles to incorporate expressed lncRNA genes is warranted in the future.
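
    As a small illustration of the kind of promoter-level comparison described above, the sketch below tests whether a hypothetical transcription factor's binding sites are over-represented at lncRNA promoters relative to protein-coding promoters using a one-sided Fisher's exact test; all counts are invented and the record's actual analysis pipeline is not reproduced here.

```python
# Minimal sketch of a promoter-level TFBS enrichment test of the kind described
# above: for one TF, compare how often its binding site is found in lncRNA
# promoters versus protein-coding promoters. All counts below are hypothetical.
from scipy.stats import fisher_exact

lnc_with_site, lnc_total = 1200, 10000       # hypothetical lncRNA promoter counts
coding_with_site, coding_total = 600, 10000  # hypothetical coding promoter counts

table = [
    [lnc_with_site, lnc_total - lnc_with_site],
    [coding_with_site, coding_total - coding_with_site],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.2e}")
```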

  2. User effects on the transient system code calculations. Final report

    International Nuclear Information System (INIS)

    Aksan, S.N.; D'Auria, F.

    1995-01-01

    Large thermal-hydraulic system codes are widely used to perform safety and licensing analyses of nuclear power plants and to optimize operational procedures and the plant design itself. Evaluation of the capabilities of these codes is accomplished by comparing the code predictions with measured experimental data obtained from various types of separate effects and integral test facilities. In recent years, some attempts have been made to establish methodologies to evaluate the accuracy and the uncertainty of code predictions and, consequently, to judge the acceptability of the codes. In none of these methodologies has the influence of the code user on the calculated results been directly addressed. In this paper, the results of the investigations on user effects for thermal-hydraulic transient system codes are presented and discussed on the basis of some case studies. The general findings of the investigations show that, in addition to user effects, there are other reasons that affect the results of the calculations and which are hidden under user effects. Both the hidden factors and the direct user effects are discussed in detail, and general recommendations and conclusions are presented to control and limit them

  3. Achievable Performance of Zero-Delay Variable-Rate Coding in Rate-Constrained Networked Control Systems with Channel Delay

    DEFF Research Database (Denmark)

    Barforooshan, Mohsen; Østergaard, Jan; Stavrou, Fotios

    2017-01-01

    This paper presents an upper bound on the minimum data rate required to achieve a prescribed closed-loop performance level in networked control systems (NCSs). The considered feedback loop includes a linear time-invariant (LTI) plant with single measurement output and single control input. Moreover, in this NCS, a causal but otherwise unconstrained feedback system carries out zero-delay variable-rate coding and control. Between the encoder and decoder, data is exchanged over a rate-limited noiseless digital channel with a known constant time delay. Here we propose a linear source-coding scheme...

  4. Catecholaminergic systems in stress: structural and molecular genetic approaches.

    Science.gov (United States)

    Kvetnansky, Richard; Sabban, Esther L; Palkovits, Miklos

    2009-04-01

    Stressful stimuli evoke complex endocrine, autonomic, and behavioral responses that are extremely variable and specific depending on the type and nature of the stressors. We first provide a short overview of physiology, biochemistry, and molecular genetics of sympatho-adrenomedullary, sympatho-neural, and brain catecholaminergic systems. Important processes of catecholamine biosynthesis, storage, release, secretion, uptake, reuptake, degradation, and transporters in acutely or chronically stressed organisms are described. We emphasize the structural variability of catecholamine systems and the molecular genetics of enzymes involved in biosynthesis and degradation of catecholamines and transporters. Characterization of enzyme gene promoters, transcriptional and posttranscriptional mechanisms, transcription factors, gene expression and protein translation, as well as different phases of stress-activated transcription and quantitative determination of mRNA levels in stressed organisms are discussed. Data from catecholamine enzyme gene knockout mice are shown. Interaction of catecholaminergic systems with other neurotransmitter and hormonal systems are discussed. We describe the effects of homotypic and heterotypic stressors, adaptation and maladaptation of the organism, and the specificity of stressors (physical, emotional, metabolic, etc.) on activation of catecholaminergic systems at all levels from plasma catecholamines to gene expression of catecholamine enzymes. We also discuss cross-adaptation and the effect of novel heterotypic stressors on organisms adapted to long-term monotypic stressors. The extra-adrenal nonneuronal adrenergic system is described. Stress-related central neuronal regulatory circuits and central organization of responses to various stressors are presented with selected examples of regulatory molecular mechanisms. Data summarized here indicate that catecholaminergic systems are activated in different ways following exposure to distinct

  5. A comparative evaluation of NDR and PSAR using the CASMO-3/MASTER code system

    International Nuclear Information System (INIS)

    Sim, Jeoung Hun; Kim, Han Gon

    2009-01-01

    In order to validate nuclear design data such as the nuclear design report (NDR) and data in the preliminary (or final) safety analysis report (PSAR/FSAR), and to use such data for the conceptual design of new plants, the CASMO-3/MASTER code system was selected as a utility code. The nuclear design of OPR1000 and APR1400 is performed with the DIT/ROCS code system. In contrast with this design code system, the accuracy of the CASMO-3/MASTER code system has not been verified. Relatively little design data has been calculated by the CASMO-3/MASTER code system for OPR1000 and APR1400, and a bias system has not yet been developed. As such, validation of the performance of the CASMO-3/MASTER code system is necessary. In order to validate the performance of the CASMO-3/MASTER code system and to develop a calculation methodology, a comparative evaluation with the NDR of Ulchin unit 4, cycle 1 (U4C1) and the PSAR of Shinkori units 3 and 4 is carried out. The results of this evaluation are presented in this paper

  6. Developmental system at the crossroads of system theory, computer science, and genetic engineering

    CERN Document Server

    Węgrzyn, Stefan; Vidal, Pierre

    1990-01-01

    Many facts were at the origin of the present monograph. The first is the beauty of maple leaves in Quebec forests in Fall. It raised the question: how does nature create and reproduce such beautiful patterns? The second was the reading of A. Lindenmayer's works on L systems. Finally came the discovery of "the secrets of DNA" together with many stimulating exchanges with biologists. Looking at such facts from the viewpoint of recursive numerical systems led to devising a simple model based on six elementary operations organized in a generating word, the analog of the program of a computer and of the genetic code of DNA in the cells of a living organism. It turned out that such a model, despite its simplicity, can account for a great number of properties of living organisms, e.g. their hierarchical structure, their ability to regenerate after a trauma, the possibility of cloning, their sensitivity to mutation, their growth, decay and reproduction. The model lends itself to analysis: the knowledge of the genera...

  7. Two-level mixed modeling of longitudinal pedigree data for genetic association analysis

    DEFF Research Database (Denmark)

    Tan, Q.

    2013-01-01

    Genetic association analysis of complex phenotypes under a longitudinal design involving pedigrees encounters the problem of correlation within pedigrees, which could affect statistical assessment of the genetic effects on both the mean level of the phenotype and its rate of change over the time... of follow-up. Approaches have been proposed to integrate kinship correlation into mixed effect models to explicitly model the genetic relationship, which has been proven an efficient way of dealing with sample clustering in pedigree data. Although useful for adjusting relatedness in the mixed... assess the genetic associations with the mean level and the rate of change of a phenotype, both with kinship correlation integrated in the mixed effect models. We apply our method to longitudinal pedigree data to estimate the genetic effects on systolic blood pressure measured over time in large pedigrees...
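
    A much-simplified, single-time-point illustration of the general idea, not the two-level model of this record: relatedness enters through a kinship matrix K in the phenotypic covariance, and the genotype effect is tested by generalised least squares with the variance components assumed known (a full mixed model would estimate them, e.g. by REML). The kinship structure, parameters and data below are simulated for the example.

```python
# Simplified illustration (not the record's actual method): a generalized
# least-squares association test in which relatedness enters through a known
# kinship matrix K. Variance components are assumed known here for brevity;
# a full mixed model would estimate them (e.g. by REML).
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical kinship matrix: block-diagonal families of size 4
# (self-kinship 0.5, within-family kinship 0.25).
K = np.kron(np.eye(n // 4), np.full((4, 4), 0.25) + 0.25 * np.eye(4))

sigma_g2, sigma_e2 = 0.5, 1.0                 # assumed-known variance components
V = sigma_g2 * 2 * K + sigma_e2 * np.eye(n)   # phenotypic covariance

g = rng.binomial(2, 0.3, size=n).astype(float)   # genotype coded 0/1/2
X = np.column_stack([np.ones(n), g])             # intercept + genotype
beta_true = np.array([0.0, 0.4])
y = X @ beta_true + rng.multivariate_normal(np.zeros(n), V)

Vinv = np.linalg.inv(V)
XtVinv = X.T @ Vinv
cov_beta = np.linalg.inv(XtVinv @ X)          # GLS covariance of the estimates
beta_hat = cov_beta @ (XtVinv @ y)            # GLS estimates
se = np.sqrt(np.diag(cov_beta))
print(f"genotype effect = {beta_hat[1]:.3f} +/- {se[1]:.3f}")
```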

  8. Using Genetic Algorithms for Building Metrics of Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2011-01-01

    Full Text Available The paper objective is to reveal the importance of genetic algorithms in building robust metrics of collaborative systems. The main types of collaborative systems in the economy are presented and some characteristics of genetic algorithms are described. A genetic algorithm was implemented in order to determine the local maximum and minimum points of the relative complexity function associated with a collaborative banking system. The intelligent collaborative systems based on genetic algorithms, representing the new generation of collaborative systems, are analyzed, and the implementation of auto-adaptive interfaces in a banking application is described.
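
    A minimal, generic sketch of the approach described in this record, assuming a stand-in one-dimensional "relative complexity" function (the article's actual function and GA parameters are not given here): a real-coded genetic algorithm with tournament selection, arithmetic crossover and Gaussian mutation searching for a local maximum. Negating the fitness would locate minima in the same way.

```python
# Generic sketch of the approach described above: a simple real-coded genetic
# algorithm searching for an extremum of a one-dimensional "relative complexity"
# function. The function below is a stand-in, not the one from the article.
import random

def complexity(x):
    # Hypothetical multimodal function standing in for the relative complexity
    # of a collaborative system as a function of one design parameter.
    return -((x - 2.0) ** 2) * ((x + 1.0) ** 2) + 3.0 * x

def genetic_maximum(fitness, lo, hi, pop_size=50, generations=200,
                    mutation_rate=0.2, sigma=0.1):
    population = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament selection.
            a, b = random.sample(population, 2)
            return a if fitness(a) > fitness(b) else b
        offspring = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            child = 0.5 * (p1 + p2)               # arithmetic crossover
            if random.random() < mutation_rate:   # Gaussian mutation
                child += random.gauss(0.0, sigma * (hi - lo))
            offspring.append(min(max(child, lo), hi))
        population = offspring
    return max(population, key=fitness)

best = genetic_maximum(complexity, lo=-3.0, hi=4.0)
print(f"approximate maximizer: {best:.3f}, value: {complexity(best):.3f}")
```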

  9. Maximum Likelihood Blind Channel Estimation for Space-Time Coding Systems

    Directory of Open Access Journals (Sweden)

    Hakan A. Çırpan

    2002-05-01

    Full Text Available Sophisticated signal processing techniques have to be developed for capacity enhancement of future wireless communication systems. In recent years, space-time coding is proposed to provide significant capacity gains over the traditional communication systems in fading wireless channels. Space-time codes are obtained by combining channel coding, modulation, transmit diversity, and optional receive diversity in order to provide diversity at the receiver and coding gain without sacrificing the bandwidth. In this paper, we consider the problem of blind estimation of space-time coded signals along with the channel parameters. Both conditional and unconditional maximum likelihood approaches are developed and iterative solutions are proposed. The conditional maximum likelihood algorithm is based on iterative least squares with projection whereas the unconditional maximum likelihood approach is developed by means of finite state Markov process modelling. The performance analysis issues of the proposed methods are studied. Finally, some simulation results are presented.
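
    The record names the conditional ML solution as iterative least squares with projection. The following toy sketch illustrates that idea in a deliberately simplified setting (single transmit antenna, flat fading, BPSK) rather than the paper's space-time coded model; the point shown is the alternation between a least-squares channel estimate and a projection of the symbol estimates onto the signal alphabet.

```python
# Toy illustration of "iterative least squares with projection" for blind
# channel estimation, in a much simpler setting than the paper (single
# transmit antenna, flat fading, BPSK). Symbols and channel are estimated by
# alternating least squares, projecting symbol estimates onto the alphabet.
import numpy as np

rng = np.random.default_rng(1)
n_rx, n_sym, snr = 4, 100, 15                     # antennas, symbols, SNR in dB

h = (rng.standard_normal(n_rx) + 1j * rng.standard_normal(n_rx)) / np.sqrt(2)
s = rng.choice([-1.0, 1.0], size=n_sym)           # BPSK symbols
noise_std = 10 ** (-snr / 20)
Y = np.outer(h, s) + noise_std * (rng.standard_normal((n_rx, n_sym))
                                  + 1j * rng.standard_normal((n_rx, n_sym)))

s_hat = rng.choice([-1.0, 1.0], size=n_sym)       # random initial symbols
for _ in range(20):
    h_hat = Y @ s_hat / (s_hat @ s_hat)           # LS channel given symbols
    s_soft = (h_hat.conj() @ Y) / (h_hat.conj() @ h_hat)
    s_hat = np.sign(s_soft.real)                  # project onto BPSK alphabet

# A sign ambiguity is inherent to blind estimation; resolve it for scoring.
errors = min(np.sum(s_hat != s), np.sum(-s_hat != s))
print(f"symbol errors after convergence: {errors} / {n_sym}")
```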

  10. Conceptual Approach to Forming the Basic Code of Neo-Industrial Development of a Region

    Directory of Open Access Journals (Sweden)

    Elena Leonidovna Andreeva

    2017-09-01

    Full Text Available In the article, the authors propose the conceptual fundamentals of the “code approach” to regional neo-industrial development. The purpose of the research is to reveal the essence of the transition to a new type of industrial and economic relations through the prism of the “genetic codes” of the region. We consider these codes as a system of the “racial memory” of a territory, which determines the specificity and features of the realization of neo-industrialization. We substantiated the hypothesis about the influence of the “genetic codes” of the region on the effectiveness of neo-industrialization. We have defined the participants, that is, the carriers of the codes in the transformation of regional inheritance for the stimulation of the neo-industrial development of the region’s economy. The subject matter of the research is the distinctive features of the functioning of the determinative codes of the region. Their content determines the socio-economic specificity of the region and the features of innovative, informational, value-based and competence-based development of the territory. The determinative codes generate the dynamic codes of the region, which are understood as their derivatives. They have a high probability of occurrence, higher speed of development and distribution, and internal forces that make possible the self-development of the region. The scientific contribution is the substantiation of the basic code of regional neo-industrial development. It represents the evolutionary accumulation of the rapid changes of its innovative, informational, value-based and competence-based codes, stimulating the generation and implementation of new ideas by economic entities adapted to the historical and cultural conditions. The article presents the code model of neo-industrial development of the region described by formulas. We applied the system analysis methods, historical and civilization approaches, evolutionary and

  11. A Monte Carlo neutron transport code for eigenvalue calculations on a dual-GPU system and CUDA environment

    Energy Technology Data Exchange (ETDEWEB)

    Liu, T.; Ding, A.; Ji, W.; Xu, X. G. [Nuclear Engineering and Engineering Physics, Rensselaer Polytechnic Inst., Troy, NY 12180 (United States); Carothers, C. D. [Dept. of Computer Science, Rensselaer Polytechnic Inst. RPI (United States); Brown, F. B. [Los Alamos National Laboratory (LANL) (United States)

    2012-07-01

    The Monte Carlo (MC) method is able to accurately calculate eigenvalues in reactor analysis. Its lengthy computation time can be reduced by general-purpose computing on Graphics Processing Units (GPU), one of the latest parallel computing techniques under development. Porting a regular transport code to GPU is usually very straightforward due to the 'embarrassingly parallel' nature of MC code. However, the situation is different for eigenvalue calculations in that they are performed on a generation-by-generation basis and the thread coordination must be explicitly taken care of. This paper presents our effort to develop such a GPU-based MC code in the Compute Unified Device Architecture (CUDA) environment. The code is able to perform eigenvalue calculations for simple geometries on a multi-GPU system. The specifics of the algorithm design, including thread organization and memory management, are described in detail. The original CPU version of the code was tested on an Intel Xeon X5660 2.8 GHz CPU, and the adapted GPU version was tested on NVIDIA Tesla M2090 GPUs. Double-precision floating point format was used throughout the calculation. The results showed that speedups of 7.0 and 33.3 were obtained for a bare spherical core and a binary slab system, respectively. The speedup factor was further increased by a factor of ~2 on a dual GPU system. The upper limit of device-level parallelism was analyzed, and a possible method to enhance the thread-level parallelism was proposed. (authors)
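
    As a plain illustration of the generation-by-generation structure mentioned above, the sketch below runs a toy CPU Monte Carlo eigenvalue iteration for a bare homogeneous sphere; it is not the paper's CUDA implementation, and the cross sections, radius and neutron yield are invented for the example (scattering is ignored for brevity).

```python
# CPU toy (not the paper's CUDA code) showing the generation-by-generation
# structure of a Monte Carlo eigenvalue calculation for a bare homogeneous
# sphere. All cross sections are illustrative, not real nuclear data, and
# scattering is ignored so every collision ends the neutron history.
import numpy as np

rng = np.random.default_rng(2)
R = 8.0                                   # sphere radius (cm), illustrative
sigma_t, sigma_f, nu = 0.30, 0.10, 2.5    # total/fission cross sections (1/cm)

def run_generation(sites, n_hist):
    """Start n_hist neutrons from the fission-site bank; return the new bank."""
    starts = sites[rng.integers(len(sites), size=n_hist)]
    new_sites = []
    for r0 in starts:
        mu = rng.uniform(-1.0, 1.0)                  # isotropic direction
        phi = rng.uniform(0.0, 2 * np.pi)
        sin_t = np.sqrt(1.0 - mu * mu)
        d = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), mu])
        pos = r0 + d * rng.exponential(1.0 / sigma_t)  # flight to collision
        if np.dot(pos, pos) > R * R:
            continue                                 # leaked out of the sphere
        if rng.random() < sigma_f / sigma_t:         # collision is a fission
            new_sites.extend([pos] * rng.poisson(nu))
        # otherwise the neutron is captured
    return new_sites

sites = [np.zeros(3)]                     # initial guess for the fission source
n_hist, k_estimates = 2000, []
for generation in range(30):
    new_sites = run_generation(np.array(sites), n_hist)
    k_estimates.append(len(new_sites) / n_hist)      # neutrons produced/started
    sites = new_sites if new_sites else [np.zeros(3)]

print(f"k-eff estimate (generations 10-29): {np.mean(k_estimates[10:]):.3f}")
```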

  12. A Monte Carlo neutron transport code for eigenvalue calculations on a dual-GPU system and CUDA environment

    International Nuclear Information System (INIS)

    Liu, T.; Ding, A.; Ji, W.; Xu, X. G.; Carothers, C. D.; Brown, F. B.

    2012-01-01

    The Monte Carlo (MC) method is able to accurately calculate eigenvalues in reactor analysis. Its lengthy computation time can be reduced by general-purpose computing on Graphics Processing Units (GPU), one of the latest parallel computing techniques under development. Porting a regular transport code to GPU is usually very straightforward due to the 'embarrassingly parallel' nature of MC code. However, the situation is different for eigenvalue calculations in that they are performed on a generation-by-generation basis and the thread coordination must be explicitly taken care of. This paper presents our effort to develop such a GPU-based MC code in the Compute Unified Device Architecture (CUDA) environment. The code is able to perform eigenvalue calculations for simple geometries on a multi-GPU system. The specifics of the algorithm design, including thread organization and memory management, are described in detail. The original CPU version of the code was tested on an Intel Xeon X5660 2.8 GHz CPU, and the adapted GPU version was tested on NVIDIA Tesla M2090 GPUs. Double-precision floating point format was used throughout the calculation. The results showed that speedups of 7.0 and 33.3 were obtained for a bare spherical core and a binary slab system, respectively. The speedup factor was further increased by a factor of ∼2 on a dual GPU system. The upper limit of device-level parallelism was analyzed, and a possible method to enhance the thread-level parallelism was proposed. (authors)

  13. Design evaluation on sodium piping system and comparison of the design codes

    International Nuclear Information System (INIS)

    Lee, Dong Won; Jeong, Ji Young; Lee, Yong Bum; Lee, Hyeong Yeon

    2015-01-01

    A large-scale sodium test loop of STELLA-1 (Sodium integral effect test loop for safety simulation and assessment) with two main piping systems has been installed at KAERI. In this study, design evaluations on the main sodium piping systems in STELLA-1 have been conducted according to the DBR (design by rule) codes of the ASME B31.1 and RCC-MRx RB-3600. In addition, design evaluations according to the DBA (design by analysis) code of the ASME Section III Subsection NB-3200 have been conducted. The evaluation results for the present piping systems showed that results from the DBR codes were more conservative than those from the DBA code, and among the DBR codes, the non-nuclear code of the ASME B31.1 was more conservative than the French nuclear DBR code of the RCC-MRx RB-3600. The conservatism on the DBR codes of the ASME B31.1 and RCC-MRx RB-3600 was quantified based on the present sodium piping analyses.

  14. Design evaluation on sodium piping system and comparison of the design codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won; Jeong, Ji Young; Lee, Yong Bum; Lee, Hyeong Yeon [KAERI, Daejeon (Korea, Republic of)

    2015-03-15

    A large-scale sodium test loop of STELLA-1 (Sodium integral effect test loop for safety simulation and assessment) with two main piping systems has been installed at KAERI. In this study, design evaluations on the main sodium piping systems in STELLA-1 have been conducted according to the DBR (design by rule) codes of the ASME B31.1 and RCC-MRx RB-3600. In addition, design evaluations according to the DBA (design by analysis) code of the ASME Section III Subsection NB-3200 have been conducted. The evaluation results for the present piping systems showed that results from the DBR codes were more conservative than those from the DBA code, and among the DBR codes, the non-nuclear code of the ASME B31.1 was more conservative than the French nuclear DBR code of the RCC-MRx RB-3600. The conservatism on the DBR codes of the ASME B31.1 and RCC-MRx RB-3600 was quantified based on the present sodium piping analyses.

  15. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  16. A systematic literature review of automated clinical coding and classification systems.

    Science.gov (United States)

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  17. A bar-code reader for an alpha-beta automatic counting system - FAG

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, S; Shemesh, Y; Ankry, N; Assido, H; German, U; Peled, O [Israel Atomic Energy Commission, Beersheba (Israel). Nuclear Research Center-Negev

    1996-12-01

    A bar-code laser system for sample number reading was integrated into the FAG Alpha-Beta automatic counting system. The sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. Installation of the bar-code reader system required several modifications: Mechanical changes in the automatic sample changer, design and production of new sample holders, modification of the sample planchettes, changes in the electronic system, update of the operating software of the system (authors).

  18. A bar-code reader for an alpha-beta automatic counting system - FAG

    International Nuclear Information System (INIS)

    Levinson, S.; Shemesh, Y.; Ankry, N.; Assido, H.; German, U.; Peled, O.

    1996-01-01

    A bar-code laser system for sample number reading was integrated into the FAG Alpha-Beta automatic counting system. The sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. Installation of the bar-code reader system required several modifications: Mechanical changes in the automatic sample changer, design and production of new sample holders, modification of the sample planchettes, changes in the electronic system, update of the operating software of the system (authors)

  19. Subchannel analysis of a boiloff experiment by a system thermalhydraulic code

    International Nuclear Information System (INIS)

    Bousbia-Salah, A.; D'Auria, F.

    2001-01-01

    This paper presents the results of a system thermal-hydraulic code using the sub-channel analysis approach to predict the Neptun boil-off experiments. This approach will be suitable for further work on coupling the system code with a 3D neutron kinetics code. The boil-off tests were conducted in order to simulate the consequences of loss of coolant inventory leading to uncovery and heat-up of the fuel elements of a nuclear reactor core. In this framework, the Neptun low-pressure test No. 5002, which is a good repeat experiment, is considered. The calculations were carried out using the system transient analysis code Relap5/Mod3.2. A detailed nodalization of the Neptun test section was developed. A reference case was run, and the overall data comparison shows good agreement between calculated and experimental thermal-hydraulic parameters. A series of sensitivity analyses was also performed in order to assess the code's prediction capabilities. The results obtained were largely satisfactory, which also demonstrates the reasonable success of the sub-channel analysis approach adopted in the present context for a system thermal-hydraulic code. (author)

  20. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in DWT based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard code, Orthogonal gold code and Golay complementary sequences) using Forward Error Correction (FEC) of the proposed system. The data is analyzed and is compared among different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...

  1. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division-multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10^-9. An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented, so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.

  2. Nifty Native Implemented Functions: low-level meets high-level code

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Erlang Native Implemented Functions (NIFs) allow developers to implement functions in C (or C++) rather than Erlang. NIFs are useful for integrating high performance or legacy code in Erlang applications. The talk will cover how to implement NIFs, use cases, and common pitfalls when employing them. Further, we will discuss how and why Erlang applications, such as Riak, use NIFs. About the speaker Ian Plosker is the Technical Lead, International Operations at Basho Technologies, the makers of the open source database Riak. He has been developing software professionally for 10 years and programming since childhood. Prior to working at Basho, he developed everything from CMS to bioinformatics platforms to corporate competitive intelligence management systems. At Basho, he's been helping customers be incredibly successful using Riak.

  3. Improvement of Level-1 PSA computer code package -A study for nuclear safety improvement-

    International Nuclear Information System (INIS)

    Park, Chang Kyu; Kim, Tae Woon; Ha, Jae Joo; Han, Sang Hoon; Cho, Yeong Kyun; Jeong, Won Dae; Jang, Seung Cheol; Choi, Young; Seong, Tae Yong; Kang, Dae Il; Hwang, Mi Jeong; Choi, Seon Yeong; An, Kwang Il

    1994-07-01

    This year is the second year of the Government-sponsored Mid- and Long-Term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into three main activities: (1) methodology development in under-developed fields such as risk assessment technology for plant shutdown and external events, (2) computer code package development for Level-1 PSA, (3) applications of new technologies to reactor safety assessment. First, in the area of PSA methodology development, foreign PSA reports on shutdown and external events have been reviewed and various PSA methodologies have been compared. The Level-1 PSA code KIRAP and the CCF analysis code COCOA have been converted from KOS to Windows. A human reliability database has also been established this year. In the area of new technology applications, fuzzy set theory and entropy theory are used to estimate component life and to develop a new measure of uncertainty importance. Finally, in the field of applying PSA techniques to reactor regulation, a strategic study to develop a dynamic risk management tool, PEPSI, and the determination of inspection and test priorities for motor-operated valves based on risk importance worths have been carried out. (Author)

  4. OSCAR-4 Code System Application to the SAFARI-1 Reactor

    International Nuclear Information System (INIS)

    Stander, Gerhardt; Prinsloo, Rian H.; Tomasevic, Djordje I.; Mueller, Erwin

    2008-01-01

    The OSCAR reactor calculation code system consists of a two-dimensional lattice code, the three-dimensional nodal core simulator code MGRAC and related service codes. The major difference between the new version of the OSCAR system, OSCAR-4, and its predecessor, OSCAR-3, is the new version of MGRAC which contains many new features and model enhancements. In this work some of the major improvements in the nodal diffusion solution method, history tracking, nuclide transmutation and cross section models are described. As part of the validation process of the OSCAR-4 code system (specifically the new MGRAC version), some of the new models are tested by comparing computational results to SAFARI-1 reactor plant data for a number of operational cycles and for varying applications. A specific application of the new features allows correct modeling of, amongst others, the movement of fuel-follower type control rods and dynamic in-core irradiation schedules. It is found that the effect of the improved control rod model, applied over multiple cycles of the SAFARI-1 reactor operation history, has a significant effect on in-cycle reactivity prediction and fuel depletion. (authors)

  5. Nonterminals, homomorphisms and codings in different variations of OL-systems. II. Nondeterministic systems

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Rozenberg, Grzegorz; Salomaa, Arto

    1974-01-01

    Continuing the work begun in Part I of this paper, we consider now variations of nondeterministic OL-systems. The present Part II of the paper contains a systematic classification of the effect of nonterminals, codings, weak codings, nonerasing homomorphisms and homomorphisms for all basic variat...

  6. Systemizers are better code-breakers: Self-reported systemizing predicts code-breaking performance in expert hackers and naïve participants

    Directory of Open Access Journals (Sweden)

    India eHarvey

    2016-05-01

    Full Text Available Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders - attention to detail and systemizing - may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e. crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001; Baron-Cohen et al., 2003). They were also tested with behavioural tasks involving code-breaking and a control task involving security x-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the x-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  7. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants.

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders-attention to detail and systemizing-may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  8. Radioactivities evaluation code system for high temperature gas cooled reactors during normal operation

    International Nuclear Information System (INIS)

    Ogura, Kenji; Morimoto, Toshio; Suzuki, Katsuo.

    1979-01-01

    A radioactivity evaluation code system for high temperature gas-cooled reactors during normal operation was developed to study the behavior of fission products (FP) in the plants. The system consists of a code for the calculation of the diffusion of FPs in fuel (FIPERX), a code for the deposition of FPs in the primary cooling system (PLATO), a code for the transfer and emission of FPs in nuclear power plants (FIPPI-2), and a code for the exposure dose due to emitted FPs (FEDOSE). The FIPERX code can calculate the changes over time of the distribution of FP concentration, the distribution of FP flow, the distribution of FP partial pressure, and the emission rate of FPs into the coolant. The amount of deposition of FPs and their distribution in the primary cooling system can be evaluated by the PLATO code. The FIPPI-2 code can be used for the estimation of the amount of FPs in nuclear power plants and the amount of FPs emitted from the plants. The exposure dose to residents around nuclear power plants during operation of the plants is calculated by the FEDOSE code. This code evaluates the dose due to external exposure in normal operation and in accidents, and the internal dose due to inhalation of the radioactive plume and intake of foods. Further studies of this code system by comparison with experimental data are considered. (Kato, T.)

  9. PASC-1, Petten AMPX-II/SCALE-3 Code System for Reactor Neutronics Calculation

    International Nuclear Information System (INIS)

    Yaoqing, W.; Oppe, J.; Haas, J.B.M. de; Gruppelaar, H.; Slobben, J.

    1995-01-01

    1 - Description of program or function: The Petten AMPX-II/SCALE-3 Code System PASC-1 is a reactor neutronics calculation programme system consisting of well known IBM-oriented codes, that have been translated into FORTRAN-77, for calculations on a CDC-CYBER computer. Thus, the portability of these codes has been increased. In this system, some AMPX-II and SCALE-3 modules, the one-dimensional transport code ANISN and the 1 to 3-dimensional diffusion code CITATION are linked together on the CDC-CYBER/855 computer. The new cell code XSDRNPM-S and the old XSDRN code are included in the system. Starting from an AMPX fine group library up to CITATION, calculations can be performed for each individual module. Existing AMPX master interface format libraries, such as CSRL-IV, JEF-1, IRI and SCALE-45, and the old XSDRN-formatted libraries such as the COBB library can be used for the calculations. The code system contains the following modules and codes at present: AIM, AJAX, MALOCS, NITAWL-S, REVERT-I, ICE-2, CONVERT, JUAN, OCTAGN, XSDRNPM-S, XSDRN, ANISN and CITATION. The system will be extended with other SCALE modules and transport codes. 2 - Method of solution: The PASC-1 system is based on AMPX-II/SCALE-3 modules. Except for some SCALE-3 modules taken from the SCALIAS package, the original AMPX-II modules were IBM versions written in FORTRAN IV. These modules have been translated into CDC FORTRAN V. In order to test these modules and link them with some codes, some of the sample problem calculations have been performed for the whole PASC-1 system. During these calculations, some FORTRAN-77 errors were found in MALOCS, REVERT, CONVERT and some subroutines of SUBLIB (FORTRAN-77 subroutine library). These errors have been corrected. Because many corrections were made for the REVERT module, it is renamed as REVERT-I (improved version of REVERT). After these corrections, the whole system is running on a CDC-CYBER Computer (NOS-BE operating system). 3 - Restrictions on the

  10. Role of horizontal gene transfer as a control on the coevolution of ribosomal proteins and the genetic code

    Energy Technology Data Exchange (ETDEWEB)

    Woese, Carl R.; Goldenfeld, Nigel; Luthey-Schulten, Zaida

    2011-03-31

    Our main goal is to develop the conceptual and computational tools necessary to understand the evolution of the universal processes of translation and replication and to identify events of horizontal gene transfer that occurred within the components. We will attempt to uncover the major evolutionary transitions that accompanied the development of protein synthesis by the ribosome and associated components of the translation apparatus. Our project goes beyond standard genomic approaches to explore homologs that are represented at both the structure and sequence level. Accordingly, use of structural phylogenetic analysis allows us to probe further back into deep evolutionary time than competing approaches, permitting greater resolution of primitive folds and structures. Specifically, our work focuses on the elements of translation, ranging from the emergence of the canonical genetic code to the evolution of specific protein folds, mediated by the predominance of horizontal gene transfer in early life. A unique element of this study is the explicit accounting for the impact of phenotype selection on translation, through a coevolutionary control mechanism. Our work contributes to DOE mission objectives through: (1) sophisticated computer simulation of protein dynamics and evolution, and the further refinement of techniques for structural phylogeny, which complement sequence information, leading to improved annotation of genomic databases; (2) development of evolutionary approaches to exploring cellular function and machinery in an integrated way; and (3) documentation of the phenotype interaction with translation over evolutionary time, reflecting the system response to changing selection pressures through horizontal gene transfer.

  11. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also to suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed in simple algebraic ways. Four sets of DEU code families based on the code weight W and number of users N are constructed for the combinations (even, even), (even, odd), (odd, odd) and (odd, even). This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and can support long spans with high data rates.
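
    The construction of the DEU families themselves is not reproduced here; the sketch below merely checks the property the record emphasises, namely that a SAC-OCDMA code matrix has in-phase cross correlation equal to one between any two users while each row has weight W. The example matrix is hand-made for illustration and is not an actual DEU code.

```python
# Generic property check for SAC-OCDMA codes with "ideal in-phase cross
# correlation": every pair of distinct users' code sequences overlaps in
# exactly one chip, while each sequence has weight W. The example matrix is
# a hand-made illustration, not an actual DEU code.
import numpy as np

def check_sac_code(C, weight):
    C = np.asarray(C)
    corr = C @ C.T                               # pairwise in-phase correlations
    auto_ok = bool(np.all(np.diag(corr) == weight))
    off_diag = corr[~np.eye(len(C), dtype=bool)]
    cross_ok = bool(np.all(off_diag == 1))
    return auto_ok, cross_ok

# Illustrative 3-user code with weight W = 3 and unity cross correlation:
# every pair of rows shares exactly one chip position.
example = [
    [1, 1, 1, 0, 0, 0, 0],
    [1, 0, 0, 1, 1, 0, 0],
    [1, 0, 0, 0, 0, 1, 1],
]
print(check_sac_code(example, weight=3))   # -> (True, True)
```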

  12. Crucial steps to life: From chemical reactions to code using agents.

    Science.gov (United States)

    Witzany, Guenther

    2016-02-01

    The concepts of the origin of the genetic code and the definitions of life changed dramatically after the RNA world hypothesis. Main narratives in molecular biology and genetics such as the "central dogma," "one gene one protein" and "non-coding DNA is junk" have since been falsified. RNA moved from being a transitional intermediate molecule to centre stage. Additionally, the abundance of empirical data concerning non-random genetic change operators such as the variety of mobile genetic elements, persistent viruses and defectives does not fit with the dominant narrative of replication errors (mutations) as the main driving force creating genetic novelty and diversity. The reductionistic and mechanistic views of the physico-chemical properties of the genetic code are no longer convincing as appropriate descriptions of the abundance of non-random genetic content operators which are active in natural genetic engineering and natural genome editing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. LevelScheme: A level scheme drawing and scientific figure preparation system for Mathematica

    Science.gov (United States)

    Caprio, M. A.

    2005-09-01

    LevelScheme is a scientific figure preparation system for Mathematica. The main emphasis is upon the construction of level schemes, or level energy diagrams, as used in nuclear, atomic, molecular, and hadronic physics. LevelScheme also provides a general infrastructure for the preparation of publication-quality figures, including support for multipanel and inset plotting, customizable tick mark generation, and various drawing and labeling tasks. Coupled with Mathematica's plotting functions and powerful programming language, LevelScheme provides a flexible system for the creation of figures combining diagrams, mathematical plots, and data plots. Program summary: Title of program: LevelScheme. Catalogue identifier: ADVZ. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVZ. Operating systems: any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux. Programming language used: Mathematica 4. Number of bytes in distributed program, including test and documentation: 3 051 807. Distribution format: tar.gz. Nature of problem: creation of level scheme diagrams; creation of publication-quality multipart figures incorporating diagrams and plots. Method of solution: a set of Mathematica packages has been developed, providing a library of level scheme drawing objects, tools for figure construction and labeling, and control code for producing the graphics.

  14. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Full Text Available Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for statistical analysis of genetic data compared with other fields of statistics. It is often a tremendous task for end-users to tailor them for particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R http://www.r-project.org, a free, flexible and platform-independent environment for statistical modelling and graphics, is explored as an integrated system for genetic data analysis. An overview of some packages currently available for the analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available code, it is a feasible, although challenging, task to develop it into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.

  15. Analog system for computing sparse codes

    Science.gov (United States)

    Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell

    2010-08-24

    A parallel dynamical system for computing sparse representations of data, i.e., representations in which the data can be fully described in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition, and solves a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
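
    The record describes an analog (hardware) system; the sketch below is a discrete-time software simulation of the same Locally Competitive Algorithm dynamics, using a soft threshold so that the network settles towards a solution of an L1-penalised sparse approximation problem. The dictionary, input signal and constants are arbitrary illustrations.

```python
# Discrete-time software simulation of the Locally Competitive Algorithm (LCA)
# dynamics described above (the record itself concerns an analog hardware
# implementation). Soft thresholding makes the network solve an L1-penalised
# sparse approximation problem; the dictionary and input here are random.
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 64, 256, 5                      # signal dim, dictionary size, sparsity

Phi = rng.standard_normal((n, m))
Phi /= np.linalg.norm(Phi, axis=0)        # unit-norm dictionary elements
a_true = np.zeros(m)
a_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
s = Phi @ a_true                          # input signal

lam, tau, dt, steps = 0.1, 10.0, 1.0, 400
b = Phi.T @ s                             # feedforward drive
G = Phi.T @ Phi - np.eye(m)               # lateral inhibition weights

def soft_threshold(u, lam):
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

u = np.zeros(m)                           # internal (membrane) potentials
for _ in range(steps):
    a = soft_threshold(u, lam)            # active coefficients (node outputs)
    u += (dt / tau) * (b - u - G @ a)     # LCA node dynamics

a = soft_threshold(u, lam)
print("nonzero coefficients:", np.count_nonzero(a))
print("relative reconstruction error:",
      np.linalg.norm(Phi @ a - s) / np.linalg.norm(s))
```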

  16. Bistability in self-activating genes regulated by non-coding RNAs

    International Nuclear Information System (INIS)

    Miro-Bueno, Jesus

    2015-01-01

    Non-coding RNA molecules are able to regulate gene expression and play an essential role in cells. On the other hand, bistability is an important behaviour of genetic networks. Here, we propose and study an ODE model in order to show how non-coding RNA can produce bistability in a simple way. The model comprises a single gene with positive feedback that is repressed by non-coding RNA molecules. We show how the values of all the reaction rates involved in the model are able to control the transitions between the high and low states. This new model may help to clarify the role of non-coding RNA molecules in genetic networks. These results may also be of interest in synthetic biology for developing new genetic memories and biomolecular devices based on non-coding RNAs
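
    To make the mechanism concrete, the following sketch integrates an illustrative ODE system, not the model from this record, in which a self-activating gene's mRNA is sequestered by a non-coding RNA; starting the integration from a low and a high initial condition reaches two different stable steady states, i.e. bistability. All equations and parameter values are invented for the illustration.

```python
# Illustrative ODE sketch (not the model from the record) of a self-activating
# gene whose mRNA is sequestered by a non-coding RNA. Integrating from two
# different initial conditions settles into two distinct steady states,
# i.e. bistable behaviour. All equations and parameters are made up.
import numpy as np
from scipy.integrate import solve_ivp

alpha, basal, K, n_hill = 10.0, 0.2, 4.0, 2   # mRNA synthesis (self-activation)
beta_p, beta_r = 2.0, 1.0                     # protein and ncRNA synthesis rates
delta_m, delta_p, delta_r = 1.0, 1.0, 1.0     # degradation rates
kappa = 2.0                                   # mRNA-ncRNA pairing (titration) rate

def rhs(t, y):
    m, p, r = y                               # mRNA, protein, non-coding RNA
    dm = (basal + alpha * p**n_hill / (K**n_hill + p**n_hill)
          - delta_m * m - kappa * m * r)
    dp = beta_p * m - delta_p * p
    dr = beta_r - delta_r * r - kappa * m * r
    return [dm, dp, dr]

for y0 in ([0.1, 0.1, 1.0], [5.0, 10.0, 0.1]):   # low vs. high initial state
    sol = solve_ivp(rhs, (0.0, 200.0), y0, rtol=1e-8)
    m_ss, p_ss, r_ss = sol.y[:, -1]
    print(f"start {y0} -> steady-state protein level {p_ss:.2f}")
```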

  17. APPC - A new standardised coding system for trans-organisational PACS retrieval

    International Nuclear Information System (INIS)

    Fruehwald, F.; Lindner, A.; Mostbeck, G.; Hruby, W.; Fruehwald-Pallamar, J.

    2010-01-01

    As part of a general strategy to integrate the health care enterprise, Austria plans to connect the Picture Archiving and Communication Systems (PACS) of all radiological institutions into a nationwide network. To facilitate the search for relevant correlative imaging data in the PACS of different organisations, a coding system was compiled for all radiological procedures and necessary anatomical details. This code, called the Austrian PACS Procedure Code (APPC), was granted the status of a standard under HL7. Examples are provided of effective coding and filtering when searching for relevant imaging material using the APPC, as well as the planned process for future adjustments of the APPC. The implementation and how the APPC will fit into the future electronic environment, which will include an electronic health act for all citizens in Austria, are discussed. A comparison to other nationwide electronic health record projects and coding systems is given. Limitations and possible use in physical storage media are contemplated. (orig.)

  18. Management-retrieval code system of fission barrier parameter sub-library

    International Nuclear Information System (INIS)

    Zhang Limin; Su Zongdi; Ge Zhigang

    1995-01-01

    The fission barrier parameter (FBP) library, which is a sub-library of the Chinese Evaluated Nuclear Parameter Library (CENPL), stores various widely used fission barrier parameters from different historical periods; the required fission barrier parameters can be retrieved by using the management-retrieval code system of the FBP sub-library. The function, features and operating instructions of the code system are described briefly

  19. Fuel management and core design code systems for pressurized water reactor neutronic calculations

    International Nuclear Information System (INIS)

    Ahnert, C.; Arayones, J.M.

    1985-01-01

    A package of connected code systems for the neutronic calculations relevant to fuel management and core design has been developed and applied, for validation, to the startup tests and first operating cycle of a 900 MW (electric) PWR. The package includes the MARIA code system for the modeling of the different types of PWR fuel assemblies, the CARMEN code system for detailed few-group diffusion calculations for PWR cores at operating and burnup conditions, and the LOLA code system for core simulation using one-group nodal theory parameters explicitly calculated from the detailed solutions.

  20. Systemization of burnup sensitivity analysis code

    International Nuclear Information System (INIS)

    Tatsumi, Masahiro; Hyoudou, Hideaki

    2004-02-01

    For the practical use of fast reactors, it is a very important subject to improve the prediction accuracy of neutronic properties in LMFBR cores, from the viewpoints of improving plant efficiency with rationally high-performance cores and of improving reliability and safety margins. A distinct improvement in the accuracy of nuclear core design has been accomplished by the development of an adjusted nuclear library using the cross-section adjustment method, in which the results of critical experiments such as JUPITER are reflected. In the design of large LMFBR cores, however, it is important to accurately estimate not only neutronic characteristics, such as reaction rate distribution and control rod worth, but also burnup characteristics, such as burnup reactivity loss and breeding ratio. For this purpose, it is desirable to improve the prediction accuracy of burnup characteristics using data widely obtained in actual cores such as the experimental fast reactor 'JOYO'. Burnup sensitivity analysis is needed to effectively use burnup characteristics data from actual cores within the cross-section adjustment method. So far, an analysis code for burnup sensitivity, SAGEP-BURN, has been developed and its effectiveness confirmed. However, the analysis sequence becomes inefficient because of the heavy burden placed on the user by the complexity of the burnup sensitivity theory and the limitations of the system. It is also desirable to rearrange the system for future revision, since it is becoming difficult to implement new functionalities in the existing large system. Simply unifying the computational components is not sufficient, since the computational sequence may change for each item being analyzed or for purposes such as the interpretation of physical meaning. Therefore, the current burnup sensitivity analysis code needs to be systemized into functional component blocks that can be divided or assembled as the occasion demands. For this

  1. Tritium module for ITER/Tiber system code

    International Nuclear Information System (INIS)

    Finn, P.A.; Willms, S.; Busigin, A.; Kalyanam, K.M.

    1988-01-01

    A tritium module was developed for the ITER/Tiber system code to provide information on capital costs, tritium inventory, power requirements and building volumes for these systems. In the tritium module, the main tritium subsystems (plasma processing, atmospheric cleanup, water cleanup and blanket processing) are each represented by simple scalable algorithms. 6 refs., 2 tabs
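
    The abstract does not give the algorithms themselves; the following is a purely illustrative sketch of the kind of "simple scalable algorithm" a systems code might use for one subsystem, with hold-up estimated from throughput and residence time and capital cost scaled by a power law. All coefficients and names are assumptions, not values from the ITER/Tiber module.

      # Illustrative subsystem scaling (hypothetical coefficients, not the actual module).
      def subsystem_estimate(throughput_g_per_day, residence_time_h,
                             ref_capacity_g_per_day, ref_cost_musd, scale_exponent=0.6):
          inventory_g = throughput_g_per_day * residence_time_h / 24.0   # tritium hold-up
          cost_musd = ref_cost_musd * (throughput_g_per_day / ref_capacity_g_per_day) ** scale_exponent
          return inventory_g, cost_musd

      # e.g. a plasma-processing train handling 1000 g/day with a 2 h residence time
      print(subsystem_estimate(1000.0, 2.0, ref_capacity_g_per_day=500.0, ref_cost_musd=10.0))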

  2. Analysis of PPM-CDMA and OPPM-CDMA communication systems with new optical code

    Science.gov (United States)

    Liu, F.; Ghafouri-Shiraz, H.

    2005-11-01

    A novel type of optical spreading sequence, named the 'new-Modified Prime Code (nMPC)', is proposed for use in synchronous direct-detection optical code-division multiple-access (CDMA) systems which employ both pulse position modulation (PPM) and overlapping pulse position modulation (OPPM) schemes. The upper bounds on the bit error rate (BER) for the nMPC used in PPM-CDMA systems are derived and compared with those of the corresponding systems using a modified prime code (MPC) and a padded modified prime code (PMPC). The nMPC is further applied to the OPPM-CDMA system and to the system with a proposed interference cancellation scheme. Our results show that, under the same conditions, the PPM-CDMA system performance improves more with the nMPC than with the other two traditional codes. Moreover, the system performance is significantly enhanced by the proposed interference reduction methods when the nMPC is used in the OPPM-CDMA systems.
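
    The nMPC construction itself is specific to the paper, but it belongs to the prime-code family: for a prime p, the classical construction below gives p sequences of length p*p and weight p; the MPC, PMPC and nMPC are modifications of this family. The sketch shows only the classical starting point, not the new code.

      # Classical prime code over GF(p): code i places a single '1' in block j at
      # position (i*j) mod p, giving sequences of length p*p and weight p.
      def prime_code(p):
          codes = []
          for i in range(p):
              word = [0] * (p * p)
              for j in range(p):
                  word[j * p + (i * j) % p] = 1
              codes.append(word)
          return codes

      for word in prime_code(5):
          print(''.join(map(str, word)))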

  3. JAERI thermal reactor standard code system for reactor design and analysis SRAC

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro

    1985-01-01

    SRAC, the JAERI thermal reactor standard code system for reactor design and analysis, developed at the Japan Atomic Energy Research Institute, is intended for all types of thermal neutron nuclear design and analysis. The code system has undergone extensive verification to confirm its functions, and has been used in core modification of the research reactor, detailed design of the multi-purpose high-temperature gas reactor and analysis of experiments with a critical assembly. In nuclear calculations with the code system, a multi-group lattice calculation is first made with the libraries. Then, with the resultant homogeneous equivalent group constants, the reactor core calculation is made. Described are the following: purpose and development of the code system, functions of the SRAC system, benchmark tests, usage status and future development. (Mori, K.)

  4. A fuzzy genetic approach for network reconfiguration to enhance voltage stability in radial distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Sahoo, N.C. [Faculty of Engineering and Technology, Multimedia University, Jalan Ayer Keroh Lama, Bukit Beruang, 75450 Melaka (Malaysia); Prasad, K. [Faculty of Information Science and Technology, Multimedia University, Jalan Ayer Keroh Lama, Bukit Beruang, 75450 Melaka (Malaysia)

    2006-11-15

    This paper presents a fuzzy genetic approach for the reconfiguration of radial distribution systems (RDS) so as to maximize the voltage stability of the network for a specific set of loads. The network reconfiguration involves a mechanism for selecting the best set of branches to be opened, one from each loop, such that the reconfigured RDS possesses the desired performance characteristics. This discrete solution space is better handled by the proposed scheme, which maximizes a suitable optimizing function (computed using two different approaches). In the first approach, this function is chosen as the average of a voltage stability index of all the buses in the RDS, while in the second approach, the complete RDS is reduced to a two-bus equivalent system and the optimizing function is the voltage stability index of this reduced two-bus system. The fuzzy genetic algorithm uses a suitable coding and decoding scheme for maintaining the radial nature of the network at every stage of genetic evolution, and it also uses a fuzzy rule-based mutation controller for efficient search of the solution space. This method, tested on 69-bus and 33-bus RDSs, shows promising results for both approaches. It is also observed that the network losses are reduced when the voltage stability is enhanced by the network reconfiguration. (author)
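
    A sketch of the encoding idea is given below: each gene selects one branch to open from its loop, which keeps every candidate network radial, and the mutation rate is adapted by a stand-in for the fuzzy rule base. The loop data, the fitness placeholder and the fuzzy rule are illustrative assumptions; the real optimizing function requires a load-flow solution to evaluate the voltage stability index.

      # Sketch of a loop-wise GA encoding with an adaptive (fuzzy-style) mutation rate.
      import random

      loops = [[1, 5, 9], [2, 6, 10, 14], [3, 7, 11]]      # candidate branches per loop (hypothetical)

      def random_individual():
          return [random.choice(branches) for branches in loops]   # one open branch per loop

      def fitness(individual):
          return -sum(individual)                          # placeholder for the voltage stability index

      def fuzzy_mutation_rate(diversity):
          return 0.05 if diversity > 0.2 else 0.2          # stand-in for the fuzzy rule base

      def evolve(pop_size=20, generations=50):
          population = [random_individual() for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=fitness, reverse=True)
              diversity = len({tuple(p) for p in population}) / pop_size
              rate = fuzzy_mutation_rate(diversity)
              children = []
              while len(children) < pop_size:
                  a, b = random.sample(population[:10], 2)
                  cut = random.randrange(1, len(loops))
                  child = a[:cut] + b[cut:]                # loop-wise crossover keeps radiality
                  if random.random() < rate:
                      k = random.randrange(len(loops))
                      child[k] = random.choice(loops[k])
                  children.append(child)
              population = children
          return max(population, key=fitness)

      print(evolve())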

  5. Coded aperture material motion detection system for the ACPR

    International Nuclear Information System (INIS)

    McArthur, D.A.; Kelly, J.G.

    1975-01-01

    Single LMFBR fuel pins are being irradiated in Sandia's Annular Core Pulsed Reactor (ACPR). In these experiments single fuel pins have been driven well into the melt and vaporization regions in transients with pulse widths of about 5 ms. The ACPR is being upgraded so that it can be used to irradiate bundles of seven LMFBR fuel pins. The coded aperture material motion detection system described is being developed for this upgraded ACPR, and has for its design goals 1 mm transverse resolution (i.e., in the axial and radial directions), depth resolution of a few cm, and time resolution of 0.1 ms. The target date for development of this system is fall 1977. The paper briefly reviews the properties of coded aperture imaging, describes one possible system for the ACPR upgrade, discusses experiments which have been performed to investigate the feasibility of such a system, and describes briefly the further work required to develop such a system. The type of coded aperture to be used has not yet been fixed, but a one-dimensional section of a Fresnel zone plate appears at this time to have significant advantages

  6. Coding Across Multicodes and Time in CDMA Systems Employing MMSE Multiuser Detector

    Directory of Open Access Journals (Sweden)

    Park Jeongsoon

    2004-01-01

    When combining a multicode CDMA system with convolutional coding, two methods have been considered in the literature. In one method, coding is across time in each multicode channel, while in the other, coding is across both multicodes and time. In this paper, a performance/complexity analysis of decoding metrics and trellis structures for the two schemes is carried out. It is shown that the latter scheme can exploit the multicode diversity inherent in convolutionally coded direct sequence code division multiple access (DS-CDMA) systems which employ minimum mean squared error (MMSE) multiuser detectors. In particular, when the MMSE detector provides sufficiently different signal-to-interference ratios (SIRs) for the multicode channels, coding across multicodes and time can obtain significant performance gain over coding across time, with nearly the same decoding complexity.

  7. Instance-based Policy Learning by Real-coded Genetic Algorithms and Its Application to Control of Nonholonomic Systems

    Science.gov (United States)

    Miyamae, Atsushi; Sakuma, Jun; Ono, Isao; Kobayashi, Shigenobu

    The stabilization control of nonholonomic systems has been extensively studied because it is essential for nonholonomic robot control problems. The difficulty in this problem is that a theoretical derivation of the control policy is not guaranteed to be achievable. In this paper, we present a reinforcement learning (RL) method with an instance-based policy (IBP) representation, in which control policies for this class are optimized with respect to user-defined cost functions. Direct policy search (DPS) is an approach to RL in which the policy is represented by parametric models and the model parameters are directly searched by optimization techniques including genetic algorithms (GAs). In the IBP representation, an instance consists of a state and action pair, and a policy consists of a set of instances. Several DPSs with IBP have been proposed previously; these methods sometimes fail to obtain optimal control policies when the state-action variables are continuous. In this paper, we present a real-coded GA for DPSs with IBP. Our method is specifically designed for continuous domains. Optimization of an IBP has three difficulties: high dimensionality, epistasis, and multi-modality. Our solution is designed to overcome these difficulties. Policy search with the IBP representation appears to be a high-dimensional optimization; however, the instances which can improve the fitness are often limited to active instances (instances used during evaluation). In fact, the number of active instances is small. Therefore, we treat the search problem as a low-dimensional problem by restricting the search variables to active instances only. It is commonly known that functions with epistasis can be efficiently optimized with crossovers which satisfy the inheritance of statistics. For an efficient search of the IBP, we propose an extended crossover-like mutation (extended XLM) which generates a new instance around an existing instance while satisfying the inheritance of statistics. For overcoming multi-modality, we
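
    A sketch of what an instance-based policy looks like in practice is given below: the policy stores (state, action) instances and applies the action of the nearest stored state, and the instances that actually fire during an episode are the "active" ones on which the search concentrates. The class, the distance measure and the numbers are illustrative assumptions; the extended XLM operator itself is not reproduced.

      # Sketch of an instance-based policy (IBP): the nearest stored state decides the action.
      import numpy as np

      class InstancePolicy:
          def __init__(self, states, actions):
              self.states = np.asarray(states, dtype=float)    # (n_instances, state_dim)
              self.actions = np.asarray(actions, dtype=float)  # (n_instances, action_dim)
              self.active = set()                              # indices of instances that fired

          def act(self, state):
              idx = int(np.argmin(np.linalg.norm(self.states - state, axis=1)))
              self.active.add(idx)
              return self.actions[idx]

      policy = InstancePolicy(states=[[0.0, 0.0], [1.0, 0.5], [-1.0, 0.2]],
                              actions=[[0.1], [-0.3], [0.4]])
      print(policy.act(np.array([0.9, 0.4])), sorted(policy.active))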

  8. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
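
    A small sketch of the graph construction used in this line of work is given below (definitions following the cited graph-theory approach): each trinucleotide b1b2b3 of a code X contributes the edges b1 -> b2b3 and b1b2 -> b3, circularity corresponds to this graph being acyclic, and self-complementarity means X coincides with the set of reversed complements of its trinucleotides. The example code is a small illustrative set, not the maximal code identified in genes.

      # Graph-based checks for a trinucleotide code X (sketch).
      COMPLEMENT = str.maketrans('ACGT', 'TGCA')

      def reverse_complement(trinucleotide):
          return trinucleotide.translate(COMPLEMENT)[::-1]

      def is_self_complementary(X):
          return set(X) == {reverse_complement(t) for t in X}

      def is_circular(X):
          edges = {}
          for t in X:                                   # b1 -> b2b3 and b1b2 -> b3
              edges.setdefault(t[0], set()).add(t[1:])
              edges.setdefault(t[:2], set()).add(t[2])
          def has_cycle(node, on_path, finished):
              if node in on_path:
                  return True
              if node in finished:
                  return False
              on_path.add(node)
              found = any(has_cycle(nxt, on_path, finished) for nxt in edges.get(node, ()))
              on_path.discard(node)
              finished.add(node)
              return found
          finished = set()
          return not any(has_cycle(v, set(), finished) for v in list(edges))

      X = ['AAC', 'GTT', 'GAC', 'GTC']
      print(is_self_complementary(X), is_circular(X))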

  9. Genetic architecture of circulating lipid levels

    DEFF Research Database (Denmark)

    Demirkan, Ayşe; Amin, Najaf; Isaacs, Aaron

    2011-01-01

    Serum concentrations of low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), triglycerides (TGs) and total cholesterol (TC) are important heritable risk factors for cardiovascular disease. Although genome-wide association studies (GWASs) of circulating lipid...... the ENGAGE Consortium GWAS on serum lipids, were applied to predict lipid levels in an independent population-based study, the Rotterdam Study-II (RS-II). We additionally tested for evidence of a shared genetic basis for different lipid phenotypes. Finally, the polygenic score approach was used to identify...... an alternative genome-wide significance threshold before pathway analysis and those results were compared with those based on the classical genome-wide significance threshold. Our study provides evidence suggesting that many loci influencing circulating lipid levels remain undiscovered. Cross-prediction models...
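
    As a pointer to the polygenic score approach mentioned above, the following is a minimal sketch: per-individual scores are dosage-weighted sums of discovery-cohort effect sizes for variants passing a chosen significance threshold. The arrays and the threshold are illustrative assumptions, not ENGAGE or Rotterdam Study data.

      # Minimal polygenic score sketch (illustrative data only).
      import numpy as np

      def polygenic_score(dosages, betas, pvalues, threshold=5e-8):
          keep = pvalues < threshold              # variants included at this threshold
          return dosages[:, keep] @ betas[keep]   # one score per individual

      dosages = np.array([[0, 1, 2, 1], [2, 2, 0, 1]], dtype=float)   # individuals x SNPs
      betas = np.array([0.12, -0.05, 0.30, 0.02])                     # discovery effect sizes
      pvalues = np.array([1e-9, 0.2, 1e-12, 0.04])
      print(polygenic_score(dosages, betas, pvalues))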

  10. Breathing (and Coding?) a Bit Easier: Changes to International Classification of Disease Coding for Pulmonary Hypertension.

    Science.gov (United States)

    Mathai, Stephen C; Mathew, Sherin

    2018-04-20

    The International Classification of Diseases (ICD) coding system is broadly utilized by healthcare providers, hospitals, healthcare payers, and governments to track health trends and statistics at the global, national, and local levels and to provide a reimbursement framework for medical care based upon diagnosis and severity of illness. The current iteration of the ICD system, ICD-10, was implemented in 2015. While many changes to the prior ICD-9 system were included in the ICD-10 system, the newer revision failed to adequately reflect advances in the clinical classification of certain diseases such as pulmonary hypertension (PH). Recently, a proposal to modify the ICD-10 codes for PH was considered and ultimately adopted for inclusion as updates to the ICD-10 coding system. While these revisions better reflect the current clinical classification of PH, further changes should be considered in the future to improve the accuracy and ease of coding for all forms of PH. Copyright © 2018. Published by Elsevier Inc.

  11. All about Genetics (For Parents)

    Science.gov (United States)

  12. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become commonplace. Although results frequently published in the literature report calculated k-eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k-eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k-eff values for the individual generations in the computer simulation, not the standard deviation of the computed k-eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that all the k-eff values from the separate generations are not statistically independent, since the k-eff of a given generation is a function of the k-eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k-eff are needed
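
    The point about correlated generations can be illustrated numerically: the naive standard deviation of the mean treats each generation as independent, whereas grouping generations into batches before computing the spread gives a less optimistic estimate. The autoregressive stand-in below is an illustrative assumption, not output from any particular criticality code.

      # Naive vs. batch-means uncertainty for autocorrelated generation k-eff values.
      import numpy as np

      def naive_and_batch_sigma(keff, batch=20):
          n = len(keff)
          naive = keff.std(ddof=1) / np.sqrt(n)
          batch_means = keff[: n - n % batch].reshape(-1, batch).mean(axis=1)
          batched = batch_means.std(ddof=1) / np.sqrt(len(batch_means))
          return naive, batched

      rng = np.random.default_rng(0)
      k = np.empty(2000)
      k[0] = 1.0
      for i in range(1, 2000):       # AR(1) stand-in for generation-to-generation correlation
          k[i] = 1.0 + 0.8 * (k[i - 1] - 1.0) + rng.normal(0.0, 0.003)
      print(naive_and_batch_sigma(k))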

  13. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which comprises mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a steady state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  14. Codon usage and expression level of human mitochondrial 13 protein coding genes across six continents.

    Science.gov (United States)

    Chakraborty, Supriyo; Uddin, Arif; Mazumder, Tarikul Huda; Choudhury, Monisha Nath; Malakar, Arup Kumar; Paul, Prosenjit; Halder, Binata; Deka, Himangshu; Mazumder, Gulshana Akthar; Barbhuiya, Riazul Ahmed; Barbhuiya, Masuk Ahmed; Devi, Warepam Jesmi

    2017-12-02

    The study of codon usage coupled with phylogenetic analysis is an important tool to understand the genetic and evolutionary relationships of a gene. The 13 protein-coding genes of human mitochondria are involved in the electron transport chain for the generation of the energy currency (ATP). However, no work has yet been reported on the codon usage of the mitochondrial protein-coding genes across six continents. To understand the patterns of codon usage in mitochondrial genes across the six continents, we used bioinformatic analyses to analyze the protein-coding genes. The codon usage bias was low, as revealed by the high ENC values. Correlation between codon usage and GC3 suggested that all the codons ending in G/C were positively correlated with GC3, while the opposite held for A/T-ending codons, with the exception of the ND4L and ND5 genes. The neutrality plot revealed that for the genes ATP6, COI, COIII, CYB, ND4 and ND4L, natural selection might have played a major role, while mutation pressure might have played a dominant role in the codon usage bias of the ATP8, COII, ND1, ND2, ND3, ND5 and ND6 genes. Phylogenetic analysis indicated that the evolutionary relationships in each of the 13 protein-coding genes of human mitochondria were different across the six continents, and further suggested that geographical distance was an important factor in the origin and evolution of the 13 protein-coding genes of human mitochondria. Copyright © 2017 Elsevier B.V. and Mitochondria Research Society. All rights reserved.
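
    Two of the quantities used in such analyses can be computed directly from a coding sequence, as sketched below: GC3 (the G+C fraction at third codon positions) and GC12 (the average over first and second positions), whose regression across genes gives the neutrality plot. The example sequence is illustrative; ENC requires per-amino-acid codon homozygosity and is omitted here.

      # GC3 and GC12 for a coding sequence (sketch).
      def gc_fraction(chars):
          return sum(c in 'GC' for c in chars) / len(chars)

      def gc3_gc12(cds):
          cds = cds.upper()
          codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
          gc3 = gc_fraction([c[2] for c in codons])
          gc12 = (gc_fraction([c[0] for c in codons]) + gc_fraction([c[1] for c in codons])) / 2.0
          return gc3, gc12

      print(gc3_gc12("ATGGCCACTGTTCTCAAAGGGTAA"))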

  15. System control fuzzy neural sewage pumping stations using genetic algorithms

    Directory of Open Access Journals (Sweden)

    Владлен Николаевич Кузнецов

    2015-06-01

    A control system for sewage pumping stations with controllers based on a neural network with fuzzy logic is considered. Linguistic rules for the fuzzy logic controller that maintain the effluent level in the receiving tank within the prescribed limits are developed. The use of genetic algorithms for training the neural network is shown.

  16. CASKETSS: a computer code system for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1989-02-01

    A computer program, CASKETSS, has been developed for the thermal and structural analysis of nuclear fuel shipping casks. CASKETSS stands for CASK Evaluation code system for Thermal and Structural Safety. The main features of CASKETSS are as follows: (1) Thermal and structural analysis computer programs for one-, two- and three-dimensional geometries are contained in the code system. (2) Some of the computer programs in the code system have been programmed to provide near-optimal speed on vector-processing computers. (3) Data libraries for thermal and structural analysis are provided in the code system. (4) An input data generator is provided in the code system. (5) A graphic computer program is provided in the code system. In the paper, a brief illustration of the calculation method, input data and sample calculations is presented. (author)

  17. Development Perspective of Regulatory Audit Code System for SFR Nuclear Safety Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Moo Hoon; Lee, Gil Soo; Shin, An Dong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-05-15

    A sodium-cooled fast reactor (SFR) in Korea is based on the KALIMER-600 concept developed by KAERI. Based on the 'Long-term R and D Plan for Future Reactor Systems', which was approved by the Korea Atomic Energy Commission in 2008, the KAERI designer is scheduled to apply for design certification of the prototype SFR in 2017. In order to establish the regulatory infrastructure for the licensing of a prototype SFR, KINS has developed regulatory requirements for the demonstration SFR since 2010, and is scheduled to develop the regulatory audit code systems with regard to the core, fuel, and system, etc., beginning in 2012. In this study, the domestic code systems used for core design and safety evaluation of PWRs and the nuclear physics and code systems for SFRs were briefly reviewed, and the development perspective of a regulatory audit code system for SFR nuclear safety evaluation was derived

  18. Symbol synchronization in convolutionally coded systems

    Science.gov (United States)

    Baumert, L. D.; Mceliece, R. J.; Van Tilborg, H. C. A.

    1979-01-01

    Alternate symbol inversion is sometimes applied to the output of convolutional encoders to guarantee sufficient richness of symbol transition for the receiver symbol synchronizer. A bound is given for the length of the transition-free symbol stream in such systems, and those convolutional codes are characterized in which arbitrarily long transition free runs occur.
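
    The mechanism can be sketched in a few lines: every other channel symbol is inverted so that long constant runs, which would starve the symbol synchronizer of transitions, are broken up; the bound discussed in the abstract concerns how long a transition-free run can still survive this operation for a given convolutional code. The bit stream below is an illustrative assumption.

      # Alternate symbol inversion and the resulting longest transition-free run.
      def alternate_invert(symbols):
          return [s ^ (i & 1) for i, s in enumerate(symbols)]   # invert odd-indexed symbols

      def longest_transition_free_run(symbols):
          best = run = 1
          for a, b in zip(symbols, symbols[1:]):
              run = run + 1 if a == b else 1
              best = max(best, run)
          return best

      stream = [1] * 12 + [0, 1, 0, 0, 1]
      print(longest_transition_free_run(stream),
            longest_transition_free_run(alternate_invert(stream)))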

  19. Genic non-coding microsatellites in the rice genome: characterization, marker design and use in assessing genetic and evolutionary relationships among domesticated groups

    Directory of Open Access Journals (Sweden)

    Singh Nagendra

    2009-03-01

    Background: Completely sequenced plant genomes provide scope for designing a large number of microsatellite markers, which are useful in various aspects of crop breeding and genetic analysis. With the objective of developing genic but non-coding microsatellite (GNMS) markers for the rice (Oryza sativa L.) genome, we characterized the frequency and relative distribution of microsatellite repeat-motifs in 18,935 predicted protein coding genes including 14,308 putative promoter sequences. Results: We identified 19,555 perfect GNMS repeats with densities ranging from 306.7/Mb in chromosome 1 to 450/Mb in chromosome 12, with an average of 357.5 GNMS per Mb. The average microsatellite density was maximum in the 5' untranslated regions (UTRs), followed by those in introns, promoters and 3'UTRs, and minimum in the coding sequences (CDS). Primers were designed for 17,966 (92%) GNMS repeats, including 4,288 (94%) hypervariable class I types, which were bin-mapped on the rice genome. The GNMS markers were most polymorphic in the intronic region (73.3%), followed by markers in the promoter region (53.3%), and least in the CDS (26.6%). The robust polymerase chain reaction (PCR) amplification efficiency and high polymorphic potential of GNMS markers over genic coding and random genomic microsatellite markers suggest their immediate use in efficient genotyping applications in rice. A set of these markers could assess genetic diversity and establish phylogenetic relationships among domesticated rice cultivar groups. We also demonstrated the usefulness of orthologous and paralogous conserved non-coding microsatellite (CNMS) markers, identified in the putative rice promoter sequences, for comparative physical mapping and understanding of evolutionary and gene regulatory complexities among rice and other members of the grass family. The divergence between long-grained aromatics and subspecies japonica was estimated to be more recent (0.004 Mya) compared to short
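
    Detection of the perfect repeats counted here can be sketched with a short motif search: find runs where a 1-6 bp unit repeats at least a minimum number of times. The thresholds and the example sequence are illustrative assumptions, not the criteria used in the study.

      # Perfect microsatellite (SSR) finder sketch: returns (start position, motif, repeat count).
      import re

      def find_ssrs(sequence, min_unit=1, max_unit=6, min_repeats=4):
          sequence = sequence.upper()
          hits = []
          for unit in range(min_unit, max_unit + 1):
              pattern = re.compile(r'([ACGT]{%d})\1{%d,}' % (unit, min_repeats - 1))
              for match in pattern.finditer(sequence):
                  hits.append((match.start(), match.group(1), len(match.group(0)) // unit))
          return hits

      print(find_ssrs("ATGCACACACACGGTTTTTTAGAGAGAGAGC"))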

  20. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

    In this paper, we propose a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we investigate the related parameters such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively while considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80, transmitted at 622 Mbps. It is also demonstrated that the BER performance strongly depends on the code weight, especially with fewer users. As the code weight increases, the BER performance improves.