WorldWideScience

Sample records for standard genetic code

  1. Representation mutations from standard genetic codes

    Science.gov (United States)

    Aisah, I.; Suyudi, M.; Carnia, E.; Suhendi; Supriatna, A. K.

    2018-03-01

Graphs are widely used in everyday life, especially to describe and model problems concretely and clearly. Graphs are also used to facilitate the solution of various kinds of problems that are difficult to solve by direct calculation. In biology, a graph can be used to describe the process of protein synthesis from DNA. Proteins play an important role for DNA (deoxyribonucleic acid) and RNA (ribonucleic acid), and are composed of amino acids. In this study, amino acids are related to genetics, in particular to the genetic code. The genetic code is also known as the triplet or codon code, a three-letter arrangement of the DNA nitrogenous bases. The bases are adenine (A), thymine (T), guanine (G) and cytosine (C); in RNA, thymine (T) is replaced by uracil (U). The set of all nitrogenous bases in RNA is denoted by N = {C, U, A, G}. Codons operate during protein synthesis inside the cell, and they also encode the stop signal that marks the end of the protein synthesis process. This paper examines the process of protein synthesis through a mathematical study and presents it in three-dimensional space as a graph. The study begins by analysing the set of all codons, denoted NNN, in order to obtain geometric representations. At this stage a correspondence is set up between the set of all nitrogen bases N and Z2 × Z2: C = (0̄,0̄), U = (0̄,1̄), A = (1̄,0̄), G = (1̄,1̄). From this correspondence, algebraic structures such as groups, the Klein four-group and quotient groups are obtained. With the help of the GeoGebra software, the set of all codons NNN can be presented in three-dimensional space as a multicube and can also be represented as a graph, so that the relationships between the codons can easily be seen.
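
A minimal sketch (Python, not the authors' code) of the correspondence described above: each base is matched with an element of Z2 × Z2, each codon in NNN then becomes a point with three coordinates, and adjacency between codons can be defined on those points. The coordinate and adjacency conventions below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: map RNA bases to Z2 x Z2 as in the abstract --
# C=(0,0), U=(0,1), A=(1,0), G=(1,1) -- and place each codon at an integer
# point of a 3D "multicube" by reading each base's pair as a number 0..3.
from itertools import product

BASE_TO_Z2xZ2 = {"C": (0, 0), "U": (0, 1), "A": (1, 0), "G": (1, 1)}

def codon_to_point(codon):
    """Interpret each base's Z2 x Z2 pair as an integer 0..3 (one axis per codon position)."""
    return tuple(2 * b0 + b1 for b0, b1 in (BASE_TO_Z2xZ2[b] for b in codon))

codons = ["".join(c) for c in product("CUAG", repeat=3)]   # the 64 codons (NNN)
points = {c: codon_to_point(c) for c in codons}

# One possible adjacency convention: two codons are joined by an edge if their
# points differ in exactly one coordinate, and only by one step.
def adjacent(c1, c2):
    diffs = [abs(a - b) for a, b in zip(points[c1], points[c2])]
    return sorted(diffs) == [0, 0, 1]

print(points["AUG"], adjacent("AUG", "AUA"))
```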

  2. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
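
The key quantity in this abstract is the algebraic connectivity of a phenotypic graph, i.e. the second-smallest eigenvalue of the graph Laplacian. A minimal sketch of how that number is computed is shown below; the toy adjacency matrix stands in for a phenotypic graph, which is not reconstructed here.

```python
# Sketch only: the paper compares phenotypic graphs via their algebraic
# connectivity, the second-smallest eigenvalue of the graph Laplacian L = D - A.
# The toy graph below is illustrative, not one of the paper's phenotypic graphs.
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest Laplacian eigenvalue of an undirected graph."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eigenvalues = np.sort(np.linalg.eigvalsh(L))
    return eigenvalues[1]

# A 4-cycle as a stand-in graph: vertices 0-1-2-3-0.
cycle4 = [[0, 1, 0, 1],
          [1, 0, 1, 0],
          [0, 1, 0, 1],
          [1, 0, 1, 0]]
print(algebraic_connectivity(cycle4))   # 2.0 for the 4-cycle
```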

  3. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    Science.gov (United States)

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code, which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA code types I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or via the latter (type II), respectively. Biologically, the Extended RNA code type I consists of all codons of the type RNY plus the codons obtained by considering the RNA code in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II comprises all codons of the type RNY plus the codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR type) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integer numbers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or to the stop signal is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of the existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.
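
A small sketch (our own illustration, assuming the frame conventions implied by the abstract) that enumerates the 16 RNY codons and the codon sets contributed by the second (NYR) and third (YRN) reading frames, i.e. the ingredients of the Extended RNA code type I:

```python
# Sketch: enumerate the 16 RNY codons (purine, any base, pyrimidine) and the
# codon sets obtained by reading RNY messages in the second and third frames
# (NYR and YRN), which together with RNY form the Extended RNA code type I.
from itertools import product

PURINES, PYRIMIDINES, BASES = "AG", "CU", "ACGU"

RNY = {r + n + y for r, n, y in product(PURINES, BASES, PYRIMIDINES)}
NYR = {n + y + r for n, y, r in product(BASES, PYRIMIDINES, PURINES)}
YRN = {y + r + n for y, r, n in product(PYRIMIDINES, PURINES, BASES)}

extended_type_I = RNY | NYR | YRN
print(len(RNY), len(extended_type_I))   # 16 RNY codons; the three sets are disjoint
```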

  4. The standard genetic code and its relation to mutational pressure: robustness and equilibrium criteria

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Martinez Ortiz, Carlos; Sautie Castellanos, Miguel; Valdes, Kiria; Guevara Erra, Ramon

    2004-10-01

Under the assumption of uniform (even) point mutation pressure on the DNA strand, rates for transitions from one amino acid into another were assessed. Nearly 25% of all mutations were silent. About 48% of the mutations from a given amino acid led either into the same amino acid or into an amino acid of the same class. These results suggest a great stability of the Standard Genetic Code with respect to mutation load. Concepts from chemical equilibrium theory are applicable to this case provided that mutation rate constants are given. It was found that unequal synonymous codon usage may lead to changes in the equilibrium concentrations. Data from real biological species showed that several amino acids are close to their respective equilibrium concentrations. However, in all cases the concentration of leucine was nearly double its equilibrium concentration, whereas that of the stop signal (Term) was about 10 times lower. The overall distance from equilibrium for a set of species suggests that eukaryotes are closer to equilibrium than prokaryotes, and HIV was closest to equilibrium among the 15 species examined. We also found that contemporary species are closer to equilibrium than the Last Universal Common Ancestor (LUCA) was. Similarly, non-conserved regions in proteins are closer to equilibrium than conserved ones. We suggest that this approach can be useful for exploring some aspects of biological evolution in the framework of the properties of the Standard Genetic Code. (author)
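
The roughly 25% silent-mutation figure can be reproduced with a simple enumeration over the standard genetic code, assuming all single-nucleotide substitutions are equally likely (the "even" mutation pressure of the abstract). The sketch below is an independent back-of-the-envelope check, not the authors' method.

```python
# Back-of-the-envelope check of the "~25% silent" figure: enumerate every
# single-nucleotide substitution in every codon of the standard genetic code,
# assuming all substitutions are equally likely, and count how many leave the
# encoded amino acid (or stop signal) unchanged.
from itertools import product

BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

silent = total = 0
for codon, aa in CODE.items():
    for pos in range(3):
        for base in BASES:
            if base == codon[pos]:
                continue
            mutant = codon[:pos] + base + codon[pos + 1:]
            total += 1
            silent += (CODE[mutant] == aa)

print(f"{silent}/{total} = {silent / total:.1%} of point mutations are silent")
```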

  5. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

Radiography, like other techniques, needs standards. These standards are widely used and the methods for applying them are well established; consequently, radiographic testing may be practised only on the basis of the regulations that have been set down and documented. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one (basic) radiographer can carry out radiographic work only on the basis of instructions given by a level-two or level-three radiographer. These instructions are produced from the guidelines set out in the documents, and a level-two radiographer must follow the specifications given in the standard when writing them. From this scenario it is clear that radiography is a type of work in which everything must follow the rules. As for codes, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code existing in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography work must automatically follow the regulated rules and standards.

  6. Evolutionary implications of genetic code deviations

    International Nuclear Information System (INIS)

    Chela Flores, J.

    1986-07-01

    By extending the standard genetic code into a temperature dependent regime, we propose a train of molecular events leading to alternative coding. The first few examples of these deviations have already been reported in some ciliated protozoans and Gram positive bacteria. A possible range of further alternative coding, still within the context of universality, is pointed out. (author)

  7. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  8. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  9. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, that should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group and feedback from the collaboration was taken into account in the "current" version.

  10. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

This article presents several hardware architectures for the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations and a working implementation. Much attention has been paid to optimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
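
For orientation, the sketch below builds a generic Huffman code from symbol frequencies. Note this is only an illustration of the principle: the AAC standard works with a set of predefined Huffman codebooks rather than codes built on the fly from the data.

```python
# Generic Huffman code construction (illustration only -- the AAC standard uses
# fixed, predefined Huffman codebooks rather than trees derived from the input).
import heapq
from collections import Counter

def huffman_code(symbols):
    """Return a prefix-free bit-string code built from symbol frequencies."""
    counts = Counter(symbols)
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far})
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol input
        return {next(iter(counts)): "0"}
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, i, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
    return heap[0][2]

code = huffman_code("abracadabra")
encoded = "".join(code[s] for s in "abracadabra")
print(code, len(encoded), "bits")
```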

  11. Genetic coding and gene expression - new Quadruplet genetic coding model

    Science.gov (United States)

    Shankar Singh, Rama

    2012-07-01

The successful completion of the Human Genome Project has opened the door not only to developing personalized medicine and cures for genetic diseases, but may also help answer the complex and difficult question of the origin of life, and may make the 21st century a century of the biological sciences as well. According to the central dogma of biology, genetic codons, in conjunction with tRNA, play a key role in translating the RNA bases into a sequence of amino acids, leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and for curing genetic diseases. So far, only triplet codons, involving three bases of RNA transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation processes used will be the same, but the quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new quadruplet genetic coding model and its subsequent potential applications, including its relevance to the origin of life, will be presented.

  12. Computation of the Genetic Code

    Science.gov (United States)

    Kozlov, Nicolay N.; Kozlova, Olga N.

    2018-03-01

One of the problems in the development of a mathematical theory of the genetic code (a summary is presented in [1], the details in [2]) is the problem of the computation of the genetic code. Similar problems are unknown elsewhere and could only be posed in the 21st century. This work is devoted to one approach to solving this problem. For the first time, a detailed description is given of the method for computing the genetic code, the idea of which was first published earlier [3]; the choice of one of the most important sets for the computation was based on [4]. Such a set of amino acids corresponds to a complete set of representations of the set of overlapping triple genes belonging to the same DNA strand. A separate issue was the choice of the initial point that triggers the iterative search over all codes consistent with the initial data. Mathematical analysis has shown that the said set contains some ambiguities, which were found thanks to our proposed compressed representation of the set. As a result, the developed method of computation was limited to two main stages of research, where at the first stage only part of the area was used in the calculations. The proposed approach will significantly reduce the amount of computation at each step in this complex discrete structure.

  13. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  14. Building Standards and Codes for Energy Conservation

    Science.gov (United States)

    Gross, James G.; Pierlert, James H.

    1977-01-01

    Current activity intended to lead to energy conservation measures in building codes and standards is reviewed by members of the Office of Building Standards and Codes Services of the National Bureau of Standards. For journal availability see HE 508 931. (LBH)

  15. Nondestructive testing standards and the ASME code

    International Nuclear Information System (INIS)

    Spanner, J.C.

    1991-04-01

Nondestructive testing (NDT) requirements and standards are an important part of the ASME Boiler and Pressure Vessel Code. In this paper, the evolution of these requirements and standards is reviewed in the context of the unique technical and legal stature of the ASME Code. The coherent and consistent manner in which the ASME Code rules are organized is described, and the interrelationship between the various ASME Code sections, the piping codes, and the ASTM standards is discussed. Significant changes occurred in ASME Sections V and XI during the 1980s, and these are highlighted along with projections and comments regarding future trends and changes in these important documents. 4 refs., 8 tabs

  16. Codes and Standards Technical Team Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None

    2013-06-01

    The Hydrogen Codes and Standards Tech Team (CSTT) mission is to enable and facilitate the appropriate research, development, & demonstration (RD&D) for the development of safe, performance-based defensible technical codes and standards that support the technology readiness and are appropriate for widespread consumer use of fuel cells and hydrogen-based technologies with commercialization by 2020. Therefore, it is important that the necessary codes and standards be in place no later than 2015.

  17. Electrical, instrumentation, and control codes and standards

    International Nuclear Information System (INIS)

    Kranning, A.N.

    1978-01-01

    During recent years numerous documents in the form of codes and standards have been developed and published to provide design, fabrication and construction rules and criteria applicable to instrumentation, control and power distribution facilities for nuclear power plants. The contents of this LTR were prepared by NUS Corporation under Subcontract K5108 and provide a consolidated index and listing of the documents selected for their application to procurement of materials and design of modifications and new construction at the LOFT facility. These codes and standards should be applied together with the National Electrical Code, the ID Engineering Standards and LOFT Specifications to all LOFT instrument and electrical design activities

  18. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-09-14

This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack (www.github.com/lanl/exactpack).

  19. Recent activities on nuclear codes and standards

    International Nuclear Information System (INIS)

    Minematsu, Akiyoshi; Ishimoto, Shozaburo; Honjin, Masao

    2000-01-01

The technical codes and standards for nuclear power stations in Japan take the form of laws (ministerial ordinances and bulletins) issued by the government, compliance with which is obligatory under 'the Law concerning the Regulations of Nuclear Material Substances, Nuclear Fuel Substances and Nuclear Reactors' and 'the Electricity Business Act', and of guides defined by the Nuclear Safety Commission; in addition, some private-sector standards have been issued, with national endorsement, to complement these laws and guides. On the other hand, in the fields of electricity and heat facilities other than atomic energy, the national technical codes and standards were recently simplified and converted to performance-based provisions, whereby a system allowing the use of private-sector standards from inside and outside Japan was established through approval by the private Japan Electrotechnical Standards and Codes Committee (JESC). Since the nuclear field was excluded from this simultaneous transfer to private standards and the performance-based system, a similar transfer is expected in the future where possible, and the preparation of private standards is now being advanced. This paper introduces the present state of the technical codes and standards relating to nuclear power generation facilities and recent trends in their transfer to private standards. (G.K.)

  20. The Search for Symmetries in the Genetic Code:

    Science.gov (United States)

    Antoneli, Fernando; Forger, Michael; Hornos, José Eduardo M.

    We give a full classification of the possible schemes for obtaining the distribution of multiplets observed in the standard genetic code by symmetry breaking in the context of finite groups, based on an extended notion of partial symmetry breaking that incorporates the intuitive idea of "freezing" first proposed by Francis Crick, which is given a precise mathematical meaning.

  1. The Genetic Code: Yesterday, Today and Tomorrow

    Indian Academy of Sciences (India)

Resonance – Journal of Science Education, Volume 17, Issue 12, December 2012, pp. 1136-1142. The Genetic Code: Yesterday, Today and Tomorrow. Jiqiang Ling and Dieter Söll. General Article.

  2. Rulemaking efforts on codes and standards

    International Nuclear Information System (INIS)

    Millman, G.C.

    1992-01-01

Section 50.55a of the NRC regulations provides a mechanism for incorporating national codes and standards into the regulatory process. It incorporates by reference the ASME Boiler and Pressure Vessel Code (ASME B&PV Code) Section III rules for construction and Section XI rules for inservice inspection and inservice testing. The regulation is periodically amended to update these references. The rulemaking process, as applied to Section 50.55a amendments, is reviewed to familiarize users with the associated internal activities of the NRC staff and the manner in which public comments are integrated into the process. The four ongoing rulemaking actions that would individually amend Section 50.55a are summarized. Two of the actions would directly impact requirements for inservice testing. Benefits accrued with NRC endorsement of the ASME B&PV Code, and possible future endorsement of the ASME Operations and Maintenance Code (ASME OM Code), are identified. Emphasis is placed on the need for code writing committees to be especially sensitive to user feedback on code rules incorporated into the regulatory process to ensure that the rules are complete, technically accurate, clear, practical, and enforceable.

  3. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: a discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  4. Canadian energy standards : residential energy code requirements

    Energy Technology Data Exchange (ETDEWEB)

Cooper, K. [SAR Engineering Ltd., Burnaby, BC (Canada)]

    2006-09-15

A survey of residential energy code requirements was discussed. New housing is approximately 13 per cent more efficient than housing built 15 years ago, and more stringent energy efficiency requirements in building codes have contributed to decreased energy use and greenhouse gas (GHG) emissions. However, a survey of residential energy codes across Canada has determined that explicit demands for energy efficiency are currently only present in British Columbia (BC), Manitoba, Ontario and Quebec. The survey evaluated more than 4300 single-detached homes built between 2000 and 2005 using data from the EnerGuide for Houses (EGH) database. House area, volume, airtightness and construction characteristics were reviewed to create archetypes for 8 geographic areas. The survey indicated that in Quebec and the Maritimes, 90 per cent of houses comply with ventilation system requirements of the National Building Code, while compliance in the rest of Canada is much lower. Heat recovery ventilation use is predominant in the Atlantic provinces. Direct-vent or condensing furnaces constitute the majority of installed systems in provinces where natural gas is the primary space heating fuel. Details of insulation levels for walls, double-glazed windows, and building code insulation standards were also reviewed. It was concluded that if R-2000 levels of energy efficiency were applied, total average energy consumption would be reduced by 36 per cent in Canada. 2 tabs.

  5. Data Representation, Coding, and Communication Standards.

    Science.gov (United States)

    Amin, Milon; Dhir, Rajiv

    2015-06-01

    The immense volume of cases signed out by surgical pathologists on a daily basis gives little time to think about exactly how data are stored. An understanding of the basics of data representation has implications that affect a pathologist's daily practice. This article covers the basics of data representation and its importance in the design of electronic medical record systems. Coding in surgical pathology is also discussed. Finally, a summary of communication standards in surgical pathology is presented, including suggested resources that establish standards for select aspects of pathology reporting. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
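
One of the properties GCAT can test is comma-freeness. A minimal stand-alone sketch of such a test (our own implementation of the textbook definition, not GCAT's code) is shown below: a set of codons X is comma-free if no codon of X appears straddling the junction of any two concatenated codons of X.

```python
# Sketch of a comma-freeness test like the one GCAT provides (our own minimal
# implementation, not GCAT's): a set of codons X is comma-free if no codon of X
# occurs straddling the junction of any two concatenated codons of X.
def is_comma_free(codons):
    codon_set = set(codons)
    for a in codon_set:
        for b in codon_set:
            pair = a + b
            # the two out-of-frame triplets of the 6-letter concatenation
            if pair[1:4] in codon_set or pair[2:5] in codon_set:
                return False
    return True

print(is_comma_free({"ACG", "TCG"}))   # True: no out-of-frame occurrences
print(is_comma_free({"AAA"}))          # False: AAAAAA contains AAA out of frame
```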

  7. On coding genotypes for genetic markers with multiple alleles in genetic association study of quantitative traits

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2011-09-01

Abstract. Background: In genetic association study of quantitative traits using F∞ models, how to code the marker genotypes and interpret the model parameters appropriately is important for constructing hypothesis tests and making statistical inferences. Currently, the coding of marker genotypes in building F∞ models has mainly focused on the biallelic case. A thorough work on the coding of marker genotypes and interpretation of model parameters for F∞ models is needed especially for genetic markers with multiple alleles. Results: In this study, we will formulate F∞ genetic models under various regression model frameworks and introduce three genotype coding schemes for genetic markers with multiple alleles. Starting from an allele-based modeling strategy, we first describe a regression framework to model the expected genotypic values at given markers. Then, as extension from the biallelic case, we introduce three coding schemes for constructing fully parameterized one-locus F∞ models and discuss the relationships between the model parameters and the expected genotypic values. Next, under a simplified modeling framework for the expected genotypic values, we consider several reduced one-locus F∞ models from the three coding schemes on the estimability and interpretation of their model parameters. Finally, we explore some extensions of the one-locus F∞ models to two loci. Several fully parameterized as well as reduced two-locus F∞ models are addressed. Conclusions: The genotype coding schemes provide different ways to construct F∞ models for association testing of multi-allele genetic markers with quantitative traits. Which coding scheme should be applied depends on how convenient it can provide the statistical inferences on the parameters of our research interests. Based on these F∞ models, the standard regression model fitting tools can be used to estimate and test for various genetic effects through statistical contrasts with the
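
As a point of reference for the coding question discussed above, the sketch below shows the familiar biallelic F∞ coding (additive effect coded 1/0/-1, dominance coded 0/1/0) and a least-squares fit; it does not reproduce the multi-allele coding schemes introduced in the paper, and the genotype and trait values are made up.

```python
# Minimal reference sketch of the familiar biallelic F-infinity coding
# (additive coded 1/0/-1, dominance coded 0/1/0) -- not the multi-allele
# coding schemes introduced in the paper. Data below are made up.
import numpy as np

# genotypes coded by the count of allele A1: 2 = A1A1, 1 = A1A2, 0 = A2A2
genotype_counts = np.array([2, 1, 0, 1, 2, 0, 1, 1])
trait = np.array([10.1, 9.2, 7.9, 9.0, 10.4, 8.1, 9.3, 8.8])   # hypothetical quantitative trait

additive = genotype_counts - 1.0                  # 1, 0, -1
dominance = (genotype_counts == 1).astype(float)  # 0, 1, 0
X = np.column_stack([np.ones_like(additive), additive, dominance])

# Least-squares estimates of the intercept, additive effect a, dominance effect d.
coef, *_ = np.linalg.lstsq(X, trait, rcond=None)
print(dict(zip(["mu", "a", "d"], np.round(coef, 3))))
```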

  8. Globalization of ASME Nuclear Codes and Standards

    International Nuclear Information System (INIS)

    Swayne, Rick; Erler, Bryan A.

    2006-01-01

    With the globalization of the nuclear industry, it is clear that the reactor suppliers are based in many countries around the world (such as United States, France, Japan, Canada, South Korea, South Africa) and they will be marketing their reactors to many countries around the world (such as US, China, South Korea, France, Canada, Finland, Taiwan). They will also be fabricating their components in many different countries around the world. With this situation, it is clear that the requirements of ASME Nuclear Codes and Standards need to be adjusted to accommodate the regulations, fabricating processes, and technology of various countries around the world. It is also very important for the American Society of Mechanical Engineers (ASME) to be able to assure that products meeting the applicable ASME Code requirements will provide the same level of safety and quality assurance as those products currently fabricated under the ASME accreditation process. To do this, many countries are in the process of establishing or changing their regulations, and it is important for ASME to interface with the appropriate organizations in those countries, in order to ensure there is effective use of ASME Codes and standards around the world. (authors)

  9. HOW TO REPRESENT THE GENETIC CODE?

    Directory of Open Access Journals (Sweden)

    N.S. Santos-Magalhães

    2004-05-01

The advent of molecular genetics comprises a true revolution of far-reaching consequences for humankind, which has evolved into a specialized branch of modern-day biochemistry. The analysis of specific genomic information is gaining wide-ranging interest because of its significance to the early diagnosis of disease and the discovery of modern drugs. In order to take advantage of a wide assortment of signal processing (SP) algorithms, the primary step of modern genomic SP involves converting symbolic DNA sequences into complex-valued signals. How to represent the genetic code? Despite being extensively known, the DNA mapping into proteins is one of the relevant discoveries of genetics. The genetic code (GC) is revisited in this work, addressing other descriptions of it which can be worthy for genomic SP. Three original representations are discussed. The inner-to-outer map builds on the unbalanced role of the nucleotides of a codon. A two-dimensional Gray genetic representation is offered as a structured map that can help in interpreting DNA spectrograms or scalograms, which are among the powerful visual tools for genome analysis and which depend on the choice of the genetic mapping. Finally, the world-chart for the GC is investigated. Evoking the cyclic structure of the genetic mapping, it can be folded by joining the left-right borders and the top-bottom frontiers. As a result, the GC can be drawn on the surface of a sphere resembling a world map. Eight parallels of latitude are required (four in each hemisphere) as well as four meridians of longitude associated with four corresponding anti-meridians. The tropic circles are at 11.25°, 33.75°, 56.25°, and 78.75° (North and South). Starting from an arbitrary Greenwich meridian, the meridians of longitude can be plotted at 22.5°, 67.5°, 112.5°, and 157.5° (East and West). Each triplet is assigned to a single point on the surface, which we have named Nirenberg-Kohama's Earth. Despite being valuable, usual representations for the GC can be

  10. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  11. Alternative Fuels Data Center: Codes and Standards Resources

    Science.gov (United States)

resources linked below help project developers and code officials prepare and review code-compliant projects, storage, and infrastructure. The following charts show the SDOs responsible for these alternative fuel codes and standards. Biodiesel Vehicle and Infrastructure Codes and Standards Chart Electric Vehicle and

  12. Review of codes, standards, and regulations for natural gas locomotives.

    Science.gov (United States)

    2014-06-01

This report identified, collected, and summarized relevant international codes, standards, and regulations with potential applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  13. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments.

    Science.gov (United States)

    Santos, José; Monteagudo, Angel

    2011-02-21

    As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid for another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Here we used a genetic algorithm to search for better adapted hypothetical codes and as a method to guess the difficulty in finding such alternative codes, allowing to clearly situate the canonical code in the fitness landscape. This novel proposal of the use of evolutionary computing provides a new perspective in the open debate between the use of the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment and the model most used in the two approaches which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of the codon reassignments was used, the evolutionary algorithm had more difficulty to overcome the efficiency of the canonical genetic code. Simulated evolution clearly reveals that the canonical genetic code is far from optimal regarding its optimization. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the fact that the best possible codes show the patterns of the
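
A compact sketch of the "statistical approach" mentioned above: score a code by the mean squared change of an amino-acid property over all single-nucleotide substitutions and compare the canonical code with randomly shuffled codes. This is our own illustration, not the authors' genetic algorithm or their codon-reassignment model, and the property scale is a placeholder (substitute a real scale such as Woese's polar requirement for a meaningful analysis).

```python
# Sketch of the "statistical approach" (not the authors' code): score a genetic
# code by the mean squared change of an amino-acid property over all single-
# nucleotide substitutions, then compare the canonical code against codes with
# amino acids shuffled among the synonymous blocks. PROPERTY is a placeholder
# scale; substitute e.g. Woese's polar requirement values for a real analysis.
import random
from itertools import product

BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CANONICAL = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AA)}
AMINO_ACIDS = sorted(set(AA) - {"*"})
PROPERTY = {a: float(i) for i, a in enumerate(AMINO_ACIDS)}   # placeholder values

def code_error(code):
    """Mean squared property change over all single-nucleotide substitutions."""
    diffs = []
    for codon, aa in code.items():
        if aa == "*":
            continue
        for pos in range(3):
            for b in BASES:
                if b == codon[pos]:
                    continue
                aa2 = code[codon[:pos] + b + codon[pos + 1:]]
                if aa2 != "*":
                    diffs.append((PROPERTY[aa] - PROPERTY[aa2]) ** 2)
    return sum(diffs) / len(diffs)

def shuffled_code():
    """Permute amino acids among the canonical synonymous blocks (stop codons fixed)."""
    relabel = dict(zip(AMINO_ACIDS, random.sample(AMINO_ACIDS, len(AMINO_ACIDS))))
    return {codon: relabel.get(aa, aa) for codon, aa in CANONICAL.items()}

canonical_score = code_error(CANONICAL)
random_scores = [code_error(shuffled_code()) for _ in range(200)]
better = sum(score <= canonical_score for score in random_scores)
print(f"{better} of 200 block-shuffled codes are at least as robust as the canonical code")
```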

  14. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments

    Directory of Open Access Journals (Sweden)

    Monteagudo Ángel

    2011-02-01

Abstract. Background: As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid for another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Results: Here we used a genetic algorithm to search for better adapted hypothetical codes and as a method to guess the difficulty in finding such alternative codes, allowing to clearly situate the canonical code in the fitness landscape. This novel proposal of the use of evolutionary computing provides a new perspective in the open debate between the use of the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment and the model most used in the two approaches which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of the codon reassignments was used, the evolutionary algorithm had more difficulty to overcome the efficiency of the canonical genetic code. Conclusions: Simulated evolution clearly reveals that the canonical genetic code is far from optimal regarding its optimization. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the

  15. Flexibility of the genetic code with respect to DNA structure

    DEFF Research Database (Denmark)

    Baisnée, P. F.; Baldi, Pierre; Brunak, Søren

    2001-01-01

Motivation. The primary function of DNA is to carry genetic information through the genetic code. DNA, however, contains a variety of other signals related, for instance, to reading frame, codon bias, pairwise codon bias, splice sites and transcription regulation, nucleosome positioning and DNA structure. Here we study the relationship between the genetic code and DNA structure and address two questions. First, to which degree does the degeneracy of the genetic code and the acceptable amino acid substitution patterns allow for the superimposition of DNA structural signals to protein coding sequences? Second, is the origin or evolution of the genetic code likely to have been constrained by DNA structure? Results. We develop an index for code flexibility with respect to DNA structure. Using five different di- or tri-nucleotide models of sequence-dependent DNA structure, we show...

  16. Review and comparison of WWER and LWR Codes and Standards

    International Nuclear Information System (INIS)

    Buckthorpe, D.; Tashkinov, A.; Brynda, J.; Davies, L.M.; Cueto-Felgeueroso, C.; Detroux, P.; Bieniussa, K.; Guinovart, J.

    2003-01-01

The results of work on a collaborative project comparing the Codes and Standards used for safety-related components of WWER and LWR type reactors are presented. This work was performed on behalf of the European Commission, Working Group Codes and Standards, and considers areas such as rules, criteria and provisions, failure mechanisms, the derivation of and understanding behind the fatigue curves, piping, materials and ageing, manufacturing and ISI. WWERs are essentially designed and constructed using the Russian PNAE Code, together with special provisions in a few countries (e.g. the Czech Republic) from national standards. The LWR Codes have a strong dependence on the ASME Code; within Western Europe other codes are also used, including RCC-M, KTA and British Standards. A comparison of the procedures used in all these codes and standards has been made to investigate the potential for equivalences between the codes and any grounds for future cooperation between eastern and western experts in this field. (author)

  17. A multiobjective approach to the genetic code adaptability problem.

    Science.gov (United States)

    de Oliveira, Lariza Laura; de Oliveira, Paulo S L; Tinós, Renato

    2015-02-19

The organization of the canonical code has intrigued researchers since it was first described. If we consider all codes mapping the 64 codons into 20 amino acids and one stop codon, there are more than 1.51×10^84 possible genetic codes. The main question related to the organization of the genetic code is why exactly the canonical code was selected from this huge number of possible genetic codes. Many researchers argue that the organization of the canonical code is a product of natural selection and that the code's robustness against mutations supports this hypothesis. In order to investigate the natural selection hypothesis, some researchers employ optimization algorithms to identify regions of the genetic code space where the best codes, according to a given evaluation function, can be found (the engineering approach). The optimization process uses only one objective to evaluate the codes, generally based on the robustness for an amino acid property. Only one objective is also employed in the statistical approach for the comparison of the canonical code with random codes. We propose a multiobjective approach where two or more objectives are considered simultaneously to evaluate the genetic codes. In order to test our hypothesis that the multiobjective approach is useful for the analysis of the genetic code adaptability, we implemented a multiobjective optimization algorithm where two objectives are simultaneously optimized. Using as objectives the robustness against mutation with respect to the amino acid property polar requirement (objective 1) and the robustness with respect to hydropathy index or molecular volume (objective 2), we found solutions closer to the canonical genetic code in terms of robustness, when compared with the results using only one objective reported by other authors. Using more objectives, more optimal solutions are obtained and, as a consequence, more information can be used to investigate the adaptability of the genetic code. The multiobjective approach
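
The multiobjective idea reduces, at its simplest, to keeping the set of candidate codes that are not dominated in both objectives. A minimal Pareto-front sketch is given below; the two-objective scores are made-up placeholders, not values from the paper.

```python
# Minimal sketch of the multiobjective idea: keep only the Pareto-optimal
# (non-dominated) candidate codes under two objectives, both to be minimized
# (e.g. robustness error w.r.t. polar requirement and w.r.t. molecular volume).
# The scores below are made-up placeholders, not values from the paper.
def pareto_front(scores):
    """Return indices of candidates not dominated by any other candidate."""
    front = []
    for i, (a1, b1) in enumerate(scores):
        dominated = any(a2 <= a1 and b2 <= b1 and (a2 < a1 or b2 < b1)
                        for j, (a2, b2) in enumerate(scores) if j != i)
        if not dominated:
            front.append(i)
    return front

candidate_scores = [(3.2, 5.1), (2.8, 6.0), (3.0, 4.9), (4.1, 4.8), (2.9, 5.0)]
print(pareto_front(candidate_scores))
```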

  18. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of that 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and had inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries, have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is, thus, posed: Has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes.' And if so, how should this be accomplished?

  19. National Society of Genetic Counselors Code of Ethics.

    Science.gov (United States)

    2018-02-01

    This document is the revised Code of Ethics of the National Society of Genetic Counselors (NSGC) that was adopted in April 2017 after majority vote of the full membership of the NSGC. The explication of the revisions is published in this volume of the Journal of Genetic Counseling. This is the fourth revision to the Code of Ethics since its original adoption in 1992.

  20. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results are an application of previous work about the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, in this work we find that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it could also have been for a primigenious doublet code. We also study the integer partitions associated with patterns of synonymous codons and show, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
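
The combinatorial backbone of this analysis is the Stirling number of the second kind, S(n, k), which counts the partitions of an n-element set into k non-empty blocks. A short sketch using the standard recurrence is shown below; the choice of 60 sense codons (four stop codons) follows the mitochondrial-style case discussed in the abstract.

```python
# Sketch of the combinatorial machinery the abstract relies on: Stirling numbers
# of the second kind S(n, k) count the partitions of an n-element set into k
# non-empty blocks. With four stop codons there are 60 sense codons, and under
# a uniform random partition the most probable number of synonymous blocks is
# the k that maximizes S(60, k).
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """S(n, k) via the recurrence S(n, k) = k*S(n-1, k) + S(n-1, k-1)."""
    if k == 0:
        return 1 if n == 0 else 0
    if k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

n_sense_codons = 60
most_probable_k = max(range(1, n_sense_codons + 1),
                      key=lambda k: stirling2(n_sense_codons, k))
print("most probable number of synonymous blocks:", most_probable_k)
```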

  1. C++ Coding Standards 101 Rules, Guidelines, and Best Practices

    CERN Document Server

    Sutter, Herb

    2005-01-01

    Consistent, high-quality coding standards improve software quality, reduce time-to-market, promote teamwork, eliminate time wasted on inconsequential matters, and simplify maintenance. Now, two of the world's most respected C++ experts distill the rich collective experience of the global C++ community into a set of coding standards that every developer and development team can understand and use as a basis for their own coding standards.

  2. Running code as part of an open standards policy

    OpenAIRE

    Shah, Rajiv; Kesan, Jay

    2009-01-01

    Governments around the world are considering implementing or even mandating open standards policies. They believe these policies will provide economic, socio-political, and technical benefits. In this article, we analyze the failure of the Massachusetts’s open standards policy as applied to document formats. We argue it failed due to the lack of running code. Running code refers to multiple independent, interoperable implementations of an open standard. With running code, users have choice ...

  3. The coevolution of genes and genetic codes: Crick's frozen accident revisited.

    Science.gov (United States)

    Sella, Guy; Ardell, David H

    2006-09-01

    The standard genetic code is the nearly universal system for the translation of genes into proteins. The code exhibits two salient structural characteristics: it possesses a distinct organization that makes it extremely robust to errors in replication and translation, and it is highly redundant. The origin of these properties has intrigued researchers since the code was first discovered. One suggestion, which is the subject of this review, is that the code's organization is the outcome of the coevolution of genes and genetic codes. In 1968, Francis Crick explored the possible implications of coevolution at different stages of code evolution. Although he argues that coevolution was likely to influence the evolution of the code, he concludes that it falls short of explaining the organization of the code we see today. The recent application of mathematical modeling to study the effects of errors on the course of coevolution, suggests a different conclusion. It shows that coevolution readily generates genetic codes that are highly redundant and similar in their error-correcting organization to the standard code. We review this recent work and suggest that further affirmation of the role of coevolution can be attained by investigating the extent to which the outcome of coevolution is robust to other influences that were present during the evolution of the code.

  4. Telemetry Standards, RCC Standard 106-17, Chapter 4, Pulse Code Modulation Standards

    Science.gov (United States)

    2017-07-01

Excerpts from Appendix 4-B (Citations) of the standard: J. L. Maury, Jr. and J. Styles, "Development of Optimum Frame Synchronization Codes for Goddard Space Flight Center..."; Telemetry Standards, RCC Standard 106-17, Chapter 4, July 2017; Aeronautical Radio, Inc., Mark 33 Digital Information Transfer.

  5. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (ser-tRNA(CAG)). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.

  6. Quality assurance requirements in various codes and standards

    International Nuclear Information System (INIS)

    Shaaban, H.I.; EL-Sayed, A.; Aly, A.E.

    1987-01-01

    The quality assurance requirements in various countries and according to various international codes and standards are presented, compared and critically discussed. Cases of developing countries are also discussed, and the use of IAEA code of practice and other codes for quality assurance in these countries is reviewed. Recommendations are made regarding the quality assurance system to be applied for Egypt's nuclear power plants

  7. Grid Standards and Codes | Grid Modernization | NREL

    Science.gov (United States)

    photovoltaics (PV) adoption have given a sense of urgency to the standards development process. The Accelerating Systems Integration Standards team is addressing this urgency by providing leadership and direction for

  8. Lossless Coding Standards for Space Data Systems

    Science.gov (United States)

    Rice, R. F.

    1996-01-01

    The International Consultative Committee for Space Data Systems (CCSDS) is preparing to issue its first recommendation for a digital data compression standard. Because the space data systems of primary interest are employed to support scientific investigations requiring accurate representation, this initial standard will be restricted to lossless compression.

  9. Critical roles for a genetic code alteration in the evolution of the genus Candida.

    Science.gov (United States)

    Silva, Raquel M; Paredes, João A; Moura, Gabriela R; Manadas, Bruno; Lima-Costa, Tatiana; Rocha, Rita; Miranda, Isabel; Gomes, Ana C; Koerkamp, Marian J G; Perrot, Michel; Holstege, Frank C P; Boucherie, Hélian; Santos, Manuel A S

    2007-10-31

    During the last 30 years, several alterations to the standard genetic code have been discovered in various bacterial and eukaryotic species. Sense and nonsense codons have been reassigned or reprogrammed to expand the genetic code to selenocysteine and pyrrolysine. These discoveries highlight unexpected flexibility in the genetic code, but do not elucidate how the organisms survived the proteome chaos generated by codon identity redefinition. In order to shed new light on this question, we have reconstructed a Candida genetic code alteration in Saccharomyces cerevisiae and used a combination of DNA microarrays, proteomics and genetics approaches to evaluate its impact on gene expression, adaptation and sexual reproduction. This genetic manipulation blocked mating, locked yeast in a diploid state, remodelled gene expression and created stress cross-protection that generated adaptive advantages under environmental challenging conditions. This study highlights unanticipated roles for codon identity redefinition during the evolution of the genus Candida, and strongly suggests that genetic code alterations create genetic barriers that speed up speciation.

  10. Codes and standards and other guidance cited in regulatory documents

    International Nuclear Information System (INIS)

    Nickolaus, J.R.; Bohlander, K.L.

    1996-08-01

    As part of the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program (SRP-UDP), Pacific Northwest National Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. The SRP-UDP has been completed and the SRP-Maintenance Program (SRP-MP) is now maintaining this listing. Besides updating previous information, Revision 3 adds approximately 80 citations. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Enforcement Manual, Generic Letters, Inspection Manual, Policy Statements, Regulatory Guides, Standard Technical Specifications and the Standard Review Plan (NUREG-0800)

  11. Codes and standards and other guidance cited in regulatory documents

    Energy Technology Data Exchange (ETDEWEB)

    Nickolaus, J.R.; Bohlander, K.L.

    1996-08-01

As part of the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program (SRP-UDP), Pacific Northwest National Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. The SRP-UDP has been completed and the SRP-Maintenance Program (SRP-MP) is now maintaining this listing. Besides updating previous information, Revision 3 adds approximately 80 citations. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Enforcement Manual, Generic Letters, Inspection Manual, Policy Statements, Regulatory Guides, Standard Technical Specifications and the Standard Review Plan (NUREG-0800).

  12. Codon size reduction as the origin of the triplet genetic code.

    Directory of Open Access Journals (Sweden)

    Pavel V Baranov

    Full Text Available The genetic code appears to be optimized in its robustness to missense errors and frameshift errors. In addition, the genetic code is near-optimal in terms of its ability to carry information in addition to the sequences of encoded proteins. As evolution has no foresight, optimality of the modern genetic code suggests that it evolved from less optimal code variants. The length of codons in the genetic code is also optimal, as three is the minimal nucleotide combination that can encode the twenty standard amino acids. The apparent impossibility of transitions between codon sizes in a discontinuous manner during evolution has resulted in an unbending view that the genetic code was always triplet. Yet, recent experimental evidence on quadruplet decoding, as well as the discovery of organisms with ambiguous and dual decoding, suggest that the possibility of the evolution of triplet decoding from living systems with non-triplet decoding merits reconsideration and further exploration. To explore this possibility we designed a mathematical model of the evolution of primitive digital coding systems which can decode nucleotide sequences into protein sequences. These coding systems can evolve their nucleotide sequences via genetic events of Darwinian evolution, such as point-mutations. The replication rates of such coding systems depend on the accuracy of the generated protein sequences. Computer simulations based on our model show that decoding systems with codons of length greater than three spontaneously evolve into predominantly triplet decoding systems. Our findings suggest a plausible scenario for the evolution of the triplet genetic code in a continuous manner. This scenario suggests an explanation of how protein synthesis could be accomplished by means of long RNA-RNA interactions prior to the emergence of the complex decoding machinery, such as the ribosome, that is required for stabilization and discrimination of otherwise weak triplet codon

  13. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  14. The evolution of the mitochondrial genetic code in arthropods revisited.

    Science.gov (United States)

    Abascal, Federico; Posada, David; Zardoya, Rafael

    2012-04-01

    A variant of the invertebrate mitochondrial genetic code was previously identified in arthropods (Abascal et al. 2006a, PLoS Biol 4:e127) in which, instead of translating the AGG codon as serine, as in other invertebrates, some arthropods translate AGG as lysine. Here, we revisit the evolution of the genetic code in arthropods taking into account that (1) the number of arthropod mitochondrial genomes sequenced has tripled since the original findings were published; (2) the phylogeny of arthropods has been recently resolved with confidence for many groups; and (3) sophisticated probabilistic methods can be applied to analyze the evolution of the genetic code in arthropod mitochondria. According to our analyses, evolutionary shifts in the genetic code have been more common than previously inferred, with many taxonomic groups displaying two alternative codes. Ancestral character-state reconstruction using probabilistic methods confirmed that the arthropod ancestor most likely translated AGG as lysine. Point mutations at tRNA-Lys and tRNA-Ser correlated with the meaning of the AGG codon. In addition, we identified three variables (GC content, number of AGG codons, and taxonomic information) that best explain the use of each of the two alternative genetic codes.

  15. Mathematical fundamentals for the noise immunity of the genetic code.

    Science.gov (United States)

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular bases - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows the genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error-detection and error-correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different quantities of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of

  16. Unnatural reactive amino acid genetic code additions

    Energy Technology Data Exchange (ETDEWEB)

    Deiters, Alexander; Cropp, T. Ashton; Chin, Jason W.; Anderson, Christopher J.; Schultz, Peter G.

    2017-10-25

    This invention provides compositions and methods for producing translational components that expand the number of genetically encoded amino acids in eukaryotic cells. The components include orthogonal tRNAs, orthogonal aminoacyl-tRNA synthetases, orthogonal pairs of tRNAs/synthetases and unnatural amino acids. Proteins and methods of producing proteins with unnatural amino acids in eukaryotic cells are also provided.

  17. Quantum algorithms and the genetic code

    Indian Academy of Sciences (India)

    the process of replication. One generation of organisms produces the next generation, which is essentially a copy of itself. The self-similarity is maintained by the hereditary information—the genetic code—that is passed on from one generation to the next. The long chains of DNA molecules residing in the nuclei of the cells ...

  18. The status of Korean nuclear codes and standards

    International Nuclear Information System (INIS)

    Namha Kim; Jong-Hae Kim

    2005-01-01

    Korea Electric Power Industry Code (KEPIC), a set of integrated standards applicable to the design, construction and operation of electric power facilities including nuclear power plants, has been developed on the basis of the prevailing U.S. codes and standards which had been applied to the electric power facilities in Korea. As the developing and managing organization of KEPIC, Korea Electric Association (KEA) published its first edition in 1995 and the second in 2000, and is expected to publish the 2005 edition. KEPIC was applied to the construction of Ulchin Nuclear Units 5 and 6 in 1997, and will be applicable to the construction of forthcoming nuclear power plants in Korea. Along with the effectuation of the Agreement on Technical Barriers to Trade (TBT) in 1995, the international trend related to codes and standards is changing rapidly. The KEA is, therefore, making its utmost efforts so that KEPIC keeps abreast of the changing environment in the international arena. KEA notified the ISO/IEC Information Centre of its acceptance of the Code of Good Practice in the Agreement on TBT. The 2005 KEPIC edition will be retrofitted according to ISO/IEC Guide 21 - Adoption of International Standards as regional or national standards. KEA's efforts will help KEPIC correspond with international standards such as ISO/IEC standards, and internationally recognized standards such as ASME codes and standards. (authors)

  19. Comparison of codes and standards for radiographic inspection

    International Nuclear Information System (INIS)

    Bingoeldag, M. M.; Aksu, M.; Akguen, A. F.

    1995-01-01

    This report compares the procedural requirements and acceptance criteria for radiographic inspections specified in the relevant national and international codes and standards. In particular, a detailed analysis of inspection conditions, such as exposure arrangements and contrast requirements, is given

  20. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  1. Standard problems for structural computer codes

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.

    1985-01-01

    BNL is investigating the ranges of validity of the analytical methods used to predict the behavior of nuclear safety related structures under accidental and extreme environmental loadings. During FY 85, the investigations were concentrated on special problems that can significantly influence the outcome of the soil-structure interaction evaluation process. Specifically, the limitations and applicability of the standard interaction methods when dealing with lift-off, layering and water-table effects were investigated. This paper describes the work and the results obtained during FY 85 from the studies on lift-off, layering and water-table effects in soil-structure interaction

  2. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.
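    The parameter-estimation goal in item (1) can be illustrated with a small sketch. For a binary linear block code observed as a noiseless bit stream, the block length n can often be recovered by testing candidate lengths and looking for rank deficiency over GF(2); a deficiency of n - k (the number of parity checks) signals code structure. This is a generic blind-recognition heuristic, not the framework developed in the report, and the (7,4) generator matrix and stream below are purely illustrative.

```python
import numpy as np

def gf2_rank(rows):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    m = [row.copy() for row in rows]
    rank, n_cols = 0, len(m[0])
    for col in range(n_cols):
        pivot = next((r for r in range(rank, len(m)) if m[r][col]), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(len(m)):
            if r != rank and m[r][col]:
                m[r] = (m[r] + m[rank]) % 2
        rank += 1
    return rank

def estimate_block_length(bits, candidates=range(2, 17)):
    """Pick the candidate n whose n-bit blocks show the largest rank
    deficiency per bit over GF(2) (zero deficiency means no visible structure)."""
    best_n, best_score, best_def = None, 0.0, 0
    for n in candidates:
        n_blocks = len(bits) // n
        if n_blocks < 2 * n:                 # need enough blocks for a stable rank
            continue
        blocks = np.array(bits[:n_blocks * n]).reshape(n_blocks, n)
        deficiency = n - gf2_rank(list(blocks))
        if deficiency / n > best_score:
            best_n, best_score, best_def = n, deficiency / n, deficiency
    return best_n, best_def

# Toy example: a (7,4) Hamming-style generator leaves 3 parity constraints,
# so a long noiseless codeword stream should show a deficiency of 3 at n = 7.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
rng = np.random.default_rng(0)
stream = [rng.integers(0, 2, 4) @ G % 2 for _ in range(200)]
bits = list(np.concatenate(stream))
print(estimate_block_length(bits))   # should report (7, 3)
```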

  3. ASME nuclear codes and standards risk management strategic plan

    International Nuclear Information System (INIS)

    Balkey, Kenneth R.

    2003-01-01

    Over the past 15 years, several risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to better manage the numerous initiatives in the future, the ASME Board on Nuclear Codes and Standards has recently developed and approved a Risk Management Strategic Plan. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent issuance of the ASME Standard for Probabilistic Risk Assessment (PRA) for Nuclear Power Plant Applications. The paper discusses potential applications within ASME Nuclear Codes and Standards that may require expansion of the PRA Standard, such as for new generation reactors, or the development of new PRA Standards. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, and related U.S. regulatory activities are also summarized. (author)

  4. 76 FR 22383 - National Fire Codes: Request for Proposals for Revision of Codes and Standards

    Science.gov (United States)

    2011-04-21

    ...Chemical Extinguishing Systems; NFPA 22-2008, Standard for Water Tanks for Private Fire Protection (5/23/2011); ...Ensembles for Technical Rescue Incidents; NFPA 1925-2008, Standard on Marine Fire-Fighting Vessels (5/23/2011). DEPARTMENT OF COMMERCE, National Institute of Standards and Technology, National Fire Codes: Request...

  5. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  6. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    Genetic Algorithm (GA) forms a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (A.I.). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw-tooth GA and differential evolution GA. This research article presents a real coded GA for predicting enrollments at the University of Alabama. The University of Alabama enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with the works of other eminent authors and found satisfactory, showing that the real coded GA is fast and accurate.
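    As a rough illustration of the approach described above, the sketch below uses a real-coded GA (arithmetic crossover plus Gaussian mutation) to place fuzzy-interval cut points over the range of a series so that a naive interval-midpoint forecaster attains low error. The enrollment values are those commonly quoted in the fuzzy time series literature and should be treated as illustrative; the fitness function and GA settings are placeholders rather than the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Enrollment-like series (values as commonly quoted in the literature; illustrative).
series = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603,
                   15861, 16807, 16919, 16388, 15433, 15497, 15145,
                   15163, 15984, 16859, 18150, 18970, 19328, 19337, 18876])

LO, HI, N_CUTS = series.min() - 500, series.max() + 500, 6

def decode(chrom):
    """Sort the genes into increasing cut points inside [LO, HI]."""
    return np.sort(np.clip(chrom, LO, HI))

def fitness(chrom):
    """Negative MAE of a naive fuzzy forecaster: predict the midpoint of the
    interval that the previous observation fell into."""
    edges = np.concatenate(([LO], decode(chrom), [HI]))
    mids = (edges[:-1] + edges[1:]) / 2
    idx = np.clip(np.searchsorted(edges, series[:-1], side="right") - 1,
                  0, len(mids) - 1)
    return -np.mean(np.abs(series[1:] - mids[idx]))

def real_coded_ga(pop_size=40, gens=200, sigma=300.0):
    pop = rng.uniform(LO, HI, size=(pop_size, N_CUTS))
    for _ in range(gens):
        fit = np.array([fitness(c) for c in pop])
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random(N_CUTS)
            child = w * a + (1 - w) * b                         # arithmetic crossover
            child += rng.normal(0, sigma, N_CUTS) * (rng.random(N_CUTS) < 0.2)
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return decode(best), -fitness(best)

cuts, mae = real_coded_ga()
print("optimised cut points:", np.round(cuts), " MAE:", round(mae, 1))
```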

  7. ASME nuclear codes and standards risk management strategic planning

    International Nuclear Information System (INIS)

    Hill, Ralph S. III; Balkey, Kenneth R.; Erler, Bryan A.; Wesley Rowley, C.

    2007-01-01

    This paper is prepared in honor and in memory of the late Professor Emeritus Yasuhide Asada to recognize his contributions to ASME Nuclear Codes and Standards initiatives, particularly those related to risk-informed technology and System Based Code developments. For nearly two decades, numerous risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to properly manage the numerous initiatives currently underway or planned for the future, the ASME Board on Nuclear Codes and Standards (BNCS) has an established Risk Management Strategic Plan (Plan) that is maintained and updated by the ASME BNCS Risk Management Task Group. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent probabilistic risk assessment (PRA) standards developments for nuclear power plant applications. The paper discusses planned applications within ASME Nuclear Codes and Standards that will require expansion of the ASME PRA Standard to support new advanced light water reactor and next generation reactor developments, such as for high temperature gas-cooled reactors. Emerging regulatory developments related to risk-informed, performance- based approaches are summarized. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is also summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, including related U.S. regulatory activities. (author)

  8. A Novel Real-coded Quantum-inspired Genetic Algorithm and Its Application in Data Reconciliation

    Directory of Open Access Journals (Sweden)

    Gao Lin

    2012-06-01

    Full Text Available The traditional quantum-inspired genetic algorithm (QGA) has drawbacks such as premature convergence, heavy computational cost, and a complicated coding and decoding process. In this paper, a novel real-coded quantum-inspired genetic algorithm is proposed based on the idea of interval division. Detailed comparisons with similar approaches on standard benchmark functions validate the proposed algorithm. In addition, the proposed algorithm is applied to two typical nonlinear data reconciliation problems (a distillation process and an extraction process), and simulation results show its efficiency in nonlinear data reconciliation problems.
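    A compact sketch of the general idea follows: qubit chromosomes in superposition are observed to produce bits, the bits are decoded to real variables by repeatedly halving the search interval (one plausible reading of interval division), and a simplified rotation-gate update nudges the population toward the best solution found so far. This is a generic quantum-inspired GA on a benchmark sphere function, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):
    """Benchmark objective to minimise: f(x) = sum(x_i^2)."""
    return float(np.sum(x * x))

N_VARS, N_QUBITS, POP, GENS = 2, 12, 20, 150
LO, HI = -5.12, 5.12

def observe(theta):
    """Collapse each qubit: P(bit = 1) = sin^2(theta)."""
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

def decode(bits):
    """Interval-halving decoding: each successive bit picks the lower or upper
    half of the remaining sub-interval (equivalent to a binary fraction)."""
    x = np.empty(N_VARS)
    for v in range(N_VARS):
        lo, hi = LO, HI
        for b in bits[v]:
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if b else (lo, mid)
        x[v] = (lo + hi) / 2
    return x

theta = np.full((POP, N_VARS, N_QUBITS), np.pi / 4)   # equal superposition
best_x, best_f = None, np.inf
for _ in range(GENS):
    bits = observe(theta)
    xs = np.array([decode(b) for b in bits])
    fs = np.array([sphere(x) for x in xs])
    i = int(np.argmin(fs))
    if fs[i] < best_f:
        best_f, best_bits, best_x = fs[i], bits[i].copy(), xs[i]
    # Simplified rotation-gate update: nudge each qubit toward the best bits.
    direction = np.where(best_bits[None, :, :] == 1, 1.0, -1.0)
    theta = np.clip(theta + 0.02 * np.pi * direction, 0.05, np.pi / 2 - 0.05)

print("best x:", np.round(best_x, 3), " f(x):", round(best_f, 5))
```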

  9. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.
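    The kind of tallying described here - codon usage, amino acid usage and GC content - is straightforward to reproduce for any set of coding sequences. The snippet below is a minimal illustration on randomly generated sequences with varying GC bias; it only demonstrates the correlation between GC content and GC-ending codon usage, not the full analysis of 917 prokaryotic genomes.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

def codon_usage(seq):
    """Relative frequency of each codon in a coding sequence."""
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

def random_cds(n_codons, gc_bias):
    """Hypothetical coding sequence with a chosen GC bias (stand-in for genome data)."""
    p = [gc_bias / 2, gc_bias / 2, (1 - gc_bias) / 2, (1 - gc_bias) / 2]
    return "".join(rng.choice(list("GCAT"), size=3 * n_codons, p=p))

genomes = [random_cds(2000, gc) for gc in np.linspace(0.3, 0.7, 15)]

# Correlate genomic GC content with the usage of codons ending in G or C,
# the kind of composition-driven regularity discussed in the abstract.
gcs = np.array([gc_content(g) for g in genomes])
gc3 = np.array([sum(f for c, f in codon_usage(g).items() if c[2] in "GC")
                for g in genomes])
r = np.corrcoef(gcs, gc3)[0, 1]
print(f"Pearson r between GC content and GC-ending codon usage: {r:.3f}")
```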

  10. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  11. Building climate change into infrastructure codes and standards

    International Nuclear Information System (INIS)

    Auld, H.; Klaasen, J.; Morris, R.; Fernandez, S.; MacIver, D.; Bernstein, D.

    2009-01-01

    'Full text:' Building codes and standards and the climatic design values embedded within these legal to semi-legal documents have profound safety, health and economic implications for Canada's infrastructure systems. The climatic design values that have been used for the design of almost all of today's more than $5.5 Trillion in infrastructure are based on historical climate data and assume that the extremes of the past will represent future conditions. Since new infrastructure based on codes and standards will be built to survive for decades to come, it is critically important that existing climatic design information be as accurate and up-to-date as possible, that the changing climate be monitored to detect and highlight vulnerabilities of existing infrastructure, that forensic studies of climate-related failures be undertaken and that codes and standards processes incorporate future climates and extremes as much as possible. Uncertainties in the current climate change models and their scenarios currently challenge our ability to project future extremes regionally and locally. Improvements to the spatial and temporal resolution of these climate change scenarios, along with improved methodologies to treat model biases and localize results, will allow future codes and standards to better reflect the extremes and weathering conditions expected over the lifespan of structures. In the meantime, other information and code processes can be used to incorporate changing climate conditions into upcoming infrastructure codes and standards, to “bridge” the model uncertainty gap and to complement the state of existing projections. This presentation will outline some of the varied information and processes that will be used to incorporate climate change adaptation into the next development cycle of the National Building Code of Canada and numerous other national CSA infrastructure standards. (author)

  12. Origins of gene, genetic code, protein and life

    Indian Academy of Sciences (India)

    Unknown

    have concluded that newly-born genes are products of nonstop frames (NSF) ... research to determine tertiary structures of proteins such ... the present earth, is favourable for new genes to arise, if ..... NGG) in the universal genetic code table, cannot satisfy ..... which has been proposed to explain the development of life on.

  13. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
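    CMC models build on the quasispecies equation, whose equilibrium is the leading eigenvector of W = Q·diag(f), the product of the mutation matrix and the fitness landscape. A minimal power-iteration solver for that eigenpair on a toy binary genotype space is sketched below; it is not CMCpy code, only an illustration of the eigenpair computation such packages wrap, and the landscape and parameters are hypothetical.

```python
import numpy as np
from itertools import product

# Toy genotype space: binary sequences of length L, per-site mutation rate mu.
L, mu = 4, 0.05
genotypes = list(product([0, 1], repeat=L))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Mutation matrix Q[i, j] = probability that genotype j replicates into genotype i.
Q = np.array([[mu ** hamming(gi, gj) * (1 - mu) ** (L - hamming(gi, gj))
               for gj in genotypes] for gi in genotypes])

# Single-peak fitness landscape: the all-zero "master" sequence is fittest.
f = np.array([2.0 if sum(g) == 0 else 1.0 for g in genotypes])
W = Q @ np.diag(f)

def leading_eigenpair(W, tol=1e-12, max_iter=10_000):
    """Power iteration for the dominant eigenvalue/eigenvector of a
    non-negative matrix (the quasispecies equilibrium distribution)."""
    x = np.ones(W.shape[0]) / W.shape[0]
    lam = 0.0
    for _ in range(max_iter):
        y = W @ x
        lam_new = y.sum()            # mean fitness at the current distribution
        y /= lam_new
        if abs(lam_new - lam) < tol and np.allclose(x, y, atol=tol):
            return lam_new, y
        lam, x = lam_new, y
    return lam, x

lam, p = leading_eigenpair(W)
print("mean fitness at equilibrium:", round(lam, 4))
print("master-sequence frequency :", round(p[0], 4))
```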

  14. 1995 building energy codes and standards workshops: Summary and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Sandahl, L.J.; Shankle, D.L.

    1996-02-01

    During the spring of 1995, Pacific Northwest National Laboratory (PNNL) conducted four two-day Regional Building Energy Codes and Standards workshops across the US. Workshops were held in Chicago, Denver, Rhode Island, and Atlanta. The workshops were designed to benefit state-level officials including staff of building code commissions, energy offices, public utility commissions, and others involved with adopting/updating, implementing, and enforcing building energy codes in their states. The workshops provided an opportunity for state and other officials to learn more about residential and commercial building energy codes and standards, the role of the US Department of Energy and the Building Standards and Guidelines Program at Pacific Northwest National Laboratory, Home Energy Rating Systems (HERS), Energy Efficient Mortgages (EEM), training issues, and other topics related to the development, adoption, implementation, and enforcement of building energy codes. Participants heard success stories, got tips on enforcement training, and received technical support materials. In addition to receiving information on the above topics, workshop participants had an opportunity to provide input on code adoption issues, building industry training issues, building design issues, and exemplary programs across the US. This paper documents the workshop planning, findings, and follow-up processes.

  15. Codes, standards, and requirements for DOE facilities: natural phenomena design

    International Nuclear Information System (INIS)

    Webb, A.B.

    1985-01-01

    The basic requirements for codes, standards, and requirements are found in DOE Orders 5480.1A, 5480.4, and 6430.1. The type of DOE facility to be built and the hazards which it presents will determine the criteria to be applied for natural phenomena design. Mandatory criteria are established in the DOE orders for certain designs but more often recommended guidance is given. National codes and standards form a great body of experience from which the project engineer may draw. Examples of three kinds of facilities and the applicable codes and standards are discussed. The safety program planning approach to project management used at Westinghouse Hanford is outlined. 5 figures, 2 tables

  16. Lightning protection, techniques, applied codes and standards. Vol. 4

    International Nuclear Information System (INIS)

    Mahmoud, M.; Shaaban, H.; Lamey, S.

    1996-01-01

    Lightning is the only natural disaster against which protection is highly effective. Therefore, for the safety of critical installations, specifically nuclear, an effective lightning protection system (LPS) is required. The design and installation of LPSs have been addressed by many international codes and standards. In this paper, the various LPSs are discussed and compared, including radioactive air terminals, ionizing air terminals, and terminals equipped with electrical triggering devices. Also, the so-called dissipation array systems are discussed and compared to other systems technically and economically. Moreover, the available international codes and standards related to lightning protection are discussed. Such standards include those published by the National Fire Protection Association (NFPA), the Lightning Protection Institute (LPI), Underwriters Laboratories (UL), and British Standards. Finally, the possibility of developing an Egyptian national standard is discussed

  17. 78 FR 24725 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2013-04-26

    ...Production, Storage, and Handling of Liquefied Natural Gas (LNG); NFPA 61-2013, Standard for the ... (7/6/2015); ...Nitrate Film; NFPA 51-2013, Standard for the Design and Installation of Oxygen-Fuel Gas Systems ... (7/6/2015); ...Charging Plants; NFPA 52-2013, Vehicular Gaseous Fuel Systems Code (1/3/2014); NFPA 53-2011, Recommended...

  18. 77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2012-11-09

    ...Water Mist Fire Protection Systems; NFPA 921, Guide for Fire and Explosion Investigations; NFPA 1005, Standard for Professional Qualifications for Marine Fire Fighting for Land-Based Fire Fighters; NFPA 1192... DEPARTMENT OF COMMERCE, National Institute of Standards and Technology, National Fire Codes: Request...

  19. Programming peptidomimetic syntheses by translating genetic codes designed de novo.

    Science.gov (United States)

    Forster, Anthony C; Tan, Zhongping; Nalam, Madhavi N L; Lin, Hening; Qu, Hui; Cornish, Virginia W; Blacklow, Stephen C

    2003-05-27

    Although the universal genetic code exhibits only minor variations in nature, Francis Crick proposed in 1955 that "the adaptor hypothesis allows one to construct, in theory, codes of bewildering variety." The existing code has been expanded to enable incorporation of a variety of unnatural amino acids at one or two nonadjacent sites within a protein by using nonsense or frameshift suppressor aminoacyl-tRNAs (aa-tRNAs) as adaptors. However, the suppressor strategy is inherently limited by compatibility with only a small subset of codons, by the ways such codons can be combined, and by variation in the efficiency of incorporation. Here, by preventing competing reactions with aa-tRNA synthetases, aa-tRNAs, and release factors during translation and by using nonsuppressor aa-tRNA substrates, we realize a potentially generalizable approach for template-encoded polymer synthesis that unmasks the substantially broader versatility of the core translation apparatus as a catalyst. We show that several adjacent, arbitrarily chosen sense codons can be completely reassigned to various unnatural amino acids according to de novo genetic codes by translating mRNAs into specific peptide analog polymers (peptidomimetics). Unnatural aa-tRNA substrates do not uniformly function as well as natural substrates, revealing important recognition elements for the translation apparatus. Genetic programming of peptidomimetic synthesis should facilitate mechanistic studies of translation and may ultimately enable the directed evolution of small molecules with desirable catalytic or pharmacological properties.

  20. CSA guide to Canadian wind turbine codes and standards

    International Nuclear Information System (INIS)

    2008-01-01

    The Canadian wind energy sector has become one of the fastest-growing wind energy markets in the world. Growth of the industry has been supported by various government agencies. However, many projects have experienced cost over-runs or cancellations as a result of unclear regulatory requirements, and wind energy developers are currently subject to a variety of approval processes involving several different authorities. This Canadian Standards Association (CSA) guide provided general information on codes and standards related to the design, approval, installation, operation, and maintenance of wind turbines in Canada. CSA codes and standards were developed by considering 5 new standards adopted by the International Electrotechnical Commission (IEC) Technical Committee on Wind Turbines. The standards described in this document related to acoustic noise measurement techniques; power performance measurements of electricity-producing wind turbines; lightning protection for wind turbine generator systems; design requirements for turbines; and design requirements for small wind turbines. The guide addressed specific subject areas related to the development of wind energy projects that involve formal or regulatory approval processes. Subject areas included issues related to safety, environmental design considerations, site selection, and mechanical systems. Information on associated standards and codes was also included

  1. The genetic code as a periodic table: algebraic aspects.

    Science.gov (United States)

    Bashford, J D; Jarvis, P D

    2000-01-01

    The systematics of indices of physico-chemical properties of codons and amino acids across the genetic code are examined. Using a simple numerical labelling scheme for nucleic acid bases, A=(-1,0), C=(0,-1), G=(0,1), U=(1,0), data can be fitted as low order polynomials of the six coordinates in the 64-dimensional codon weight space. The work confirms and extends the recent studies by Siemion et al. (1995. BioSystems 36, 231-238) of the conformational parameters. Fundamental patterns in the data such as codon periodicities, and related harmonics and reflection symmetries, are here associated with the structure of the set of basis monomials chosen for fitting. Results are plotted using the Siemion one-step mutation ring scheme, and variants thereof. The connections between the present work, and recent studies of the genetic code structure using dynamical symmetry algebras, are pointed out.
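    The labelling scheme quoted above turns every codon into a point with six coordinates, and fitting a codon or amino acid property index then reduces to ordinary least squares over monomials of those coordinates. The sketch below builds the 64 x 6 coordinate table and fits a low-order polynomial to a hypothetical property vector; real conformational-parameter data would need to be substituted to reproduce the kind of fit the abstract describes.

```python
import numpy as np
from itertools import product

# Base labels from the abstract: A=(-1,0), C=(0,-1), G=(0,1), U=(1,0).
LABEL = {"A": (-1, 0), "C": (0, -1), "G": (0, 1), "U": (1, 0)}
BASES = "ACGU"

codons = ["".join(c) for c in product(BASES, repeat=3)]                 # 64 codons
coords = np.array([[x for b in c for x in LABEL[b]] for c in codons])   # 64 x 6

def monomial_basis(coords, max_degree=2):
    """Design matrix of low-order monomials of the six codon coordinates."""
    cols = [np.ones(len(coords))]                     # constant term
    cols += [coords[:, i] for i in range(6)]          # linear terms
    if max_degree >= 2:
        cols += [coords[:, i] * coords[:, j]
                 for i in range(6) for j in range(i, 6)]
    return np.column_stack(cols)

X = monomial_basis(coords)

# Hypothetical per-codon property vector (e.g. a conformational index);
# replace with real data to reproduce the fits discussed above.
rng = np.random.default_rng(4)
y = rng.normal(size=64)

beta, residuals, *_ = np.linalg.lstsq(X, y, rcond=None)
print("design matrix:", X.shape, " fitted coefficients:", beta.shape)
```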

  2. Improved entropy encoding for high efficient video coding standard

    Directory of Open Access Journals (Sweden)

    B.S. Sunil Kumar

    2018-03-01

    Full Text Available The High Efficiency Video Coding (HEVC) standard has better coding efficiency, but its encoding performance has to be improved to meet the demands of growing multimedia applications. This paper improves the standard entropy encoding by introducing optimized weighting parameters, so that a higher rate of compression can be accomplished over the standard entropy encoding. The optimization is performed using the recently introduced firefly algorithm. The experimentation is carried out using eight benchmark video sequences and the PSNR for varying rates of data transmission is investigated. Comparative analysis based on the performance statistics is made with the standard entropy encoding. From the obtained results, it is clear that the fidelity of the decoded video sequence is well preserved by the proposed method, even though the compression rate is increased. Keywords: Entropy, Encoding, HEVC, PSNR, Compression

  3. The "Wow! signal" of the terrestrial genetic code

    Science.gov (United States)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, but stronger in noise immunity is the genetic code. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of

  4. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1995-01-01

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages of using an established discrete ordinates solver, e.g. immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems

  5. Frozen Accident Pushing 50: Stereochemistry, Expansion, and Chance in the Evolution of the Genetic Code.

    Science.gov (United States)

    Koonin, Eugene V

    2017-05-23

    Nearly 50 years ago, Francis Crick propounded the frozen accident scenario for the evolution of the genetic code along with the hypothesis that the early translation system consisted primarily of RNA. Under the frozen accident perspective, the code is universal among modern life forms because any change in codon assignment would be highly deleterious. The frozen accident can be considered the default theory of code evolution because it does not imply any specific interactions between amino acids and the cognate codons or anticodons, or any particular properties of the code. The subsequent 49 years of code studies have elucidated notable features of the standard code, such as high robustness to errors, but failed to develop a compelling explanation for codon assignments. In particular, stereochemical affinity between amino acids and the cognate codons or anticodons does not seem to account for the origin and evolution of the code. Here, I expand Crick's hypothesis on RNA-only translation system by presenting evidence that this early translation already attained high fidelity that allowed protein evolution. I outline an experimentally testable scenario for the evolution of the code that combines a distinct version of the stereochemical hypothesis, in which amino acids are recognized via unique sites in the tertiary structure of proto-tRNAs, rather than by anticodons, expansion of the code via proto-tRNA duplication, and the frozen accident.

  6. Future direction of ASME nuclear codes and standards

    International Nuclear Information System (INIS)

    Ennis, Kevin; Sheehan, Mark E.

    2003-01-01

    While the nuclear power industry in the US is in a period of stasis, there continues to be a great deal of activity in the ASME nuclear standards development arena. As plants age, the need for new approaches in standardization changes with the changing needs of the industry. New tools are becoming available in the form of risk analysis, and this is finding its way into more and more of ASME's standards activities. This paper will take a look at the direction that ASME nuclear Codes and Standards are heading in this and other areas, as well as taking a look at some advance reactor concepts and plans for standards to address new technologies

  7. ASME nuclear codes and standards: Recent technical initiatives

    International Nuclear Information System (INIS)

    Feigel, R. E.

    1995-01-01

    Although nuclear power construction is currently in a hiatus in the US, ASME and its volunteer committees remain committed to continual improvements in the technical requirements in its nuclear codes. This paper provides an overview of several significant recent revisions to ASME's nuclear codes. Additionally, other important initiatives currently being addressed by ASME committees will be described. With the largest population of operating light water nuclear plants in the world and worldwide use of its nuclear codes, ASME continues to support technical advancements in its nuclear codes and standards. While revisions of various magnitude are an ongoing process, several recent revisions embody significant changes based on state-of-the-art design philosophy and substantial industry experience. In the design area, a significant revision has recently been approved which will significantly reduce conservatisms in seismic piping design as well as provide simplified design rules. Major revisions have also been made to the requirements for nuclear material manufacturers and suppliers, which should result in a clearer understanding of this difficult administrative area of the code. In the area of Section XI inservice rules, substantial studies are underway to investigate the application of probabilistic, risk-based inspection in lieu of the current deterministic inspection philosophy. While much work is still required in this area, it is an important potential application of the emerging field of risk-based inspection

  8. The emerging High Efficiency Video Coding standard (HEVC)

    International Nuclear Information System (INIS)

    Raja, Gulistan; Khan, Awais

    2013-01-01

    High definition video (HDV) is becoming popular day by day. This paper describes the performance analysis of the latest upcoming video standard, known as High Efficiency Video Coding (HEVC). HEVC is designed to fulfil all the requirements for future high definition videos. In this paper, three configurations (intra only, low delay and random access) of HEVC are analyzed using various 480p, 720p and 1080p high definition test video sequences. Simulation results show the superior objective and subjective quality of HEVC

  9. Merits and difficulties in adopting codes, standards and nuclear regulations

    International Nuclear Information System (INIS)

    El-Saiedi, A.F.; Morsy, S.; Mariy, A.

    1978-01-01

    Developing countries planning to introduce nuclear power plants as a source of energy have to develop or adopt sound regulatory practices. These are necessary to help governmental authorities to assess the safety of nuclear power plants and to perform inspections needed to confirm the established safe and sound limits. The first requirement is to form an independent regulatory body capable of setting up and enforcing proper safety regulations. The formation of this body is governed by several considerations related to local conditions in the developing countries, which may not always be favourable. It is quite impractical for countries with limited experience in the nuclear power field to develop their own codes, standards and regulations required for the nuclear regulatory body to perform its tasks. A practical way is to adopt codes, standards and regulations of a well-developed country. This has merits as well as drawbacks. The latter are related to problems of personnel, software, equipment and facilities. The difficulties involved in forming a nuclear regulatory body, and the merits and difficulties in adopting foreign codes, standards and regulations required for such a body to perform its tasks, are discussed in this paper. Discussions are applicable to many developing countries and particular emphasis is given to the conditions and practices in Egypt. (author)

  10. A symbiotic liaison between the genetic and epigenetic code

    Directory of Open Access Journals (Sweden)

    Holger eHeyn

    2014-05-01

    Full Text Available With rapid advances in sequencing technologies, we are undergoing a paradigm shift from hypothesis- to data-driven research. Genome-wide profiling efforts gave informative insights into biological processes; however, considering the wealth of variation, the major challenge remains their meaningful interpretation. In particular, sequence variation in non-coding contexts is often challenging to interpret. Here, data integration approaches for the identification of functional genetic variability represent a likely solution. For example, functional linkage analysis integrating genotype and expression data has determined regulatory quantitative trait loci (QTLs) and proposed causal relationships. In addition to gene expression, epigenetic regulation, and specifically DNA methylation, has been established as a highly valuable surrogate mark for functional variance of the genetic code. Epigenetic modification has served as a powerful mediator trait to elucidate mechanisms forming phenotypes in health and disease. In particular, integrative studies of genetic and DNA methylation data have not only guided interpretation strategies for risk genotypes, but also proved their value for physiological traits, such as natural human variation and aging. This Perspective seeks to illustrate the power of data integration in the genomic era exemplified by DNA methylation quantitative trait loci (meQTLs). However, the model is further extendable to virtually all traceable molecular traits.

  11. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  12. On Francis Crick, the genetic code, and a clever kid.

    Science.gov (United States)

    Goldstein, Bob

    2018-04-02

    A few years ago, Francis Crick's son told me a story that I can't get out of my mind. I had contacted Michael Crick by email while digging through the background of the researchers who had cracked the genetic code in the 1960s. Francis had died in 2004, and I was contacting some of the people who knew him when he was struggling to decipher the code. Francis didn't appear to struggle often - he is known mostly for his successes - and, as it turns out, this one well-known struggle may have had a clue sitting just barely out of sight. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. ASME Section XI trends in developing nuclear codes and standards

    International Nuclear Information System (INIS)

    Hedden, O.F.

    1995-01-01

    When the author began working on nuclear power many years ago, he knew that perfection was the only acceptable technical standard. Unfortunately, this became an obsession with perfection that has had unfavorable consequences in some of the non-technical areas of work in ASME nuclear power Codes and Standards. However, the economic problems of the nuclear power industry now demand a more pragmatic approach if the industry is to continue. Not only does each item considered for action need to be evaluated to criteria that may in some cases be less than perfection, but one needs to consider whether it contributes tangibly to either safety or to reduction in technical or administrative burden. These should be the governing, criteria. The introduction of risk-based inspection methodologies will certainly be an important element in doing this successfully. One needs to consider these criteria collectively, as one discusses each item at the committee level, and individually, as one votes on each item. In the past, the author has been concerned that the industry was not acting quickly enough in taking advantage of opportunities offered by the Code to increase safety or to reduce cost. While he still has some concern, he thinks communication channels have been greatly improved. Now he is becoming more concerned with both the collective and individual actions that delay beneficial changes. The second part of the author's talk has to do with the relevance of the code committees in the nuclear power industry regulatory process

  14. Quantum control using genetic algorithms in quantum communication: superdense coding

    International Nuclear Information System (INIS)

    Domínguez-Serna, Francisco; Rojas, Fernando

    2015-01-01

    We present a physical model example of how quantum control with genetic algorithms is applied to implement the quantum superdense coding protocol. We studied a model consisting of two quantum dots with an electron with spin, including spin-orbit interaction. The spin and the site (charge) degrees of freedom become hybridized, so the system acquires two degrees of freedom: spin and charge. The system has tunneling and site energies as time-dependent control parameters that are optimized by means of genetic algorithms to prepare a hybrid Bell-like state used as a transmission channel. This state is transformed to obtain any of the four Bell basis states, as required by the superdense coding protocol to transmit two bits of classical information. The control protocol is equivalent to implementing one of the quantum gates in the charge subsystem. Fidelities larger than 99.5% are achieved for the hybrid entangled state preparation and the superdense operations. (paper)
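    For reference, the superdense coding protocol itself (leaving aside the spin-charge control problem optimised in the paper) can be written out in a few lines of linear algebra: Alice encodes two classical bits by applying I, X, Z or ZX to her half of a shared Bell state, and Bob decodes with a CNOT followed by a Hadamard. The sketch below is a generic state-vector simulation, not the quantum-dot model of the paper.

```python
import numpy as np

# Single-qubit gates and the two-qubit CNOT (first qubit controls).
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Shared Bell state |Phi+> = (|00> + |11>) / sqrt(2); qubit 0 belongs to Alice.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

ENCODE = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

def superdense(bits):
    """Send two classical bits with one qubit over a shared Bell pair."""
    state = np.kron(ENCODE[bits], I) @ bell   # Alice encodes on her qubit
    state = CNOT @ state                      # Bob: CNOT, Alice's qubit as control
    state = np.kron(H, I) @ state             # Bob: Hadamard on Alice's qubit
    probs = np.abs(state) ** 2                # measure both qubits
    outcome = int(np.argmax(probs))           # deterministic for ideal states
    return (outcome >> 1) & 1, outcome & 1

for b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert superdense(b) == b
print("all four two-bit messages recovered correctly")
```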

  15. Improving a Power Line Communications Standard with LDPC Codes

    Directory of Open Access Journals (Sweden)

    Hsu Christine

    2007-01-01

    Full Text Available We investigate a power line communications (PLC) scheme that could be used to enhance the HomePlug 1.0 standard, specifically its ROBO mode which provides modest throughput for the worst-case PLC channel. The scheme is based on using a low-density parity-check (LDPC) code, in lieu of the concatenated Reed-Solomon and convolutional codes in ROBO mode. The PLC channel is modeled with multipath fading and Middleton's class A noise. Clipping is introduced to mitigate the effect of impulsive noise. A simple and effective method is devised to estimate the variance of the clipped noise for LDPC decoding. Simulation results show that the proposed scheme outperforms the HomePlug 1.0 ROBO mode and has lower computational complexity. The proposed scheme also dispenses with the repetition of information bits in ROBO mode to gain time diversity, resulting in a 4-fold increase in physical layer throughput.
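    The clipping-and-variance idea can be illustrated independently of the full LDPC decoder: received BPSK samples hit by impulsive noise are clipped at a threshold, the variance of the clipped noise is estimated from the clipped samples, and the estimate feeds the channel LLRs handed to belief propagation. The sketch below uses a Gaussian-plus-impulse stand-in for Middleton class A noise and a simple moment-based variance estimate; it is not the estimator devised in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# BPSK symbols through a channel with background Gaussian noise plus
# occasional large impulses (a crude stand-in for Middleton class A noise).
n = 20_000
bits = rng.integers(0, 2, n)
tx = 1.0 - 2.0 * bits                            # bit 0 -> +1, bit 1 -> -1
noise = rng.normal(0, 0.5, n)
impulses = rng.normal(0, 8.0, n) * (rng.random(n) < 0.02)
rx = tx + noise + impulses

# Clip to limit the influence of impulses before decoding.
CLIP = 3.0
rx_clipped = np.clip(rx, -CLIP, CLIP)

# Moment-based estimate of the clipped-noise variance: remove the unit
# signal power from the clipped-sample power, E[r^2] ~ 1 + sigma_eff^2.
sigma2_eff = max(np.mean(rx_clipped ** 2) - 1.0, 1e-6)

# Channel LLRs under a Gaussian approximation of the clipped channel;
# these would be passed to the LDPC belief-propagation decoder.
llr = 2.0 * rx_clipped / sigma2_eff

hard = (llr < 0).astype(int)                     # sign check before any LDPC decoding
print(f"estimated effective noise variance: {sigma2_eff:.3f}")
print(f"pre-decoding hard-decision BER    : {np.mean(hard != bits):.4f}")
```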

  16. Assessing the Genetics Content in the Next Generation Science Standards.

    Directory of Open Access Journals (Sweden)

    Katherine S Lontok

    Full Text Available Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  17. Assessing the Genetics Content in the Next Generation Science Standards.

    Science.gov (United States)

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  18. Electronic health record standards, coding systems, frameworks, and infrastructures

    CERN Document Server

    Sinha, Pradeep K; Bendale, Prashant; Mantri, Manisha; Dande, Atreya

    2013-01-01

Discover How Electronic Health Records Are Built to Drive the Next Generation of Healthcare Delivery The increased role of IT in the healthcare sector has led to the coining of a new phrase "health informatics," which deals with the use of IT for better healthcare services. Health informatics applications often involve maintaining the health records of individuals, in digital form, which is referred to as an Electronic Health Record (EHR). Building and implementing an EHR infrastructure requires an understanding of healthcare standards, coding systems, and frameworks. This book provides an

  19. Amino acid fermentation at the origin of the genetic code

    Directory of Open Access Journals (Sweden)

    de Vladar Harold P

    2012-02-01

Full Text Available Abstract There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acids pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can

  20. Amino acid fermentation at the origin of the genetic code.

    Science.gov (United States)

    de Vladar, Harold P

    2012-02-10

    There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acids pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments

  1. Amino acid fermentation at the origin of the genetic code

    Science.gov (United States)

    2012-01-01

    There is evidence that the genetic code was established prior to the existence of proteins, when metabolism was powered by ribozymes. Also, early proto-organisms had to rely on simple anaerobic bioenergetic processes. In this work I propose that amino acid fermentation powered metabolism in the RNA world, and that this was facilitated by proto-adapters, the precursors of the tRNAs. Amino acids were used as carbon sources rather than as catalytic or structural elements. In modern bacteria, amino acid fermentation is known as the Stickland reaction. This pathway involves two amino acids: the first undergoes oxidative deamination, and the second acts as an electron acceptor through reductive deamination. This redox reaction results in two keto acids that are employed to synthesise ATP via substrate-level phosphorylation. The Stickland reaction is the basic bioenergetic pathway of some bacteria of the genus Clostridium. Two other facts support Stickland fermentation in the RNA world. First, several Stickland amino acid pairs are synthesised in abiotic amino acid synthesis. This suggests that amino acids that could be used as an energy substrate were freely available. Second, anticodons that have complementary sequences often correspond to amino acids that form Stickland pairs. The main hypothesis of this paper is that pairs of complementary proto-adapters were assigned to Stickland amino acids pairs. There are signatures of this hypothesis in the genetic code. Furthermore, it is argued that the proto-adapters formed double strands that brought amino acid pairs into proximity to facilitate their mutual redox reaction, structurally constraining the anticodon pairs that are assigned to these amino acid pairs. Significance tests which randomise the code are performed to study the extent of the variability of the energetic (ATP) yield. Random assignments can lead to a substantial yield of ATP and maintain enough variability, thus selection can act and refine the assignments
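    The randomisation test sketched below is a simplified illustration of the kind of significance test these abstracts describe: codon-to-amino-acid assignments are shuffled and a score is recomputed for each shuffled code. The codon-to-amino-acid map, the choice of Stickland pairs (Ala/Gly and Val/Pro are used purely as examples), and the scoring rule are all hypothetical placeholders, not the data or the energetic (ATP) yield model of the paper.

```python
import random
from itertools import product

BASES = "UCAG"
CODONS = ["".join(c) for c in product(BASES, repeat=3)]
COMP = {"A": "U", "U": "A", "C": "G", "G": "C"}

def revcomp(codon):
    # Anticodon read 5'->3' is the reverse complement of the codon.
    return "".join(COMP[b] for b in reversed(codon))

# Hypothetical Stickland donor/acceptor pairings used only for illustration.
STICKLAND_PAIRS = {frozenset(("Ala", "Gly")), frozenset(("Val", "Pro"))}

def toy_code():
    # Toy codon -> amino-acid assignment: cycle four example amino acids over the 64 codons.
    aas = ["Ala", "Gly", "Val", "Pro"]
    return {codon: aas[i % 4] for i, codon in enumerate(CODONS)}

def score(code):
    # Count codon / reverse-complement pairs whose amino acids form a (toy) Stickland pair.
    hits = sum(frozenset((code[c], code[revcomp(c)])) in STICKLAND_PAIRS for c in CODONS)
    return hits // 2  # each unordered pair is counted twice

def randomisation_test(code, n_trials=2000, seed=0):
    rng = random.Random(seed)
    observed = score(code)
    assignments = list(code.values())
    at_least = 0
    for _ in range(n_trials):
        rng.shuffle(assignments)                      # randomise the codon assignments
        if score(dict(zip(CODONS, assignments))) >= observed:
            at_least += 1
    return observed, at_least / n_trials              # empirical p-value

print(randomisation_test(toy_code()))
```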

  2. Decoding the non-coding genome: elucidating genetic risk outside the coding genome.

    Science.gov (United States)

    Barr, C L; Misener, V L

    2016-01-01

    Current evidence emerging from genome-wide association studies indicates that the genetic underpinnings of complex traits are likely attributable to genetic variation that changes gene expression, rather than (or in combination with) variation that changes protein-coding sequences. This is particularly compelling with respect to psychiatric disorders, as genetic changes in regulatory regions may result in differential transcriptional responses to developmental cues and environmental/psychosocial stressors. Until recently, however, the link between transcriptional regulation and psychiatric genetic risk has been understudied. Multiple obstacles have contributed to the paucity of research in this area, including challenges in identifying the positions of remote (distal from the promoter) regulatory elements (e.g. enhancers) and their target genes and the underrepresentation of neural cell types and brain tissues in epigenome projects - the availability of high-quality brain tissues for epigenetic and transcriptome profiling, particularly for the adolescent and developing brain, has been limited. Further challenges have arisen in the prediction and testing of the functional impact of DNA variation with respect to multiple aspects of transcriptional control, including regulatory-element interaction (e.g. between enhancers and promoters), transcription factor binding and DNA methylation. Further, the brain has uncommon DNA-methylation marks with unique genomic distributions not found in other tissues - current evidence suggests the involvement of non-CG methylation and 5-hydroxymethylation in neurodevelopmental processes but much remains unknown. We review here knowledge gaps as well as both technological and resource obstacles that will need to be overcome in order to elucidate the involvement of brain-relevant gene-regulatory variants in genetic risk for psychiatric disorders. © 2015 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  3. Regulatory Endorsement Activities for ASME Nuclear Codes and Standards

    International Nuclear Information System (INIS)

    West, Raymond A.

    2006-01-01

The ASME Board on Nuclear Codes and Standards (BNCS) has formed a Task Group on Regulatory Endorsement (TG-RE) that is currently in discussions with the United States Nuclear Regulatory Commission (NRC) to look at suggestions and recommendations that can be used to help with the endorsement of new and revised ASME Nuclear Codes and Standards (NC and S). With the coming of new reactors in the USA in the very near future, we need to look at both the regulations and all the ASME NC and S to determine where we need to make changes to support these new plants. At the same time, it is important that we maintain our operating plants while addressing the ageing management needs of our existing reactors. This is going to take new thinking, time, resources, and money. For all this to take place, the regulations and requirements that we use must be clear, concise and necessary for safety, and to that end both the NRC and ASME are working together to make this happen. Because of the influence that the USA has in the world in dealing with these issues, this paper is written to inform the international nuclear engineering community about the issues and what actions are being addressed under this effort. (author)

  4. Standardization of computer programs - basis of the Czechoslovak library of nuclear codes

    International Nuclear Information System (INIS)

    Gregor, M.

    1987-01-01

    A standardized form of computer code documentation has been established in the CSSR in the field of reactor safety. Structure and content of the documentation are described and codes already subject to this process are mentioned. The formation of a Czechoslovak nuclear code library and facilitated discussion of safety reports containing results of standardized codes are aimed at

  5. Fifty years of progress in speech coding standards

    Science.gov (United States)

    Cox, Richard

    2004-10-01

    Over the past 50 years, speech coding has taken root worldwide. Early applications were for the military and transmission for telephone networks. The military gave equal priority to intelligibility and low bit rate. The telephone network gave priority to high quality and low delay. These illustrate three of the four areas in which requirements must be set for any speech coder application: bit rate, quality, delay, and complexity. While the military could afford relatively expensive terminal equipment for secure communications, the telephone network needed low cost for massive deployment in switches and transmission equipment worldwide. Today speech coders are at the heart of the wireless phones and telephone answering systems we use every day. In addition to the technology and technical invention that has occurred, standards make it possible for all these different systems to interoperate. The primary areas of standardization are the public switched telephone network, wireless telephony, and secure telephony for government and military applications. With the advent of IP telephony there are additional standardization efforts and challenges. In this talk the progress in all areas is reviewed as well as a reflection on Jim Flanagan's impact on this field during the past half century.

  6. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1997-01-01

A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages to using an established discrete ordinates solver, e.g., immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems. The key to the method is a simultaneous solution of the continuous-slowing-down (CSD) portion and elastic-scattering portion of the scattering source by the Goudsmit-Saunderson theory. The resulting multigroup-Legendre cross sections are much smaller than the true scattering cross sections that they represent. Under certain conditions, the cross sections are guaranteed positive and converge with a low-order Legendre expansion

  7. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1997-01-01

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages to using an established discrete ordinates solver, e.g., immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and synthetic radiation environments. The cross sections have been successfully used in the DORT, TWODANT, and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems. The key to the method is a simultaneous solution of the continuous-slowing-down and elastic-scattering portions of the scattering source by the Goudsmit-Saunderson theory. The resulting multigroup-Legendre cross sections are much smaller than the true scattering cross sections that they represent. Under certain conditions, the cross sections are guaranteed positive and converge with a low-order Legendre expansion
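    As a rough illustration of the kind of quantity such a multigroup-Legendre library stores, the sketch below numerically computes Legendre moments of a forward-peaked elastic scattering kernel. The screened-Rutherford-shaped kernel and the screening parameter ETA are assumptions made for the example; the sketch is not the cross-section generation procedure used with DORT, TWODANT or TORT.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

ETA = 1.0e-3  # hypothetical screening parameter controlling how forward-peaked the kernel is

def kernel(mu):
    # Screened-Rutherford-shaped elastic scattering kernel (unnormalised), mu = cos(theta).
    return 1.0 / (1.0 + ETA - mu) ** 2

def legendre_moments(f, lmax, npts=512):
    # sigma_l = integral over [-1, 1] of f(mu) * P_l(mu) dmu, via Gauss-Legendre quadrature.
    mu, w = leggauss(npts)
    return np.array([np.sum(w * f(mu) * Legendre.basis(l)(mu)) for l in range(lmax + 1)])

moments = legendre_moments(kernel, lmax=7)
print(moments / moments[0])  # slow decay of the normalised moments reflects the forward peak
```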

  8. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Inc., Paso Robels, CA; Fluer, Larry (Fluer, Inc., Paso Robels, CA); Middleton, Bobby

    2009-03-01

The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  9. Are industry codes and standards a valid cost containment approach

    International Nuclear Information System (INIS)

    Rowley, C.W.; Simpson, G.T.; Young, R.K.

    1990-01-01

The nuclear industry has historically concentrated on safety design features for many years, but recently has been shifting to the reliability of the operating systems and components. The Navy has already gone through this transition and has found that Reliability Centered Maintenance (RCM) is an invaluable tool to improve the reliability of components, systems, ships, and classes of ships. There is a close correlation of Navy ships and equipment to commercial nuclear power plants and equipment. The Navy has a central engineering and configuration management organization (Naval Sea Systems Command) for over 500 ships, whereas the over 100 commercial nuclear power plants and 52 nuclear utilities represent a fragmented owner/management structure. This paper suggests that the results of the application of RCM in the Navy can be duplicated to a large degree in the commercial nuclear power industry by the development and utilization of nuclear codes and standards

  10. Arbitrariness is not enough: towards a functional approach to the genetic code.

    Science.gov (United States)

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently, it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than the nucleic base chains. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only one possible, and we propose a more appropriate model from a semiotic point of view.

  11. Codes and standards an European point of view

    International Nuclear Information System (INIS)

    Roche, R.L.; Corsi, F.

    1987-01-01

The first part of this paper is related to the European situation in which Construction Codes for FBR components are developed. Attention is given to the different agreements between European countries. After a description of the present state of Code development, indications are given on the future work in this field. Several appendices are devoted to the state of Codes in different European countries and to the action of the European Commission

  12. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1982-01-01

A compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards is presented. Condensed data from individual code portions are listed according to reference code, section, paragraph and page. A glossary of letter initials/abbreviations for the organizations or documents whose codes or standards are contained in this Cross-Index is listed

  13. 1 CFR 21.14 - Deviations from standard organization of the Code of Federal Regulations.

    Science.gov (United States)

    2010-01-01

Extracted from 1 CFR, General Provisions (2010 edition), Codification, General Numbering, § 21.14 (Deviations from standard organization of the Code of Federal Regulations): (a) Any deviation from standard Code of Federal Regulations designations must be approved in advance...

  14. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  15. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, Andrew [New Mexico State Univ., Las Cruces, NM (United States)

    2013-12-30

The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stove pipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.

  16. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1981-01-01

This Cross-Index volume is the 1981 compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards and is revised yearly to provide information from current codes. Condensed data from individual code portions are listed according to reference code, section, paragraph and page. Each code is given a two-digit reference code number or letter in the Contents section (pages C to L) of this volume. This reference code provides ready identification of any code listed in the Cross-Index. The computerized information listings are on the left-hand portion of the Cross-Index page; to the right of each listing, in order, are the reference code letters or numbers and the section, paragraph and page of the referenced code containing expanded information on the individual listing

  17. The "periodic table" of the genetic code: A new way to look at the code and the decoding process.

    Science.gov (United States)

    Komar, Anton A

    2016-01-01

    Henri Grosjean and Eric Westhof recently presented an information-rich, alternative view of the genetic code, which takes into account current knowledge of the decoding process, including the complex nature of interactions between mRNA, tRNA and rRNA that take place during protein synthesis on the ribosome, and it also better reflects the evolution of the code. The new asymmetrical circular genetic code has a number of advantages over the traditional codon table and the previous circular diagrams (with a symmetrical/clockwise arrangement of the U, C, A, G bases). Most importantly, all sequence co-variances can be visualized and explained based on the internal logic of the thermodynamics of codon-anticodon interactions.

  18. The Codex standard and code for irradiated foods

    International Nuclear Information System (INIS)

    Erwin, L.

    1985-01-01

    A brief background on the work by the Codex Alimentarius Commission on irradiated foods is given. An Australian model food standard for irradiated foods, based on the Codex standard, is being developed

  19. Development of standards, codes of practice and guidelines at the national level

    International Nuclear Information System (INIS)

    Swindon, T.N.

    1989-01-01

    Standards, codes of practice and guidelines are defined and their different roles in radiation protection specified. The work of the major bodies that develop such documents in Australia - the National Health and Medical Research Council and the Standards Association of Australia - is discussed. The codes of practice prepared under the Environment Protection (Nuclear Codes) Act, 1978, an act of the Australian Federal Parliament, are described and the guidelines associated with them outlined. 5 refs

  20. How American Nurses Association Code of Ethics informs genetic/genomic nursing.

    Science.gov (United States)

    Tluczek, Audrey; Twal, Marie E; Beamer, Laura Curr; Burton, Candace W; Darmofal, Leslie; Kracun, Mary; Zanni, Karen L; Turner, Martha

    2018-01-01

    Members of the Ethics and Public Policy Committee of the International Society of Nurses in Genetics prepared this article to assist nurses in interpreting the American Nurses Association (2015) Code of Ethics for Nurses with Interpretive Statements (Code) within the context of genetics/genomics. The Code explicates the nursing profession's norms and responsibilities in managing ethical issues. The nearly ubiquitous application of genetic/genomic technologies in healthcare poses unique ethical challenges for nursing. Therefore, authors conducted literature searches that drew from various professional resources to elucidate implications of the code in genetic/genomic nursing practice, education, research, and public policy. We contend that the revised Code coupled with the application of genomic technologies to healthcare creates moral obligations for nurses to continually refresh their knowledge and capacities to translate genetic/genomic research into evidence-based practice, assure the ethical conduct of scientific inquiry, and continually develop or revise national/international guidelines that protect the rights of individuals and populations within the context of genetics/genomics. Thus, nurses have an ethical responsibility to remain knowledgeable about advances in genetics/genomics and incorporate emergent evidence into their work.

  1. OECD International Standard Problem number 34. Falcon code comparison report

    International Nuclear Information System (INIS)

    Williams, D.A.

    1994-12-01

    ISP-34 is the first ISP to address fission product transport issues and has been strongly supported by a large number of different countries and organisations. The ISP is based on two experiments, FAL-ISP-1 and FAL-ISP-2, which were conducted in AEA's Falcon facility. Specific features of the experiments include quantification of chemical effects and aerosol behaviour. In particular, multi-component aerosol effects and vapour-aerosol interactions can all be investigated in the Falcon facility. Important parameters for participants to predict were the deposition profiles and composition, key chemical species and reactions, evolution of suspended material concentrations, and the effects of steam condensation onto aerosols and particle hygroscopicity. The results of the Falcon ISP support the belief that aerosol physics is generally well modelled in primary circuit codes, but the chemistry models in many of the codes need to be improved, since chemical speciation is one of the main factors which controls transport and deposition behaviour. The importance of chemical speciation, aerosol nucleation, and the role of multi-component aerosols in determining transport and deposition behaviour are evident. The role of re-vaporization in these Falcon experiments is not clear; it is not possible to compare those codes which predicted re-vaporization with quantitative data. The evidence from this ISP exercise indicates that the containment codes can predict thermal-hydraulics conditions satisfactorily. However, the differences in the predicted aerosol locations in the Falcon tests had shown that aerosol behaviour was very susceptible to parameters such as particle size distribution

  2. Junk DNA and the long non-coding RNA twist in cancer genetics

    NARCIS (Netherlands)

    H. Ling (Hui); K. Vincent; M. Pichler; R. Fodde (Riccardo); I. Berindan-Neagoe (Ioana); F.J. Slack (Frank); G.A. Calin (George)

    2015-01-01

The central dogma of molecular biology states that the flow of genetic information moves from DNA to RNA to protein. However, in the last decade this dogma has been challenged by new findings on non-coding RNAs (ncRNAs) such as microRNAs (miRNAs). More recently, long non-coding RNAs

  3. Community standards for genomic resources, genetic conservation, and data integration

    Science.gov (United States)

    Jill Wegrzyn; Meg Staton; Emily Grau; Richard Cronn; C. Dana Nelson

    2017-01-01

    Genetics and genomics are increasingly important in forestry management and conservation. Next generation sequencing can increase analytical power, but still relies on building on the structure of previously acquired data. Data standards and data sharing allow the community to maximize the analytical power of high throughput genomics data. The landscape of incomplete...

  4. Codes and standards and other guidance cited in regulatory documents. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Ankrum, A.; Nickolaus, J.; Vinther, R.; Maguire-Moffitt, N.; Hammer, J.; Sherfey, L.; Warner, R. [Pacific Northwest Lab., Richland, WA (United States)

    1994-08-01

As part of the US Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program, Pacific Northwest Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. In addition to updating previous information, Revision 1 adds citations from the NRC Inspection Manual and the Improved Standard Technical Specifications. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Generic Letters, Policy Statements, Regulatory Guides, and the Standard Review Plan (NUREG-0800).

  5. Codes and standards and other guidance cited in regulatory documents. Revision 1

    International Nuclear Information System (INIS)

    Ankrum, A.; Nickolaus, J.; Vinther, R.; Maguire-Moffitt, N.; Hammer, J.; Sherfey, L.; Warner, R.

    1994-08-01

    As part of the US Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program, Pacific Northwest Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. In addition to updating previous information, Revision 1 adds citations from the NRC Inspection Manual and the Improved Standard Technical Specifications. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Generic Letters, Policy Statements, Regulatory Guides, and the Standard Review Plan (NUREG-0800)

  6. 77 FR 34020 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2012-06-08

Table fragment from the notice; recoverable entries include a standard on carbon monoxide (CO) detection and warning equipment, NFPA 790--2012 (Standard for Competency of Third-Party Field ..., deadline 6/22), a standard on health-related fitness programs for fire department members (deadline 1/4/2013), and NFPA 1584--2008 (Standard on the ...).

  7. National Society of Genetic Counselors Code of Ethics: Explication of 2017 Revisions.

    Science.gov (United States)

    Senter, Leigha; Bennett, Robin L; Madeo, Anne C; Noblin, Sarah; Ormond, Kelly E; Schneider, Kami Wolfe; Swan, Kelli; Virani, Alice

    2018-02-01

    The Code of Ethics (COE) of the National Society of Genetic Counselors (NSGC) was adopted in 1992 and was later revised and adopted in 2006. In 2016, the NSGC Code of Ethics Review Task Force (COERTF) was convened to review the COE. The COERTF reviewed ethical codes written by other professional organizations and suggested changes that would better reflect the current and evolving nature of the genetic counseling profession. The COERTF received input from the society's legal counsel, Board of Directors, and members-at-large. A revised COE was proposed to the membership and approved and adopted in April 2017. The revisions and rationale for each are presented.

  8. Numerical analysis and nuclear standard code application to thermal fatigue

    International Nuclear Information System (INIS)

    Merola, M.

    1992-01-01

The present work describes the Joint Research Centre Ispra contribution to the IAEA benchmark exercise 'Lifetime Behaviour of the First Wall of Fusion Machines'. The results of the numerical analysis of the reference thermal fatigue experiment are presented. A discussion of the numerical analysis of thermal stress follows, pointing out its particular aspects in view of their influence on the stress field evaluation. As far as the design-allowable numbers of cycles are concerned, the American nuclear code ASME and the French code RCC-MR are applied, and the reasons for the different results obtained are investigated. As regards a realistic fatigue lifetime evaluation, the main problems to be solved are brought out. This work is intended as a preliminary basis for a discussion focusing on the main characteristics of the thermal fatigue problem from both a numerical and a lifetime assessment point of view. In fact, the present margin of discretion left to the analyst may cause undue discrepancies in the results obtained. A sensitivity analysis of the main parameters involved is desirable, and more precise design procedures should be stated

  9. Efficient Dual Domain Decoding of Linear Block Codes Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed Azouaoui

    2012-01-01

Full Text Available A computationally efficient algorithm for decoding block codes is developed using a genetic algorithm (GA). The proposed algorithm uses the dual code, in contrast to the existing genetic decoders in the literature that use the code itself. Hence, this new approach reduces the complexity of decoding high-rate codes. We simulated our algorithm over various transmission channels. The performance of this algorithm is investigated and compared with competing decoding algorithms, including those of Maini and Shakeel. The results show that the proposed algorithm gives large gains over the Chase-2 decoding algorithm and reaches the performance of OSD-3 for some quadratic residue (QR) codes. Further, we define a new crossover operator that exploits domain-specific information and compare it with uniform and two-point crossover. The complexity of this algorithm is also discussed and compared to that of other algorithms.
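    The sketch below is not the dual-domain decoder of the paper; it is a minimal genetic-algorithm decoder for a toy (7,4) Hamming code, included only to make the general approach concrete. Chromosomes are 4-bit information vectors, fitness is the correlation of the re-encoded BPSK codeword with the received soft values, and the population sizes, rates and generator matrix are illustrative choices.

```python
import random
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])   # generator matrix of a (7,4) Hamming code

def encode(info):
    return np.mod(info @ G, 2)

def fitness(info, received):
    # Higher is better: correlation of the BPSK codeword (+1/-1) with the received soft values.
    return float(np.dot(1 - 2 * encode(info), received))

def ga_decode(received, pop_size=20, generations=30, pmut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [np.array([rng.randint(0, 1) for _ in range(4)]) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, received), reverse=True)
        next_pop = pop[:2]                          # elitism: keep the two fittest
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:10], 2)          # parents drawn from the better half
            cut = rng.randint(1, 3)                 # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            for i in range(4):                      # bit-flip mutation
                if rng.random() < pmut:
                    child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    best = max(pop, key=lambda c: fitness(c, received))
    return best, encode(best)

# Noisy BPSK reception of the all-zero codeword (+1 per bit plus Gaussian noise).
rng = np.random.default_rng(0)
received = 1.0 + 0.8 * rng.standard_normal(7)
print(ga_decode(received))
```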

  10. Cross-Index to DOE-prescribed industrial safety codes and standards

    International Nuclear Information System (INIS)

    1980-01-01

This Cross-Index volume is the 1980 compilation of detailed information from more than two hundred and ninety Department of Energy (DOE) prescribed or Occupational Safety and Health Administration (OSHA) referenced industrial safety codes and standards. The compilation of this material was conceived and initiated in 1973, and is revised yearly to provide information from current codes. Condensed data from individual code portions are listed according to reference code, section, paragraph, and page. Each code is given a two-digit reference code number or letter in the Contents section. This reference code provides ready identification of any code listed in the Cross-Index. The computerized information listings are on the left-hand portion of the Cross-Index page; to the right of each listing, in order, are the reference code letters or numbers and the section, paragraph, and page of the referenced code containing expanded information on the individual listing. Simplified "How to Use" directions are listed. A glossary of letter initials/abbreviations for the organizations or documents whose codes or standards are contained in this Cross-Index is included

  11. 77 FR 67628 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2012-11-13

Table fragment from the notice; recoverable entries include the Standard on Fire and Life Safety in Animal Housing Facilities (deadline 7/8/2013) and NFPA 160--2011, Standard for the Use ... . NFPA documents are revised every ... five years in Revision Cycles that begin twice each year and take approximately two years to complete. Each Revision Cycle proceeds according to a published schedule that includes final dates for all major...

  12. 76 FR 70414 - National Fire Protection Association (NFPA) Proposes To Revise Codes and Standards

    Science.gov (United States)

    2011-11-14

Table fragment from the notice; recoverable entries include a standard covering Commercial Cooking Operations, NFPA 99--2012 (Health Care Facilities Code, 6/22/2012), NFPA 99B--2010 (Standard ...), a document on Explosion Investigations (1/4/2012), NFPA 1005--2007 (Standard for Professional Qualifications for Marine Fire Fighting for Land-Based Fire Fighters, 1/4/2012), and NFPA 1021--2009 (Standard for Fire Officer Professional ...).

  13. Interband coding extension of the new lossless JPEG standard

    Science.gov (United States)

    Memon, Nasir D.; Wu, Xiaolin; Sippy, V.; Miller, G.

    1997-01-01

Due to the perceived inadequacy of current standards for lossless image compression, the JPEG committee of the International Standards Organization (ISO) has been developing a new standard. A baseline algorithm, called JPEG-LS, has already been completed and is awaiting approval by national bodies. The JPEG-LS baseline algorithm, despite being simple, is surprisingly efficient and provides compression performance that is within a few percent of the best and more sophisticated techniques reported in the literature. Extensive experimentation performed by the authors seems to indicate that an overall improvement of more than 10 percent in compression performance will be difficult to obtain, even at the cost of great complexity, at least not with traditional approaches to lossless image compression. However, if we allow inter-band decorrelation and modeling in the baseline algorithm, a nearly 30 percent improvement in compression gains for specific images in the test set becomes possible at a modest computational cost. In this paper we propose and investigate a few techniques for exploiting inter-band correlations in multi-band images. These techniques have been designed within the framework of the baseline algorithm and require minimal changes to the basic architecture of the baseline, retaining its essential simplicity.
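    A minimal sketch of the inter-band idea, under the assumption of two strongly correlated synthetic bands: predicting a pixel from the co-located pixel in the previous band can leave residuals with much lower empirical entropy than a purely intra-band (left-neighbour) predictor. This illustrates inter-band decorrelation in general, not the predictors proposed in the paper or the JPEG-LS baseline context model.

```python
import numpy as np

def residuals_intra(band):
    pred = np.zeros_like(band)
    pred[:, 1:] = band[:, :-1]          # predict each pixel from its left neighbour
    return band - pred

def residuals_inter(band, ref_band):
    return band - ref_band              # predict from the co-located pixel of the previous band

def entropy(x):
    # Empirical (zero-order) entropy of the residual values, in bits per sample.
    vals, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
band1 = rng.integers(0, 256, size=(64, 64)).astype(np.int32)
band2 = band1 + rng.integers(-3, 4, size=band1.shape)   # strongly correlated second band

print("intra-band residual entropy:", entropy(residuals_intra(band2)))
print("inter-band residual entropy:", entropy(residuals_inter(band2, band1)))
```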

  14. Open Genetic Code: on open source in the life sciences

    OpenAIRE

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first ...

  15. Symmetries in Genetic Systems and the Concept of Geno-Logical Coding

    Directory of Open Access Journals (Sweden)

    Sergey V. Petoukhov

    2016-12-01

Full Text Available The genetic code of amino acid sequences in proteins does not allow understanding and modeling of inherited processes such as inborn coordinated motions of living bodies, innate principles of sensory information processing, quasi-holographic properties, etc. To be able to model these phenomena, the concept of geno-logical coding, which is connected with logical functions and Boolean algebra, is put forward. The article describes basic pieces of evidence in favor of the existence of the geno-logical code, which exists in parallel with the known genetic code of amino acid sequences but serves for transferring inherited processes along chains of generations. These pieces of evidence have been obtained through the analysis of symmetries in the structures of molecular-genetic systems. The analysis has revealed a close connection of the genetic system with dyadic groups of binary numbers and with other mathematical objects related to dyadic groups: Walsh functions (which are algebraic characters of dyadic groups), bit-reversal permutations, logical holography, etc. These results provide a new approach for the mathematical modeling of genetic structures, which uses known mathematical formalisms from the technological fields of noise-immune coding of information, binary analysis, logical holography, and digital devices of artificial intelligence. Some opportunities for the development of algebraic-logical biology are opened.
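    The snippet below only illustrates the mathematical objects the abstract invokes: it builds a Sylvester-type Hadamard matrix, whose rows are Walsh functions, and checks that each row is a character of the dyadic group, i.e. w_k(x XOR y) = w_k(x) * w_k(y). It does not reproduce any of the paper's genetic analysis.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction: doubling yields a 2^n x 2^n Hadamard matrix.
    H = np.array([[1]])
    for _ in range(n):
        H = np.block([[H, H], [H, -H]])
    return H

n = 3
H = hadamard(n)          # row k is the Walsh function w_k on {0, ..., 2^n - 1}

# Character property over the dyadic group (bitwise XOR as the group operation).
ok = all(H[k, x ^ y] == H[k, x] * H[k, y]
         for k in range(2 ** n) for x in range(2 ** n) for y in range(2 ** n))
print("dyadic character property holds:", ok)
```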

  16. The Graph, Geometry and Symmetries of the Genetic Code with Hamming Metric

    Directory of Open Access Journals (Sweden)

    Reijer Lenstra

    2015-07-01

Full Text Available The similarity patterns of the genetic code result from similar codons encoding similar messages. We develop a new mathematical model to analyze these patterns. The physicochemical characteristics of amino acids objectively quantify their differences and similarities; the Hamming metric does the same for the 64 codons of the codon set. (Hamming distances equal the number of different codon positions: AAA and AAC are at 1-distance; codons are maximally at 3-distance.) The CodonPolytope, a 9-dimensional geometric object, is spanned by 64 vertices that represent the codons, and the Euclidean distances between these vertices correspond one-to-one with intercodon Hamming distances. The CodonGraph represents the vertices and edges of the polytope; each edge equals a Hamming 1-distance. The mirror reflection symmetry group of the polytope is isomorphic to the largest permutation symmetry group of the codon set that preserves Hamming distances. These groups contain 82,944 symmetries. Many polytope symmetries coincide with the degeneracy and similarity patterns of the genetic code. These code symmetries are strongly related to the face structure of the polytope, with smaller faces displaying stronger code symmetries. Splitting the polytope stepwise into smaller faces models an early evolution of the code that generates this hierarchy of code symmetries. The canonical code represents a class of 41,472 codes with equivalent symmetries; a single class among an astronomical number of symmetry classes comprising all possible codes.
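    A short sketch of the CodonGraph construction described above: 64 codon vertices with an edge wherever two codons differ in exactly one position (Hamming distance 1). Each codon then has 3 x 3 = 9 neighbours, giving 64 x 9 / 2 = 288 edges; the snippet simply verifies this count.

```python
from itertools import product

BASES = "UCAG"
CODONS = ["".join(c) for c in product(BASES, repeat=3)]

def hamming(a, b):
    # Number of positions at which the two codons differ.
    return sum(x != y for x, y in zip(a, b))

edges = [(a, b) for i, a in enumerate(CODONS) for b in CODONS[i + 1:] if hamming(a, b) == 1]

print(len(CODONS), "vertices,", len(edges), "edges")   # expected: 64 vertices, 288 edges
```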

  17. Boltzmann-Fokker-Planck calculations using standard discrete-ordinates codes

    International Nuclear Information System (INIS)

    Morel, J.E.

    1987-01-01

The Boltzmann-Fokker-Planck (BFP) equation can be used to describe both neutral and charged-particle transport. Over the past several years, the author and several collaborators have developed methods for representing Fokker-Planck operators with standard multigroup-Legendre cross-section data. When these data are input to a standard Sn code such as ONETRAN, the code actually solves the Boltzmann-Fokker-Planck equation rather than the Boltzmann equation. This is achieved without any modification to the Sn codes. Because BFP calculations can be more demanding from a numerical viewpoint than standard neutronics calculations, we have found it useful to implement new quadrature methods and convergence acceleration methods in the standard discrete-ordinates code, ONETRAN. We discuss our BFP cross-section representation techniques, our improved quadrature and acceleration techniques, and present results from BFP coupled electron-photon transport calculations performed with ONETRAN. 19 refs., 7 figs

  18. Hydrogen Codes and Standards: An Overview of U.S. DOE Activities

    International Nuclear Information System (INIS)

    James M Ohi

    2006-01-01

    The Hydrogen, Fuel Cells, and Infrastructure Technologies (HFCIT) Program of the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL), with the help of leading standards and model code development organizations, other national laboratories, and key stakeholders, are developing a coordinated and collaborative government-industry effort to prepare, review, and promulgate hydrogen codes and standards needed to expedite hydrogen infrastructure development. The focus of this effort is to put in place a coordinated and comprehensive hydrogen codes and standards program at the national and international levels. This paper updates an overview of the U.S. program to facilitate and coordinate the development of hydrogen codes and standards that was presented by the author at WHEC 15. (authors)

  19. Guidelines on Active Content and Mobile Code: Recommendations of the National Institute of Standards and Technology

    National Research Council Canada - National Science Library

    Jansen, Wayne

    2001-01-01

    .... One such category of technologies is active content. Broadly speaking, active content refers to electronic documents that, unlike past character documents based on the American Standard Code for Information Interchange (ASCII...

  20. Licensing procedure, nuclear codes and standards in the Federal Republic of Germany

    International Nuclear Information System (INIS)

    Schultheiss, G.F.

    1980-01-01

The present paper deals with the legal background of licensing in nuclear technology and atomic energy use, with licensing procedures for nuclear power plants, and with codes, standards and guidelines in the Federal Republic of Germany. (orig./RW)

  1. NODC Standard Format Marine Toxic Substances and Pollutants (F144) chemical identification codes (NODC Accession 9200273)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This archival information package contains a listing of codes and chemical names that were used in NODC Standard Format Marine Toxic Substances and Pollutants (F144)...

  2. Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?

    Science.gov (United States)

    Kubyshkin, Vladimir; Budisa, Nediljko

    2017-08-01

The main goal of synthetic biology (SB) is the creation of biodiversity applicable to biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcomes the limitation of the canonical amino acid repertoire of 20 (+2) prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via the systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e. orthogonality that prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from the recent proposal of Hartman and Smith about the establishment of the genetic code in the RNA world, the authors here map possible biotechnological invasion points for the engineering of bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. IAEA Workshop (Training Course) on Codes and Standards for Sodium Cooled Fast Reactors. Working Material

    International Nuclear Information System (INIS)

    2010-01-01

    The training course consisted of lectures and Q&A sessions. The lectures dealt with the history of the development of Design Codes and Standards for Sodium Cooled Fast Reactors (SFRs) in the respective country, the detailed description of the current design Codes and Standards for SFRs and their application to ongoing Fast Reactor design projects, as well as the ongoing development work and plans for the future in this area. Annex 1 contains the detailed Workshop program

  4. Standard problems to evaluate soil structure interaction computer codes

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Philippacopoulos, A.J.

    1979-01-01

    The seismic response of nuclear power plant structures is often calculated using lumped parameter methods. A finite element model of the structure is coupled to the soil with a spring-dashpot system used to represent the interaction process. The parameters of the interaction model are based on analytic solutions to simple problems which are idealizations of the actual problems of interest. The objective of the work reported in this paper is to compare predicted responses using the standard lumped parameter models with experimental data. These comparisons are shown to be good for a fairly uniform soil system and for loadings which do not result in nonlinear interaction effects such as liftoff. 7 references, 7 figures
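    As a minimal illustration of the lumped-parameter idea, not of the benchmark problems themselves, the sketch below evaluates the steady-state amplification of a single mass on a soil spring-dashpot under harmonic base motion. The mass, stiffness and damping values are hypothetical.

```python
import numpy as np

# Hypothetical lumped-parameter values: structural mass on a soil spring and dashpot.
m, k_s, c_s = 2.0e6, 5.0e8, 1.0e7      # mass [kg], soil spring [N/m], soil dashpot [N*s/m]
wn = np.sqrt(k_s / m)                  # natural frequency of the interaction model [rad/s]
zeta = c_s / (2.0 * np.sqrt(k_s * m))  # equivalent damping ratio

freqs = np.linspace(0.1, 3.0, 7) * wn
r = freqs / wn
# Displacement transmissibility of a base-excited single-DOF spring-dashpot system.
T = np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2))
for f, t in zip(freqs, T):
    print(f"omega = {f:8.1f} rad/s  amplification = {t:5.2f}")
```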

  5. The Impact of Diagnostic Code Misclassification on Optimizing the Experimental Design of Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Steven J. Schrodi

    2017-01-01

Full Text Available Diagnostic codes within electronic health record systems can vary widely in accuracy. It has been noted that the number of instances of a particular diagnostic code monotonically increases with the accuracy of disease phenotype classification. As a growing number of health system databases become linked with genomic data, it is critically important to understand the effect of this misclassification on the power of genetic association studies. Here, I investigate the impact of this diagnostic code misclassification on the power of genetic association studies with the aim to better inform experimental designs using health informatics data. The trade-off between (i) reduced misclassification rates from utilizing additional instances of a diagnostic code per individual and (ii) the resulting smaller sample size is explored, and general rules are presented to improve experimental designs.
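    The sketch below illustrates the trade-off the abstract explores, under assumed numbers: tightening the case definition (more instances of the diagnostic code) lowers the misclassification rate but shrinks the case count. The allele frequencies, misclassification rates, sample sizes and the alpha level are hypothetical, and the normal-approximation power formula is a generic one, not the paper's derivation; which scenario wins depends entirely on the assumed rates.

```python
from scipy.stats import norm

def power_two_prop(p1, p0, n_per_group, alpha=5e-8):
    # Normal-approximation power for a two-proportion (case vs control) allele-frequency test.
    z = norm.ppf(1 - alpha / 2)
    se = ((p1 * (1 - p1) + p0 * (1 - p0)) / n_per_group) ** 0.5
    return float(norm.cdf(abs(p1 - p0) / se - z))

p_case, p_ctrl = 0.32, 0.28               # true risk-allele frequencies (hypothetical)

scenarios = [
    # (label, misclassification rate among labelled cases, cases available)
    (">=1 code instance", 0.30, 10000),
    (">=3 code instances", 0.10, 6000),
    (">=5 code instances", 0.02, 3000),
]
for label, m, n in scenarios:
    p_obs = (1 - m) * p_case + m * p_ctrl  # case group diluted by misclassified controls
    print(f"{label:20s} n={n:5d}  power={power_two_prop(p_obs, p_ctrl, n):.3f}")
```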

  6. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

Full Text Available The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms are distributed to university students personally and via the internet. This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on the new Turkish commercial code and Turkish accounting standards?”, “Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?”, “How do modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?”, “Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?” and “If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?” The research contributes to the literature in several ways. Firstly, the new Turkish commercial code and Turkish accounting standards are current, significant topics for the accounting profession. Furthermore, accounting education provides a basis for implementation in the public and private sectors. Besides, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency. That is definitely a critical concept also in terms of mergers, acquisitions and investments. Stakeholders of today’s business world, such as investors, shareholders, entrepreneurs, auditors and government, are in need of more standardized global accounting principles. Thus, the revision and redesign of accounting education play an important role. The emphasized points also clearly demonstrate the necessity and usefulness of this research.

  7. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

Full Text Available An application of Genetic Programming (an evolutionary computational tool), without and with data standardization, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of the standardization of variables in order to reduce the effect of those with large variance. Recorded data corresponding to the water temperature behavior of the Ebro River, Spain, are used as the analysis case, showing a performance improvement in the developed model when data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this document were applied to estimate the water temperature in 2004, in order to provide evidence of their applicability for forecasting purposes.
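    The sketch below is not the paper's genetic-programming model; it only demonstrates, on synthetic data, why z-score standardization of the predictors can matter for a scale-sensitive learner. A large-variance but weakly informative variable (discharge) dominates raw Euclidean distances in a simple nearest-neighbour regressor, and standardizing removes that effect; all variable names and values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600
air_temp = rng.normal(20.0, 8.0, n)         # informative predictor, small scale [deg C]
discharge = rng.normal(300.0, 120.0, n)     # weak predictor, large variance [m^3/s]
water_temp = 5.0 + 0.6 * air_temp - 0.01 * discharge + rng.normal(0.0, 0.5, n)

X = np.column_stack([air_temp, discharge])
X_train, X_test = X[:500], X[500:]
y_train, y_test = water_temp[:500], water_temp[500:]

def knn_mse(Xtr, Xte, ytr, yte, k=5):
    # Plain k-nearest-neighbour regression with Euclidean distances.
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    pred = ytr[idx].mean(axis=1)
    return float(np.mean((pred - yte) ** 2))

mu, sd = X_train.mean(axis=0), X_train.std(axis=0)   # z-score standardization parameters
print("MSE, raw predictors         :", knn_mse(X_train, X_test, y_train, y_test))
print("MSE, standardized predictors:", knn_mse((X_train - mu) / sd, (X_test - mu) / sd, y_train, y_test))
```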

  8. Safety, codes and standards for hydrogen installations. Metrics development and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Aaron P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dedrick, Daniel E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); San Marchi, Christopher W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-04-01

    Automakers and fuel providers have made public commitments to commercialize light duty fuel cell electric vehicles and fueling infrastructure in select US regions beginning in 2014. The development, implementation, and advancement of meaningful codes and standards is critical to enable the effective deployment of clean and efficient fuel cell and hydrogen solutions in the energy technology marketplace. Metrics pertaining to the development and implementation of safety knowledge, codes, and standards are important to communicate progress and inform future R&D investments. This document describes the development and benchmarking of metrics specific to the development of hydrogen specific codes relevant for hydrogen refueling stations. These metrics will be most useful as the hydrogen fuel market transitions from pre-commercial to early-commercial phases. The target regions in California will serve as benchmarking case studies to quantify the success of past investments in research and development supporting safety codes and standards R&D.

  9. Review of ASME nuclear codes and standards- subcommittee on repairs, replacements, and modifications

    International Nuclear Information System (INIS)

    Mawson, T.J.

    1990-01-01

    As requested by the ASME Board on Nuclear Codes and Standards, the Pressure Vessel Research Committee initiated a project to review Sections III and XI of the ASME Boiler and Pressure Vessel Code for the purposes of improving, clarifying, and simplifying code requirements and providing transition, consistency, and compatibility. The project was organized with six subcommittees to address various Code activities: design; tests and examinations; documentation; quality assurance; repair, replacement and modification; and general requirements. This paper discusses how the subcommittee on repair, replacement and modification was organized to review the repair, replacement and modification requirements of the ASME Boiler and Pressure Vessel Code, Section III and Section XI, for Class 1, 2, 3 and MC components and their supports, as well as other documents of the nuclear industry related to the repair, replacement and modification requirements of the ASME Code.

  10. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.; Glowa, G.; Wren, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Ewig, F. [GRS Koln (Germany); Dickenson, S. [AEAT, (United Kingdom); Billarand, Y.; Cantrel, L. [IPSN (France); Rydl, A. [NRIR (Czech Republic); Royen, J. [OECD/NEA (France)

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I⁻ concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  11. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I⁻ concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  12. 76 FR 70413 - National Fire Protection Association (NFPA): Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2011-11-14

    [Excerpt from a table of NFPA codes and standards open for public comment; the leading and trailing entries are truncated in the source.] ... Private Fire Protection; NFPA 36, Standard for Solvent Extraction Plants; NFPA 52, Vehicular Gaseous Fuel Systems Code; NFPA 67, Guideline on Explosion Protection for Gaseous Mixtures in Pipe Systems; NFPA 68, Standard on Explosion Protection by Deflagration Venting; NFPA 70B, Recommended Practice for Electrical...

  13. R&D for Safety Codes and Standards: Materials and Components Compatibility

    Energy Technology Data Exchange (ETDEWEB)

    Somerday, Brian P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFleur, Chris [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Marchi, Chris San [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-08-01

    This project addresses the following technical barriers from the Safety, Codes and Standards section of the 2012 Fuel Cell Technologies Office Multi-Year Research, Development and Demonstration Plan (section 3.8): (A) safety data and information: limited access and availability; (F) enabling national and international markets requires consistent RCS; and (G) insufficient technical data to revise standards.

  14. Open Genetic Code: on open source in the life sciences.

    Science.gov (United States)

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility in regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question of whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  15. Safety standards, legislation and codes of practice for fuel cell manufacture and operation

    Energy Technology Data Exchange (ETDEWEB)

    Wilcox, C.P.

    1999-07-01

    This report examines safety standards, legislation and codes of practice for fuel cell manufacture and operation in the UK, Europe and internationally. Management of health and safety in the UK is discussed, and the characteristics of phosphoric acid (PAFC), proton exchange membrane (PEM), molten carbonate (MCFC) and solid oxide (SOFC) fuel cells are described. Fuel cell power plant standards and manufacture in the UK, design and operational considerations, end-of-life disposal, automotive fuel cell systems, and fuelling and vehicular concerns are explored, and the relevant standards, legislation and codes of practice are set out in the appendix.

  16. Review and Implementation of the Emerging CCSDS Recommended Standard for Multispectral and Hyperspectral Lossless Image Coding

    Science.gov (United States)

    Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron

    2011-01-01

    A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyperspectral imagery captured by aircraft and satellites. The proposed standard is based on the “Fast Lossless” adaptive linear predictive compressor, and is adapted to better address the constraints of onboard scenarios. In this paper, we present a review of the state of the art in this field, and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the large corpus of test images from the CCSDS-MHDC.
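
    The core of the “Fast Lossless” approach named above is adaptive linear prediction of each sample followed by entropy coding of the prediction residuals. The snippet below is a deliberately simplified, one-dimensional stand-in for that idea (a sign-LMS predictor plus the usual zig-zag mapping of signed residuals); it is not the CCSDS algorithm itself, which predicts each sample from co-located samples in previous spectral bands and defines its own weight-update, mapping and entropy-coding rules:

        import numpy as np

        def adaptive_predict_residuals(samples, order=3, mu=0.01):
            """Sign-LMS adaptive linear predictor; returns integer prediction residuals."""
            w = np.zeros(order)                                        # adaptive predictor weights
            residuals = np.empty(len(samples), dtype=np.int64)
            for n in range(len(samples)):
                context = samples[max(0, n - order):n][::-1]           # most recent sample first
                context = np.pad(context, (0, order - len(context)))   # zero-fill at the start
                pred = int(round(float(w @ context)))
                e = int(samples[n]) - pred                             # lossless residual
                residuals[n] = e
                w = w + mu * np.sign(e) * context                      # sign-LMS weight update
            return residuals

        def map_to_nonnegative(residuals):
            """Zig-zag mapping so signed residuals can feed a Golomb/Rice-style coder."""
            return np.where(residuals >= 0, 2 * residuals, -2 * residuals - 1)

        rng = np.random.default_rng(0)
        signal = (np.cumsum(rng.integers(-2, 3, size=1000)) + 1000).astype(np.int64)
        res = adaptive_predict_residuals(signal)
        print("mean |residual|:", np.abs(res).mean())
        print("max mapped symbol:", map_to_nonnegative(res).max())
        # A decoder reverses the mapping and reruns the same predictor to reconstruct
        # the original samples exactly (lossless).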

  17. [Direct genetic manipulation and criminal code in Venezuela: absolute criminal law void?].

    Science.gov (United States)

    Cermeño Zambrano, Fernando G De J

    2002-01-01

    The legal regulation of genetic biotechnology applied to the human genome is currently of great relevance in Venezuela due to the drafting of an innovative bioethics law in the country's parliament. This article highlights the provisions of Venezuela's 1999 Constitution regarding this subject, as they establish the framework within which the matter will be legally regulated. The article approaches genetic biotechnology applied to the human genome from the standpoint of Venezuelan penal law, highlighting the violent genetic manipulations that have criminal relevance. The subject has gained further importance as a consequence of the reformulation of the Venezuelan Penal Code being discussed by the country's National Assembly. A concise study of the country's penal code is therefore made in this article to better understand which legally protected interests are covered by Venezuelan penal legislation. This last step enables us to identify the penal tools Venezuela has to confront direct genetic manipulations. We equally indicate the existing punitive loophole that should be closed by the penal legislator. In conclusion, this essay concerns criminal policy regarding direct genetic manipulations of the human genome that have not been typified in Venezuelan law, thereby revealing a genetic biotechnology paradise.

  18. Unassigned Codons, Nonsense Suppression, and Anticodon Modifications in the Evolution of the Genetic Code

    NARCIS (Netherlands)

    P.T.S. van der Gulik (Peter); W.D. Hoff (Wouter)

    2011-01-01

    The origin of the genetic code is a central open problem regarding the early evolution of life. Here, we consider two undeveloped but important aspects of possible scenarios for the evolutionary pathway of the translation machinery: the role of unassigned codons in early stages

  19. Recent development in the ASME O and M committee codes, standards, and guides

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1999-01-01

    The ASME O and M Committee continues to expand and update its code, standards, and guides as contained in the ASME OM Code and the ASME OM Standards/Guides. This paper will describe recent changes to these two ASME documents, including technical inquiries, code cases, and the major reformat of the ASME OM Code 1998 Edition. Also two new Parts to the ASME OM S/G will be discussed: OM Part 23 and OM Part 24, which are close to being initially published. A third new Part to the ASME OM S/G has been authorized and has recently started to get organized: Part 26, 'Thermal Calibration of RTDs'. In addition this paper will describe the future plans for these two documents as provided in the O and M Committee Strategic Plan. (author)

  20. A repository of codes of ethics and technical standards in health informatics.

    Science.gov (United States)

    Samuel, Hamman W; Zaïane, Osmar R

    2014-01-01

    We present a searchable repository of codes of ethics and standards in health informatics. It is built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are two-fold; firstly, to maintain intellectual property rights, we index only codes and standards freely available on the internet. Secondly, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Standards Organization (ISO) and the U. S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics by compilation and unified access through the health informatics ethics repository.

  1. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method in which a real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed, to overcome the defect that the gradient descent method easily makes the algorithm fall into local optima during learning. The quantum genetic algorithm (QGA) has good global optimization ability, but the conventional QGA is based on binary coding, and the encoding and decoding processes slow down the calculation. RQGA is therefore introduced to explore the search space, and an improved varied learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm rapidly converges to solutions that satisfy the constraint conditions.
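
    As a minimal sketch of the real-coded idea described above (the toy dataset, network size, and genetic operators are illustrative assumptions, and the quantum-inspired probability-amplitude machinery of the paper is not reproduced), the weight vector of a small feed-forward network can be evolved directly as a real-valued genome, avoiding the binary encoding/decoding step of a conventional QGA:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy regression task: approximate y = sin(x) on [-pi, pi].
        X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
        y = np.sin(X)

        N_HIDDEN = 8
        # Genome layout: W1 (1 x H), b1 (H), W2 (H x 1), b2 (1), all real-valued.
        N_GENES = 3 * N_HIDDEN + 1

        def forward_mse(genes):
            """Mean square training error of the network encoded by a real-valued genome."""
            W1 = genes[:N_HIDDEN].reshape(1, N_HIDDEN)
            b1 = genes[N_HIDDEN:2 * N_HIDDEN]
            W2 = genes[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
            b2 = genes[-1]
            pred = np.tanh(X @ W1 + b1) @ W2 + b2
            return float(np.mean((pred - y) ** 2))

        def tournament(pop, fit):
            i, j = rng.integers(len(pop), size=2)
            return pop[i] if fit[i] < fit[j] else pop[j]

        POP, GENS = 60, 300
        pop = rng.normal(0.0, 1.0, size=(POP, N_GENES))
        for _ in range(GENS):
            fit = np.array([forward_mse(ind) for ind in pop])
            children = [pop[fit.argmin()].copy()]                 # elitism: keep the best genome
            while len(children) < POP:
                a, b = tournament(pop, fit), tournament(pop, fit)
                alpha = rng.random(N_GENES)
                child = alpha * a + (1.0 - alpha) * b             # arithmetic (blend) crossover
                mask = rng.random(N_GENES) < 0.1                  # per-gene mutation probability
                child = child + mask * rng.normal(0.0, 0.2, N_GENES)
                children.append(child)
            pop = np.array(children)
        print("best training MSE:", min(forward_mse(ind) for ind in pop))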

  2. MARS-KS code validation activity through the atlas domestic standard problem

    International Nuclear Information System (INIS)

    Choi, K. Y.; Kim, Y. S.; Kang, K. H.; Park, H. S.; Cho, S.

    2012-01-01

    The 2nd Domestic Standard Problem (DSP-02) exercise using the ATLAS integral effect test data was executed to transfer the integral effect test data to domestic nuclear industries and to contribute to improving the safety analysis methodology for PWRs. A small-break loss-of-coolant accident with a 6-inch break at the cold leg was selected as the target scenario, considering its technical importance and incorporating the interests of the participants. Ten calculation results using the MARS-KS code were collected; the major prediction results were described qualitatively, and the code prediction accuracy was assessed quantitatively using the FFTBM. In addition, special code assessment activities were carried out to identify the areas where model improvement is required in the MARS-KS code. The lessons from DSP-02 and recommendations to code developers are described in this paper. (authors)
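
    The FFTBM mentioned above quantifies code accuracy in the frequency domain. As a rough illustration (the signals below are hypothetical, the method's weighted-frequency figure and per-variable weighting are omitted, and the quoted acceptability threshold is only a commonly cited rule of thumb), the average amplitude can be computed as the ratio of the spectral magnitude of the prediction error to that of the experimental signal:

        import numpy as np

        def fftbm_average_amplitude(exp, calc):
            """Average amplitude (AA) of the FFT-based method: lower means better agreement.

            AA = sum_j |F{calc - exp}(f_j)| / sum_j |F{exp}(f_j)|.
            """
            exp = np.asarray(exp, dtype=float)
            err = np.asarray(calc, dtype=float) - exp
            return np.abs(np.fft.rfft(err)).sum() / np.abs(np.fft.rfft(exp)).sum()

        # Hypothetical measured and calculated system response quantities on a common
        # time grid (e.g. a slowly decaying pressure trace).
        t = np.linspace(0.0, 1000.0, 2048)
        measured = 7.0 + np.exp(-t / 300.0)
        calculated = 7.0 + 1.05 * np.exp(-t / 280.0)
        aa = fftbm_average_amplitude(measured, calculated)
        print(f"AA = {aa:.3f}")   # values around or below ~0.3 are often treated as acceptable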

  3. PERTV: A standard file version of the PERT-V code

    International Nuclear Information System (INIS)

    George, D.C.; LaBauve, R.J.

    1988-02-01

    The PERT-V code, used in two-dimensional perturbation theory, fast reactor analysis, has been modified to accept input data from standard files ISOTXS, GEODST, ZNATDN, NDXSRF, DLAYXS, RTFLUX, and ATFLUX. This modification has greatly reduced the additional input that must be supplied by the user. The new version of PERT-V, PERTV, has all the options of the original code including a plotting capability. 10 refs., 3 figs., 12 tabs

  4. Contributions of the ORNL piping program to nuclear piping design codes and standards

    International Nuclear Information System (INIS)

    Moore, S.E.

    1975-11-01

    The ORNL Piping Program was conceived and established to develop basic information on the structural behavior of nuclear power plant piping components and to prepare this information in forms suitable for use in design analysis and codes and standards. One of the objectives was to develop and qualify stress indices and flexibility factors for direct use in Code-prescribed design analysis methods. Progress in this area is described

  5. Former Yugoslav Republic of Macedonia; Report on Observance of Standards and Codes: Fiscal Transparency Module

    OpenAIRE

    International Monetary Fund

    2006-01-01

    This report summarizes the Observance of Standards and Codes on Fiscal Transparency for the Former Yugoslav Republic of Macedonia. It provides an assessment of fiscal transparency practices in the Former Yugoslav Republic (FYR) of Macedonia in relation to the requirements of the IMF Code of Good Practices on Fiscal Transparency based on discussions with the authorities and other organizations and through a fiscal transparency questionnaire. It also provides recommendations for improving fisca...

  6. Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy

    Energy Technology Data Exchange (ETDEWEB)

    Nakarado, Gary L. [Regulatory Logic LLC, Golden, CO (United States)

    2017-02-22

    The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and support and facilitate the completion by 2012 of the codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.

  7. Analysis of the Current Technical Issues on ASME Code and Standard for Nuclear Mechanical Design(2009)

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2009-11-01

    This report describes the analysis of current revision activities related to the mechanical design issues of the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report covers nuclear materials, the primary system, the secondary system and high temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. KAMC (the ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridge role for the domestic nuclear industry in the application of the ASME Codes.

  8. Report on the Current Technical Issues on ASME Nuclear Code and Standard

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2008-11-01

    This report describes the analysis of current revision activities related to the mechanical design issues of the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report covers nuclear materials, the primary system, the secondary system and high temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. KAMC (the ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridge role for the domestic nuclear industry in the application of the ASME Codes.

  9. Analysis of the Current Technical Issues on ASME Code and Standard for Nuclear Mechanical Design(2009)

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2009-11-15

    This report describes the analysis of current revision activities related to the mechanical design issues of the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report covers nuclear materials, the primary system, the secondary system and high temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. KAMC (the ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridge role for the domestic nuclear industry in the application of the ASME Codes.

  10. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  11. Building America Guidance for Identifying and Overcoming Code, Standard, and Rating Method Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Pamala C.; Halverson, Mark A.

    2013-09-01

    The U.S. Department of Energy’s (DOE) Building America program implemented a new Codes and Standards Innovation (CSI) Team in 2013. The Team’s mission is to assist Building America (BA) research teams and partners in identifying and resolving conflicts between Building America innovations and the various codes and standards that govern the construction of residences. A CSI Roadmap was completed in September, 2013. This guidance document was prepared using the information in the CSI Roadmap to provide BA research teams and partners with specific information and approaches to identifying and overcoming potential barriers to Building America (BA) innovations arising in and/or stemming from codes, standards, and rating methods. For more information on the BA CSI team, please email: CSITeam@pnnl.gov

  12. Overview on pre-harmonization studies conducted by the Working Group on Codes and Standards

    International Nuclear Information System (INIS)

    Guinovart, J.

    1998-01-01

    For more than twenty years, the Working Group on Codes and Standards (WGCS) has been an Advisory Expert Group of the European Commission; three subgroups were formed to consider manufacture and inspection, structural mechanics, and materials topics. The WGCS seeks to promote studies at the pre-harmonisation level, for the clarification and building of consensus in the European Community concerning technical issues relevant to the integrity of safety-related components. It deals with the pre-standardisation process regarding industrial codes whose rules are applicable to the design, construction and operation of NPP components in the European Community.

  13. Review of application code and standards for mechanical and piping design of HANARO fuel test loop

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.

    1998-02-01

    The design and installation of the irradiation test facility for verification testing of fuel performance are very important in connection with maximizing the utilization of HANARO. The HANARO fuel test loop was designed in accordance with the same codes and standards as a nuclear power plant, because the HANARO FTL will be operated at high pressure and temperature comparable to nuclear power plant operating conditions. The objective of this study is to confirm the propriety of the codes and standards applied to the mechanical and piping design of the HANARO fuel test loop and to determine the technical specifications of the FTL systems. (author). 18 refs., 8 tabs., 6 figs.

  14. Japanese national project for establishment of codes and standards for stationary PEFC system

    International Nuclear Information System (INIS)

    Sumi, S.; Ohmura, T.; Yamaguchi, R.; Kikuzawa, H.

    2003-01-01

    For the purpose of practical utilization of the PEFC cogeneration system, we are promoting the national project 'Establishment of Codes and Standards for Stationary PEFC System'. The objective is to prepare the software platforms required for widespread use in the introduction stage of PEFC cogeneration systems, such as codes and standards for safety, reliability, performance and so on. To this end, using test samples of the systems and the stacks, the development of test and evaluation devices, the collection of various kinds of data and the establishment of test and evaluation methods are under way. (author)

  15. 3D video coding: an overview of present and upcoming standards

    Science.gov (United States)

    Merkle, Philipp; Müller, Karsten; Wiegand, Thomas

    2010-07-01

    An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, the H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats, standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.

  16. Evaluation on applicability of the rules, regulations, and industrial codes and standards for SMART development

    International Nuclear Information System (INIS)

    Choi, Suhn; Lee, C C.; Lee, C.K.; Kim, K.K.; Kim, J.P.; Kim, J.H.; Cho, B.H.; Kang, D J.; Bae, G.H.; Chung, M.; Chang, M.H.

    1999-03-01

    In this report, an evaluation of the applicability of the rules, regulations, and industrial codes and standards to SMART has been made. As the first step, the past-to-present status of licensing structures was reviewed. Then, the rules, regulations, and standards applied to YGN 3-6 were listed and reviewed. Finally, an evaluation of the applicability of such rules and standards to SMART was made for each design field. During this step, technical evaluations of each item of the rules, regulations and standards were made and possible remedies or comments are suggested. The results are summarized in tabular form and enclosed as an Appendix. (Author). 8 refs., 5 tabs., 3 figs

  17. Towards A Genetic Business Code For Growth in the South African Transport Industry

    Directory of Open Access Journals (Sweden)

    J.H. Vermeulen

    2003-11-01

    Full Text Available As with each living organism, it is proposed that an organisation possesses a genetic code. In the fast-changing business environment it would be invaluable to know what constitutes organisational growth and success in terms of such a code. To identify this genetic code, a quantitative methodological framework, supplemented by a qualitative approach, was used and the views of top management in the Transport Industry were solicited. The Repertory Grid was used as the primary data-collection method. Through a phased data-analysis process an integrated profile of first- and second-order constructs, and their opposite poles, was compiled. By utilising deductive and inductive strategies, three strands of a Genetic Business Growth Code were identified, namely a Leadership Strand, an Organisational Architecture Strand and an Internal Orientation Strand. The study confirmed the value of a Genetic Business Code for growth in the Transport Industry. Summary: It is proposed that an organisation, like every living organism, possesses a genetic code. In the fast-changing business environment it would be invaluable to know what drives organisational growth and success. A quantitative methodological framework, supplemented by a qualitative approach, was used to identify this genetic code, and the views of top management in the Transport Industry were gathered using the Repertory Grid as the principal data-collection method. An integrated profile of first- and second-order constructs, with their opposite poles, was compiled. Three strands of a Genetic Business Growth Code, namely a Leadership Strand, an Organisational Architecture Strand and an Internal Orientation Strand, were identified using deductive and inductive strategies. The study confirms the value of a Genetic Business Code for growth in the Transport Industry.

  18. Changing priorities of codes and standards -- quality engineering: Experiences in plant construction, maintenance, and operation

    International Nuclear Information System (INIS)

    Antony, D.D.; Suleski, P.F.; Meier, J.C.

    1994-01-01

    Application of the ASME Code across various fossil and nuclear plants necessitates a Company approach adapted to the unique status of each plant. This arises from State Statutes, Federal Regulations and consideration of each plant's as-built history over a broad time frame of design, construction and operation. Additionally, the National Board Inspection Code accompanies Minnesota Statutes for plants owned by Northern States Power Company. This paper addresses some key points on NSP's use of the ASME Code as a principal mechanical standard in plant design, construction and operation. A primary resource facilitating review of Code provisions is accurate status information on the current plant configuration. As plant design changes arise, the Code Edition/Addenda of original construction and installed upgrades or replacements are considered against the options allowed by current standards and dialog with the Jurisdictional Authority. Consistent with the overall goal of safe and reliable plant operation, there are numerous Code details and future needs to be addressed in concert with expected plant economics and planned outages for implementation. The discussion begins in the late 60's with new construction of Monticello and Prairie Island (both nuclear), continues through Sherburne County Units 1 through 3 (fossil), and covers their changes, replacements or repairs as operating plants.

  19. JAERI thermal reactor standard code system for reactor design and analysis SRAC

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro

    1985-01-01

    SRAC, the JAERI thermal reactor standard code system for reactor design and analysis developed at the Japan Atomic Energy Research Institute, is intended for all types of thermal neutron nuclear design and analysis. The code system has undergone extensive verification to confirm its functions, and has been used in core modification of the research reactor, detailed design of the multi-purpose high temperature gas reactor and analysis of experiments with a critical assembly. In a nuclear calculation with the code system, a multi-group lattice calculation is first made with the libraries. Then, with the resultant homogeneous equivalent group constants, the reactor core calculation is made. The following are described: the purpose and development of the code system, the functions of the SRAC system, benchmark tests, and the usage state and future development. (Mori, K.)

  20. End-of-life decisions in Malaysia: Adequacies of ethical codes and developing legal standards.

    Science.gov (United States)

    Kassim, Puteri Nemie Jahn; Alias, Fadhlina

    2015-06-01

    End-of-life decision-making is an area of medical practice in which ethical dilemmas and legal interventions have become increasingly prevalent. Decisions are no longer confined to clinical assessments; rather, they involve wider considerations such as a patient's religious and cultural beliefs, financial constraints, and the wishes and needs of family members. These decisions affect everyone concerned, including members of the community as a whole. Therefore it is imperative that clear ethical codes and legal standards are developed to help guide the medical profession on the best possible course of action for patients. This article considers the relevant ethical codes and legal provisions in Malaysia governing certain aspects of end-of-life decision-making. It highlights the lack of judicial decisions in this area as well as the limitations of the Malaysian regulatory system. The article recommends the development of comprehensive ethical codes and legal standards to guide end-of-life decision-making in Malaysia.

  1. Up to code: does your company's conduct meet world-class standards?

    Science.gov (United States)

    Paine, Lynn; Deshpandé, Rohit; Margolis, Joshua D; Bettcher, Kim Eric

    2005-12-01

    Codes of conduct have long been a feature of corporate life. Today, they are arguably a legal necessity--at least for public companies with a presence in the United States. But the issue goes beyond U.S. legal and regulatory requirements. Sparked by corruption and excess of various types, dozens of industry, government, investor, and multisector groups worldwide have proposed codes and guidelines to govern corporate behavior. These initiatives reflect an increasingly global debate on the nature of corporate legitimacy. Given the legal, organizational, reputational, and strategic considerations, few companies will want to be without a code. But what should it say? Apart from a handful of essentials spelled out in Sarbanes-Oxley regulations and NYSE rules, authoritative guidance is sorely lacking. In search of some reference points for managers, the authors undertook a systematic analysis of a select group of codes. In this article, they present their findings in the form of a "codex," a reference source on code content. The Global Business Standards Codex contains a set of overarching principles as well as a set of conduct standards for putting those principles into practice. The GBS Codex is not intended to be adopted as is, but is meant to be used as a benchmark by those wishing to create their own world-class code. The provisions of the codex must be customized to a company's specific business and situation; individual companies' codes will include their own distinctive elements as well. What the codex provides is a starting point grounded in ethical fundamentals and aligned with an emerging global consensus on basic standards of corporate behavior.

  2. 75 FR 66725 - National Fire Protection Association (NFPA) Proposes To Revise Codes and Standards

    Science.gov (United States)

    2010-10-29

    [Excerpt from a table of NFPA codes and standards proposed for revision, with public comment closing dates; entries are truncated in the source.] ... Standard for Solvent Extraction Plants (5/23/2011); NFPA 51--2007, Standard for the Design and ... (11/23/2010); ... Practice for a Field Flame Test for Textiles and Films (5/23/2011); NFPA 909--2010, Code for the Protection of...

  3. Building America Guidance for Identifying and Overcoming Code, Standard, and Rating Method Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Cole, P. C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Halverson, M. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    This guidance document was prepared using the input from the meeting summarized in the draft CSI Roadmap to provide Building America research teams and partners with specific information and approaches to identifying and overcoming potential barriers to Building America innovations arising in and/or stemming from codes, standards, and rating methods.

  4. Reliability improvement: where do we go from here. The role of codes and standards

    International Nuclear Information System (INIS)

    Davidson, R.H.

    1976-01-01

    The role of codes and standards in contributing to future reliability improvement is discussed. The Nuclear Plant Reliability Data System is examined. It is suggested that two systems of this type are needed: one system should focus on component and system reliability, while the other should focus on system availability, capacity factor, and forced outage rate assessment.

  5. Adapting Canada's northern infrastructure to climate change: the role of codes and standards

    International Nuclear Information System (INIS)

    Steenhof, P.

    2009-01-01

    This report provides the results of a research project that investigated the use of codes and standards in terms of their potential for fostering adaptation to the future impacts of climate change on built infrastructure in Canada's north. This involved a literature review, key informant interviews, and a workshop where key stakeholders came together to discuss the challenges facing built infrastructure in the north as a result of climate change and the role of codes and standards in helping to mitigate climate change risk. In this article, attention is given to climate data and information requirements related to climate and climate change. This was an important focal area identified through the broader research effort, since adequate data are essential for codes and standards to meet their ultimate policy objective. A number of priorities specific to data and information needs have been identified in the context of this research: there is a need to include northerners in developing the climate and permafrost data required for codes and standards, so that these reflect the unique geographical, economic, and cultural realities and variability of the north; efforts should be undertaken to realign climate design values so that they reflect both present and future risks; there is a need for better information on the rate and extent of permafrost degradation in the north; and there is a need to improve monitoring of the rate of climate change in the Arctic. (author)

  6. Cape Verde Report on the Observance of Standards and Codes : Accounting and Auditing

    OpenAIRE

    World Bank

    2012-01-01

    This Report on the Observance of Standards and Codes (ROSC) provides an assessment of the strengths and weaknesses of the existing financial reporting infrastructure that underpins financial accounting and auditing practices in Cape Verde. The assessment focuses on six pillars of financial reporting infrastructure: statutory framework, professional education and training, accountancy profe...

  7. Discrete-ordinates electron transport calculations using standard neutron transport codes

    International Nuclear Information System (INIS)

    Morel, J.E.

    1979-01-01

    The primary purpose of this work was to develop a method for using standard neutron transport codes to perform electron transport calculations. The method is to develop approximate electron cross sections which are sufficiently well-behaved to be treated with standard S/sub n/ methods, but which nonetheless yield flux solutions which are very similar to the exact solutions. The main advantage of this approach is that, once the approximate cross sections are constructed, their multigroup Legendre expansion coefficients can be calculated and input to any standard S/sub n/ code. Discrete-ordinates calculations were performed to determine the accuracy of the flux solutions for problems corresponding to 1.0-MeV electrons incident upon slabs of aluminum and gold. All S/sub n/ calculations were compared with similar calculations performed with an electron Monte Carlo code, considered to be exact. In all cases, the discrete-ordinates solutions for integral flux quantities (i.e., scalar flux, energy deposition profiles, etc.) are generally in agreement with the Monte Carlo solutions to within approximately 5% or less. The central conclusion is that integral electron flux quantities can be efficiently and accurately calculated using standard S/sub n/ codes in conjunction with approximate cross sections. Furthermore, if group structures and approximate cross section construction are optimized, accurate differential flux energy spectra may also be obtainable without having to use an inordinately large number of energy groups. 1 figure
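
    The report itself constructs specially smoothed approximate cross sections; the snippet below only illustrates the generic preprocessing step named above, namely turning an angular scattering kernel into the Legendre moments that a standard S/sub n/ code accepts as multigroup expansion coefficients. The screened-Rutherford-like kernel and its screening parameter are hypothetical stand-ins, not values from the report:

        import numpy as np
        from numpy.polynomial.legendre import leggauss
        from scipy.special import eval_legendre

        def legendre_moments(kernel, order, n_quad=256):
            """Moments f_l = integral_{-1}^{1} kernel(mu) P_l(mu) dmu for l = 0..order."""
            mu, w = leggauss(n_quad)          # Gauss-Legendre nodes and weights on [-1, 1]
            f = kernel(mu)
            return np.array([np.sum(w * f * eval_legendre(l, mu)) for l in range(order + 1)])

        # A screened-Rutherford-like, forward-peaked angular kernel; eta is a
        # hypothetical screening parameter, not a value taken from the report.
        eta = 0.05
        kernel = lambda mu: eta * (1.0 + eta) / (1.0 + 2.0 * eta - mu) ** 2

        moments = legendre_moments(kernel, order=7)
        # A standard Sn code reconstructs the kernel from such coefficients as
        # sum_l (2l + 1)/2 * f_l * P_l(mu), so these moments are what would be
        # tabulated in the multigroup library.
        print(np.round(moments / moments[0], 4))   # normalized moments f_l / f_0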

  8. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
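
    As a minimal sketch of the calibration loop described in this abstract (the system response quantities, weights, parameter ranges, and the stand-in model are all hypothetical, and no RELAP5 run is actually launched), a fitness function can combine normalized discrepancies over several SRQs while a simple real-coded GA searches the space of uncertain input parameters:

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(sim_srqs, exp_srqs, weights):
            """Weighted sum of normalized SRQ discrepancies (to be minimized)."""
            total = 0.0
            for name, w in weights.items():
                total += w * abs(sim_srqs[name] - exp_srqs[name]) / abs(exp_srqs[name])
            return total

        # Hypothetical experimental targets and weighting factors for two SRQs.
        exp_srqs = {"max_flow_rate": 0.67, "oscillation_period": 15.3}
        weights = {"max_flow_rate": 0.6, "oscillation_period": 0.4}

        def run_code(params):
            """Stand-in for a thermal-hydraulic code run mapping two uncertain inputs
            (e.g. a form-loss coefficient and a heat-transfer multiplier) to SRQs."""
            k_loss, htc_mult = params
            return {"max_flow_rate": htc_mult / (1.0 + k_loss),
                    "oscillation_period": 10.0 + 8.0 * k_loss / htc_mult}

        # Minimal real-coded GA over the two calibrated input parameters.
        lo, hi = np.array([0.0, 0.5]), np.array([2.0, 1.5])
        pop = rng.uniform(lo, hi, size=(40, 2))
        for _ in range(100):
            scores = np.array([fitness(run_code(p), exp_srqs, weights) for p in pop])
            parents = pop[np.argsort(scores)[:20]]                     # truncation selection
            children = parents[rng.integers(20, size=40)] + rng.normal(0.0, 0.05, (40, 2))
            children[0] = parents[0]                                   # elitism
            pop = np.clip(children, lo, hi)
        best = min(pop, key=lambda p: fitness(run_code(p), exp_srqs, weights))
        print("calibrated inputs:", np.round(best, 3))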

  9. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  10. ASME nuclear codes and standards: Scope of coverage and current initiatives

    International Nuclear Information System (INIS)

    Eisenberg, G. M.

    1995-01-01

    The objective of this paper is to address the broad scope of coverage of nuclear codes, standards and guides produced and administered by the American Society of Mechanical Engineers (ASME). Background information is provided regarding the evolution of the present activities. Details are provided on current initiatives intended to permit ASME to meet the needs of a changing nuclear industry on a worldwide scale. During the early years of commercial nuclear power, ASME produced a code for the construction of nuclear vessels used in the reactor coolant pressure boundary, containment and auxiliary systems. In response to industry growth, ASME Code coverage soon broadened to include rules for construction of other nuclear components, and inservice inspection of nuclear reactor coolant systems. In the years following this, the scope of ASME nuclear codes, standards and guides has been broadened significantly to include air cleaning activities for nuclear power reactors, operation and maintenance of nuclear power plants, quality assurance programs, cranes for nuclear facilities, qualification of mechanical equipment, and concrete reactor vessels and containments. ASME focuses on globalization of its codes, standards and guides by encouraging and promoting their use in the international community and by actively seeking participation of international members on its technical and supervisory committees and in accreditation activities. Details are provided on current international representation. Initiatives are underway to separate the technical requirements from administrative and enforcement requirements, to convert to hard metric units, to provide for non-U. S. materials, and to provide for translations into non-English languages. ASME activity as an accredited ISO 9000 registrar for suppliers of mechanical equipment is described. Rules are being developed for construction of containment systems for nuclear spent fuel and high-level waste transport packagings. Intensive

  11. Harmonization of nuclear codes and standards, pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, S.S.

    2006-01-01

    Full text: The codes and standards, both at the national and international level, have had a major impact on the industry worldwide and have served it well in maintaining the performance and safety of nuclear reactors and facilities. The codes and standards are, in general, consensus documents and seek public input at various levels before they are finalized and rolled out for use by nuclear vendors, consultants, utilities and regulatory bodies. However, if unchecked against the global environment and trade agreements (NAFTA, WTO, etc.), the extensive development of prescriptive national standards can also create barriers and make it difficult to compete in the world market. During the last decade, the national and international standards writing bodies have recognized these issues and are moving more towards the rationalization and harmonization of their standards with the more widely accepted generic standards. The Pacific Nuclear Council (PNC) recognized the need for harmonization of the nuclear codes and standards for its member countries and formed a Task Group to achieve its objectives. The Task Group has a number of members from the PNC member countries. In 2005 PNC further raised the importance of this activity and formed a Working Group to cover a broader scope. The Working Group (WG) mandate is to identify and analyze the different codes and standards introduced to the Pacific Basin region, in order to achieve mutual understanding, harmonization and application in each country. This requires the WG to develop and encourage the use of reasonably consistent criteria for the design and development, engineering, procurement, fabrication, construction, testing, operations, maintenance, waste management, decommissioning and the management of the commercial nuclear power plants in the Pacific Basin so as to: Promote consistent safety, quality, environmental and management standards for nuclear energy and other peaceful applications of nuclear

  12. Interdependence, Reflexivity, Fidelity, Impedance Matching, and the Evolution of Genetic Coding

    Science.gov (United States)

    Carter, Charles W; Wills, Peter R

    2018-01-01

    Abstract Genetic coding is generally thought to have required ribozymes whose functions were taken over by polypeptide aminoacyl-tRNA synthetases (aaRS). Two discoveries about aaRS and their interactions with tRNA substrates now furnish a unifying rationale for the opposite conclusion: that the key processes of the Central Dogma of molecular biology emerged simultaneously and naturally from simple origins in a peptide•RNA partnership, eliminating the epistemological utility of a prior RNA world. First, the two aaRS classes likely arose from opposite strands of the same ancestral gene, implying a simple genetic alphabet. The resulting inversion symmetries in aaRS structural biology would have stabilized the initial and subsequent differentiation of coding specificities, rapidly promoting diversity in the proteome. Second, amino acid physical chemistry maps onto tRNA identity elements, establishing reflexive, nanoenvironmental sensing in protein aaRS. Bootstrapping of increasingly detailed coding is thus intrinsic to polypeptide aaRS, but impossible in an RNA world. These notions underline the following concepts that contradict gradual replacement of ribozymal aaRS by polypeptide aaRS: 1) aaRS enzymes must be interdependent; 2) reflexivity intrinsic to polypeptide aaRS production dynamics promotes bootstrapping; 3) takeover of RNA-catalyzed aminoacylation by enzymes will necessarily degrade specificity; and 4) the Central Dogma’s emergence is most probable when replication and translation error rates remain comparable. These characteristics are necessary and sufficient for the essentially de novo emergence of a coupled gene–replicase–translatase system of genetic coding that would have continuously preserved the functional meaning of genetically encoded protein genes whose phylogenetic relationships match those observed today. PMID:29077934

  13. Analysis of genetic code ambiguity arising from nematode-specific misacylated tRNAs.

    Directory of Open Access Journals (Sweden)

    Kiyofumi Hamashima

    Full Text Available The faithful translation of the genetic code requires the highly accurate aminoacylation of transfer RNAs (tRNAs). However, it has been shown that nematode-specific V-arm-containing tRNAs (nev-tRNAs) are misacylated with leucine in vitro in a manner that transgresses the genetic code. nev-tRNA(Gly) (CCC) and nev-tRNA(Ile) (UAU), which are the major nev-tRNA isotypes, could theoretically decode the glycine (GGG) codon and isoleucine (AUA) codon as leucine, causing GGG and AUA codon ambiguity in nematode cells. To test this hypothesis, we investigated the functionality of nev-tRNAs and their impact on the proteome of Caenorhabditis elegans. Analysis of the nucleotide sequences in the 3' end regions of the nev-tRNAs showed that they had matured correctly, with the addition of CCA, which is a crucial posttranscriptional modification required for tRNA aminoacylation. The nuclear export of nev-tRNAs was confirmed with an analysis of their subcellular localization. These results show that nev-tRNAs are processed to their mature forms like common tRNAs and are available for translation. However, a whole-cell proteome analysis found no detectable level of nev-tRNA-induced mistranslation in C. elegans cells, suggesting that the genetic code is not ambiguous, at least under normal growth conditions. Our findings indicate that the translational fidelity of the nematode genetic code is strictly maintained, contrary to our expectations, although deviant tRNAs with misacylation properties are highly conserved in the nematode genome.

  14. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
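    As a rough illustration of what such a tool computes (a minimal Python sketch, not the ANT application programming interface; the toy codon table and function names are assumptions for this example), a degenerate codon can be expanded into its concrete codons and checked against a translation table:

    # Illustrative sketch (not the ANT API): expand an IUPAC degenerate codon and
    # report which amino acids it encodes under the standard genetic code.
    from itertools import product

    IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
             "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
             "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

    # A small excerpt of the standard genetic code; a real tool would carry the
    # full 64-codon table (or a user-supplied translation table).
    CODE = {"AAA": "K", "AAC": "N", "AAG": "K", "AAT": "N",
            "GCA": "A", "GCC": "A", "GCG": "A", "GCT": "A",
            "TTC": "F", "TTT": "F", "TAC": "Y", "TAT": "Y"}

    def expand(degenerate_codon):
        """Return every concrete codon represented by a degenerate codon."""
        return ["".join(bases) for bases in product(*(IUPAC[b] for b in degenerate_codon))]

    def encoded_amino_acids(degenerate_codon, table=CODE):
        return {table.get(c, "?") for c in expand(degenerate_codon)}

    print(expand("AAY"))               # ['AAC', 'AAT'] -> both asparagine
    print(encoded_amino_acids("GCN"))  # {'A'} -> four codons, one amino acid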

  15. Design validation of the ITER EC upper launcher according to codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Spaeh, Peter, E-mail: peter.spaeh@kit.edu [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Aiello, Gaetano [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Gagliardi, Mario [Karlsruhe Institute of Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); F4E, Fusion for Energy, Joint Undertaking, Barcelona (Spain); Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Weinhorst, Bastian [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)

    2015-10-15

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis”-approach of the ASME boiler and pressure vessel code section III. - Abstract: The ITER electron cyclotron (EC) upper launcher has passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is in its final design phase now. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to get acceptance from the French regulator. This paper presents typical design validation of the closure plate, which is the vacuum and Tritium barrier and thus a safety relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given as well as a comparison between different design methods, like the “design by rule” and the “design by analysis” approach. Also the selections of proper load specifications and the identification of potential failure modes are covered. In addition to that stress categorizations, analyses

  16. Design validation of the ITER EC upper launcher according to codes and standards

    International Nuclear Information System (INIS)

    Spaeh, Peter; Aiello, Gaetano; Gagliardi, Mario; Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro; Weinhorst, Bastian

    2015-01-01

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis”-approach of the ASME boiler and pressure vessel code section III. - Abstract: The ITER electron cyclotron (EC) upper launcher has passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is in its final design phase now. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to get acceptance from the French regulator. This paper presents typical design validation of the closure plate, which is the vacuum and Tritium barrier and thus a safety relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given as well as a comparison between different design methods, like the “design by rule” and the “design by analysis” approach. Also the selections of proper load specifications and the identification of potential failure modes are covered. In addition to that stress categorizations, analyses

  17. Harmonization of nuclear codes and standards. Pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, Shami

    2008-01-01

    Nuclear codes and standards have been an integral part of the nuclear industry since its inception. As the industry came into the mainstream over the second half of the 20th century, a number of national and international standards were developed to support specific nuclear reactor concepts. These codes and standards have been a key component of the industry's focus on nuclear safety, reliability and quality. Both national and international standards have served the industry well in obtaining public, shareholder and regulatory acceptance. The existing suite of national and international standards is required to support the emerging nuclear renaissance. However, as noted above under the Pacific Nuclear Council (PNC), Multinational Design Evaluation Programme (MDEP) and SMiRT discussions, the time has now come for the codes and standards writing bodies and the industry to take the next step and examine the relevance of the existing suite in view of current needs and challenges. This review must account for the changing global environment, including the global supply chain and regulatory framework, resources, deregulation, free trade, and the industry's need for competitiveness and performance excellence. The Task Group (TG) has made limited progress in this review period, as no additional information on the listing of codes and standards has been received from the members. However, the TG Chair has been successful in obtaining considerable interest from some additional individuals from the member countries. It is important that PNC management seek additional participation from the member countries and ask for their active engagement in the Working Group (WG) and TG activities to achieve its mandate and deliverables. The harmonization of codes and standards is a key area for the emerging nuclear renaissance, and, as noted by a number of international organizations (refer to the MDEP action noted above), these tasks cannot be completed unless we have the right level of resources and

  18. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  19. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  20. Simulation of International Standard Problem No. 44 'KAEVER' experiments on aerosol behaviour with the CONTAIN code

    International Nuclear Information System (INIS)

    Kljenak, I.

    2001-01-01

    Experiments on aerosol behavior in a vapor-saturated atmosphere, which were performed in the KAEVER experimental facility and proposed for the OECD International Standard Problem No. 44, were simulated with the CONTAIN thermal-hydraulic computer code. The purpose of the work was to assess the capability of the CONTAIN code to model aerosol condensation and deposition in the containment of a light-water-reactor nuclear power plant under severe accident conditions. Results for dry and wet aerosol concentrations are presented and analyzed. (author)

  1. The applicability of ALPHA/PHOENIX/ANC nuclear design code system on Korean standard PWR's

    International Nuclear Information System (INIS)

    Lee, Kookjong; Choi, Kie-Yong; Lee, Hae-Chan; Roh, Eun-Rae

    1996-01-01

    For the Korean Standard Nuclear Power Plant (KSNPP), designed based on the Combustion Engineering (CE) System 80, the Westinghouse nuclear design code system ALPHA/PHOENIX/ANC was applied to the follow-up design of the initial and reload cores of the KSNPP. The follow-up design results for Yonggwang Unit 3 Cycles 1 and 2 and Yonggwang Unit 4 Cycle 1 have shown good agreement with the measured data. The assemblywise power distributions have shown less than 2% average differences, and critical boron concentrations have shown less than 20 ppm differences. All the low power physics test parameters are in good agreement. Consequently, the APA design code system can be applied to KSNPP cores. (author)

  2. Inclusion of the fitness sharing technique in an evolutionary algorithm to analyze the fitness landscape of the genetic code adaptability.

    Science.gov (United States)

    Santos, José; Monteagudo, Ángel

    2017-03-27

    The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but it is not clearly determined how it evolved towards its current form. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm makes it straightforward to determine the extent to which the canonical genetic code lies in an area corresponding to a deep local minimum, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal fitness landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not a localized deep minimum of the huge fitness landscape.
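    A minimal sketch of the fitness sharing idea itself, as described in the general evolutionary computation literature (not the authors' implementation; the distance measure, parameter values and toy population below are assumptions for illustration):

    # Minimal sketch of fitness sharing (a standard niching technique).
    # Individuals close to each other in genotype space share their fitness,
    # which keeps the population spread over several niches of the landscape.
    def sharing(distance, sigma_share=5.0, alpha=1.0):
        """Triangular sharing function: 1 at distance 0, 0 beyond sigma_share."""
        return 1.0 - (distance / sigma_share) ** alpha if distance < sigma_share else 0.0

    def shared_fitness(population, raw_fitness, distance):
        """Divide each raw fitness by its niche count (assumes maximization)."""
        shared = []
        for i, ind_i in enumerate(population):
            niche_count = sum(sharing(distance(ind_i, ind_j)) for ind_j in population)
            shared.append(raw_fitness[i] / niche_count)  # niche_count >= 1 (self-sharing)
        return shared

    # Hypothetical codes represented as strings; the distance here is simply the
    # number of codon positions whose amino acid assignment differs.
    def hamming(code_a, code_b):
        return sum(a != b for a, b in zip(code_a, code_b))

    pop = ["AALLK", "AALLR", "GGWWF"]
    raw = [10.0, 9.5, 7.0]
    print(shared_fitness(pop, raw, hamming))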

  3. A novel nuclear genetic code alteration in yeasts and the evolution of codon reassignment in eukaryotes.

    Science.gov (United States)

    Mühlhausen, Stefanie; Findeisen, Peggy; Plessmann, Uwe; Urlaub, Henning; Kollmar, Martin

    2016-07-01

    The genetic code is the cellular translation table for the conversion of nucleotide sequences into amino acid sequences. Changes to the meaning of sense codons would introduce errors into almost every translated message and are expected to be highly detrimental. However, reassignment of single or multiple codons in mitochondria and nuclear genomes, although extremely rare, demonstrates that the code can evolve. Several models for the mechanism of alteration of nuclear genetic codes have been proposed (including "codon capture," "genome streamlining," and "ambiguous intermediate" theories), but with little resolution. Here, we report a novel sense codon reassignment in Pachysolen tannophilus, a yeast related to the Pichiaceae. By generating proteomics data and using tRNA sequence comparisons, we show that Pachysolen translates CUG codons as alanine and not as the more usual leucine. The Pachysolen tRNA(CAG) is an anticodon-mutated tRNA(Ala) containing all major alanine tRNA recognition sites. The polyphyly of the CUG-decoding tRNAs in yeasts is best explained by a tRNA-loss-driven codon reassignment mechanism. Loss of the CUG-tRNA in the ancient yeast was followed by a gradual decrease in the respective codons and subsequent codon capture by tRNAs whose anticodon is not part of the aminoacyl-tRNA synthetase recognition region. Our hypothesis applies to all nuclear genetic code alterations and provides several testable predictions. We anticipate more codon reassignments to be uncovered in existing and upcoming genome projects. © 2016 Mühlhausen et al.; Published by Cold Spring Harbor Laboratory Press.
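    A toy Python sketch of what such a reassignment means for translation (the miniature codon table and the example reading frame are assumptions for illustration, not the authors' data or pipeline):

    # Translating the same reading frame with the standard code and with a
    # Pachysolen-style reassignment of CUG from leucine (L) to alanine (A).
    STANDARD = {"AUG": "M", "CUG": "L", "GCU": "A", "UUU": "F", "UAA": "*"}
    PACHYSOLEN_LIKE = dict(STANDARD, CUG="A")  # single reassigned sense codon

    def translate(rna, table):
        peptide = []
        for i in range(0, len(rna) - 2, 3):
            aa = table.get(rna[i:i + 3], "X")  # 'X' marks codons missing from the toy table
            if aa == "*":
                break
            peptide.append(aa)
        return "".join(peptide)

    orf = "AUGCUGUUUGCUUAA"
    print(translate(orf, STANDARD))          # MLFA
    print(translate(orf, PACHYSOLEN_LIKE))   # MAFA -> every CUG site changes identity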

  4. Verification of thermal-hydraulic computer codes against standard problems for WWER reflooding

    International Nuclear Information System (INIS)

    Alexander D Efanov; Vladimir N Vinogradov; Victor V Sergeev; Oleg A Sudnitsyn

    2005-01-01

    The computational assessment of the behavior of reactor core components under accident conditions is impossible without knowledge of the thermal-hydraulic processes occurring in this case. The adequacy of the results obtained using the computer codes to the real processes is verified by carrying out a number of standard problems. In 2000-2003, three Russian standard problems on WWER core reflooding were carried out, using experiments on the cooldown of a full-height, electrically heated, 37-rod WWER bundle model in regimes of bottom (SP-1), top (SP-2) and combined (SP-3) reflooding. Representatives from eight MINATOM organizations took part in this work, in the course of which 'blind' and post-test calculations were performed using various versions of the RELAP5, ATHLET, CATHARE, COBRA-TF, TRAP and KORSAR computer codes. The paper presents a brief description of the test facility, test section, test scenarios and conditions, as well as the basic results of the computational analysis of the experiments. The analysis of the test data revealed a significantly non-one-dimensional nature of the cooldown and rewetting of heater rods heated to a high temperature in a model bundle. This was most pronounced for top and combined reflooding. The verification of the reflooding computer codes showed that most of them fairly predict the peak rod temperature and the time of bundle cooldown. The exception is provided by the results of calculations with the ATHLET and CATHARE codes. The nature and rate of rewetting front advance in the lower half of the bundle are fairly predicted by practically all the computer codes. The disagreement between the calculations and experimental results for the upper half of the bundle is caused by the difficulties of computational simulation of multidimensional effects by 1-D computer codes. In this regard, the quasi-two-dimensional computer code COBRA-TF offers certain advantages. Overall, the closest

  5. High-Penetration Photovoltaics Standards and Codes Workshop, Denver, Colorado, May 20, 2010: Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, M.; Kroposki, B.; Basso, T.; Lynn, K.; Herig, C.; Bower, W.

    2010-09-01

    Effectively interconnecting high-level penetration of photovoltaic (PV) systems requires careful technical attention to ensuring compatibility with electric power systems. Standards, codes, and implementation have been cited as major impediments to widespread use of PV within electric power systems. On May 20, 2010, in Denver, Colorado, the National Renewable Energy Laboratory, in conjunction with the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), held a workshop to examine the key technical issues and barriers associated with high PV penetration levels with an emphasis on codes and standards. This workshop included building upon results of the High Penetration of Photovoltaic (PV) Systems into the Distribution Grid workshop held in Ontario California on February 24-25, 2009, and upon the stimulating presentations of the diverse stakeholder presentations.

  6. JJ1017 committee report: image examination order codes--standardized codes for imaging modality, region, and direction with local expansion: an extension of DICOM.

    Science.gov (United States)

    Kimura, Michio; Kuranishi, Makoto; Sukenobu, Yoshiharu; Watanabe, Hiroki; Tani, Shigeki; Sakusabe, Takaya; Nakajima, Takashi; Morimura, Shinya; Kabata, Shun

    2002-06-01

    The digital imaging and communications in medicine (DICOM) standard includes parts regarding non-image data, such as image study ordering data and performed procedure data, and is used for sharing information between HIS/RIS and modality systems, which is essential for IHE. To bring such parts of the DICOM standard into force in Japan, a joint committee of JIRA and JAHIS established the JJ1017 management guideline, specifying, for example, which items are legally required in Japan while remaining optional in the DICOM standard. In Japan, the contents of orders from referring physicians for radiographic examinations include details of the examination. Such details are not typically used by referring physicians requesting radiographic examinations in the United States, because radiologists in the United States often determine the examination protocol. The DICOM standard has code tables for examination type, region, and direction for image examination orders. However, this investigation found that, for the reason mentioned above, it does not include items that are sufficiently detailed for use in Japan. To overcome these drawbacks, we have generated the JJ1017 codes for these three code tables, based on the JJ1017 guidelines. This report introduces the JJ1017 code. These codes (the study type codes in particular) must be expandable to keep up with technical advances in equipment. Expansion has two directions: width, for covering more categories, and depth, for specifying the information in more detail (finer categories). The JJ1017 code takes these requirements into consideration and clearly distinguishes between the stem part, as the common term, and the expansion. The stem part of the JJ1017 code partially utilizes the DICOM codes to remain in line with the DICOM standard. This work is an example of how local requirements can be met by using the DICOM standard and extending it.

  7. Regulations, Codes, and Standards (RCS) Template for California Hydrogen Dispensing Stations

    Energy Technology Data Exchange (ETDEWEB)

    Rivkin, C.; Blake, C.; Burgess, R.; Buttner, W.; Post, M.

    2012-11-01

    This report explains the Regulations, Codes, and Standards (RCS) requirements for hydrogen dispensing stations in the State of California. The reports shows the basic components of a hydrogen dispensing station in a simple schematic drawing; the permits and approvals that would typically be required for the construction and operation of a hydrogen dispensing station; and a basic permit that might be employed by an Authority Having Jurisdiction (AHJ).

  8. International standards: the World Organisation for Animal Health Terrestrial Animal Health Code.

    Science.gov (United States)

    Thiermann, A B

    2015-04-01

    This paper provides a description of the international standards contained in the Terrestrial Animal Health Code of the World Organisation for Animal Health (OIE) that relate to the prevention and control of vector-borne diseases. It identifies the rights and obligations of OIE Member Countries regarding the notification of animal disease occurrences, as well as the recommendations to be followed for safe and efficient international trade in animals and their products.

  9. Non-standard model for electron heat transport for multidimensional hydrodynamic codes

    Energy Technology Data Exchange (ETDEWEB)

    Nicolai, Ph.; Busquet, M.; Schurtz, G. [CEA/DAM-Ile de France, 91 - Bruyeres Le Chatel (France)

    2000-07-01

    In simulations of laser-produced plasmas, modeling of heat transport requires an artificial limitation of the standard Spitzer-Haerm fluxes. To improve the treatment of heat conduction, we have developed a multidimensional model which accounts for non-local features of heat transport and the effects of self-generated magnetic fields. This consistent treatment of both mechanisms has been implemented in a two-dimensional radiation-hydrodynamic code. First results indicate good agreement between simulations and experimental data. (authors)
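    For context, a minimal sketch of the classical flux-limiter treatment that such non-local models aim to improve upon (normalized units and a placeholder conductivity constant are assumed; this is not the authors' non-local model):

    # Sharp-cutoff flux limiter applied to a Spitzer-Haerm-like conductive flux.
    import numpy as np

    def spitzer_harm_flux(Te, x, kappa0=1.0):
        """Local conductive flux q_SH = -kappa0 * Te**2.5 * dTe/dx (arbitrary units)."""
        return -kappa0 * Te**2.5 * np.gradient(Te, x)

    def free_streaming_flux(ne, Te):
        """Free-streaming limit, proportional to ne * Te**1.5 in these units."""
        return ne * Te**1.5

    def limited_flux(Te, ne, x, f=0.06, kappa0=1.0):
        """Sharp cutoff: |q| never exceeds f times the free-streaming flux."""
        q_sh = spitzer_harm_flux(Te, x, kappa0)
        q_fs = free_streaming_flux(ne, Te)
        return np.sign(q_sh) * np.minimum(np.abs(q_sh), f * q_fs)

    x = np.linspace(0.0, 1.0, 200)
    Te = 1.0 + 9.0 * np.exp(-((x - 0.2) / 0.05) ** 2)  # steep temperature gradient
    ne = np.full_like(x, 1.0)
    print(limited_flux(Te, ne, x).max())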

  10. Non-standard model for electron heat transport for multidimensional hydrodynamic codes

    International Nuclear Information System (INIS)

    Nicolai, Ph.; Busquet, M.; Schurtz, G.

    2000-01-01

    In simulations of laser-produced plasmas, modeling of heat transport requires an artificial limitation of the standard Spitzer-Haerm fluxes. To improve the treatment of heat conduction, we have developed a multidimensional model which accounts for non-local features of heat transport and the effects of self-generated magnetic fields. This consistent treatment of both mechanisms has been implemented in a two-dimensional radiation-hydrodynamic code. First results indicate good agreement between simulations and experimental data. (authors)

  11. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    OpenAIRE

    Cevdet Kızıl; Ayşe Tansel Çetin; Ahmed Bulunmaz

    2014-01-01

    The aim of this article is to investigate the impact of new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms are distributed to university students personally and via the internet.This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on new Turkish commerc...

  12. Integrating industry nuclear codes and standards into United States Department of Energy facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jacox, J.

    1995-02-01

    Recently the United States Department of Energy (DOE) has mandated that facilities under its jurisdiction use various industry codes and standards developed for civilian power reactors that operate under a U.S. Nuclear Regulatory Commission license. While this is a major step forward in putting all our nuclear facilities under common technical standards, there are always problems associated with implementing such advances. This paper will discuss some of the advantages and problems experienced to date. These include the universal challenge of educating new users of any technical documents, repeating errors made by the NRC-licensed facilities over the years, and some unique problems specific to DOE facilities.

  13. Validation of the ASSERT subchannel code for prediction of CHF in standard and non-standard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Kiteley, J.C.; Carver, M.B.; Zhou, Q.N.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting critical heat flux (CHF) at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is the only tool available to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries. 28 refs., 12 figs

  14. Validation of the assert subchannel code: Prediction of CHF in standard and non-standard Candu bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries.

  15. Advanced Design of Dumbbell-shaped Genetic Minimal Vectors Improves Non-coding and Coding RNA Expression.

    Science.gov (United States)

    Jiang, Xiaoou; Yu, Han; Teo, Cui Rong; Tan, Genim Siu Xian; Goh, Sok Chin; Patel, Parasvi; Chua, Yiqiang Kevin; Hameed, Nasirah Banu Sahul; Bertoletti, Antonio; Patzel, Volker

    2016-09-01

    Dumbbell-shaped DNA minimal vectors lacking nontherapeutic genes and bacterial sequences are considered a stable, safe alternative to viral, nonviral, and naked plasmid-based gene-transfer systems. We investigated novel molecular features of dumbbell vectors aiming to reduce vector size and to improve the expression of noncoding or coding RNA. We minimized small hairpin RNA (shRNA) or microRNA (miRNA) expressing dumbbell vectors in size down to 130 bp generating the smallest genetic expression vectors reported. This was achieved by using a minimal H1 promoter with integrated transcriptional terminator transcribing the RNA hairpin structure around the dumbbell loop. Such vectors were generated with high conversion yields using a novel protocol. Minimized shRNA-expressing dumbbells showed accelerated kinetics of delivery and transcription leading to enhanced gene silencing in human tissue culture cells. In primary human T cells, minimized miRNA-expressing dumbbells revealed higher stability and triggered stronger target gene suppression as compared with plasmids and miRNA mimics. Dumbbell-driven gene expression was enhanced up to 56- or 160-fold by implementation of an intron and the SV40 enhancer compared with control dumbbells or plasmids. Advanced dumbbell vectors may represent one option to close the gap between durable expression that is achievable with integrating viral vectors and short-term effects triggered by naked RNA.

  16. ENDF-UTILITY-CODES, codes to check and standardize data in the Evaluated Nuclear Data File (ENDF)

    International Nuclear Information System (INIS)

    Dunford, Charles L.

    2007-01-01

    1 - Description of program or function: The ENDF Utility Codes include 9 codes to check and standardize data in the Evaluated Nuclear Data File (ENDF). Four programs of this release, GETMAT, LISTEF, PLOTEF and SETMDC, are no longer maintained since release 6.13. The suite of ENDF utility codes includes:
    - CHECKR (version 7.01), a program for checking that an evaluated data file conforms to the ENDF format.
    - FIZCON (version 7.02), a program for checking that an evaluated data file has valid data and conforms to recommended procedures.
    - GETMAT (version 6.13), designed to retrieve one or more materials from an ENDF formatted data file. The output will contain only the selected materials.
    - INTER (version 7.01), which calculates thermal cross sections, g-factors, resonance integrals, fission spectrum averaged cross sections and 14.0 MeV (or other energy) cross sections for major reactions in an ENDF-6 or ENDF-5 format data file.
    - LISTEF (version 6.13), designed to produce summary and annotated listings of a data file in either ENDF-6 or ENDF-5 format.
    - PLOTEF (version 6.13), designed to produce graphical displays of a data file in either ENDF-5 or ENDF-6 format. The form of graphical output depends on the graphical devices available at the installation where this code will be used.
    - PSYCHE (version 7.02), a program for checking the physics content of an evaluated data file. It can recognise the difference between the ENDF-5 and ENDF-6 formats and performs its tests accordingly.
    - SETMDC (version 6.13), a utility program that converts the source decks of programs to different computers (DOS, UNIX, LINUX, VMS, Windows).
    - STANEF (version 7.01), which performs bookkeeping operations on a data file containing one or more material evaluations in ENDF format.
    Version 7.02 of the ENDF Utility Codes corrects all bugs reported to NNDC as of April 1, 2005 and supersedes all previous releases. Three codes, CHECKR, STANEF, and INTER, were actually ported from the 7.01 release

  17. Comparison of the General Electric BWR/6 standard plant design to the IAEA NUSS codes and guides

    International Nuclear Information System (INIS)

    D'Ardenne, W.H.; Sherwood, G.G.

    1985-01-01

    The General Electric BWR/6 Mark III standard plant design meets or exceeds current requirements of published International Atomic Energy Agency (IAEA) Nuclear Safety Standards (NUSS) codes and guides. This conclusion is based on a review of the NUSS codes and guides by General Electric and by the co-ordinated US review of the NUSS codes and guides during their development. General Electric compared the published IAEA NUSS codes and guides with the General Electric design. The applicability of each code and guide to the BWR/6 Mark III standard plant design was determined. Each code or guide was reviewed by a General Electric engineer knowledgeable about the structures, systems and components addressed and the technical area covered by that code or guide. The results of this review show that the BWR/6 Mark III standard plant design meets or exceeds the applicable requirements of the published IAEA NUSS codes and guides. The co-ordinated US review of the IAEA NUSS codes and guides corroborates the General Electric review. In the co-ordinated US review, the USNRC and US industry organizations (including General Electric) review the NUSS codes and guides during their development. This review ensures that the NUSS codes and guides are consistent with the current US government regulations, guidance and regulatory practices, US voluntary industry codes and standards, and accepted US industry design, construction and operational practices. If any inconsistencies are identified, comments are submitted to the IAEA by the USNRC. All US concerns submitted to the IAEA have been resolved. General Electric design reviews and the Final Design Approval (FDA) issued by the USNRC have verified that the General Electric BWR/6 Mark III standard plant design meets or exceeds the current US requirements, guidance and practices. Since these requirements, guidance and practices meet or exceed those of the NUSS codes and guides, so does the General Electric design. (author)

  18. An Order Coding Genetic Algorithm to Optimize Fuel Reloads in a Nuclear Boiling Water Reactor

    International Nuclear Information System (INIS)

    Ortiz, Juan Jose; Requena, Ignacio

    2004-01-01

    A genetic algorithm is used to optimize the nuclear fuel reload for a boiling water reactor; an order coding is proposed for the chromosomes, together with appropriate crossover and mutation operators. The fitness function was designed so that the genetic algorithm creates fuel reloads that satisfy the constraints on the radial power peaking factor, the minimum critical power ratio, and the maximum linear heat generation rate while optimizing the effective multiplication factor at the beginning and end of the cycle. These variables are predicted by a neural network trained on the behavior of a reactor simulator, which greatly decreases the computation time of the search process. We validated this method with data from five cycles of the Laguna Verde Nuclear Power Plant in Mexico.
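    A minimal sketch of one common recombination operator for order-coded (permutation) chromosomes, order crossover (OX); the operator choice and the toy core size are assumptions for illustration, not necessarily those used by the authors:

    # Order-coded chromosome: each gene is the index of a fuel assembly, and each
    # position is a core location, so the chromosome encodes a loading pattern.
    import random

    def order_crossover(parent_a, parent_b):
        """Copy a slice from parent_a, then fill the rest in parent_b's order."""
        n = len(parent_a)
        i, j = sorted(random.sample(range(n), 2))
        child = [None] * n
        child[i:j + 1] = parent_a[i:j + 1]
        remaining = [gene for gene in parent_b if gene not in child]
        for k in range(n):
            if child[k] is None:
                child[k] = remaining.pop(0)
        return child

    def swap_mutation(chromosome, rate=0.1):
        """Swap two positions; the chromosome stays a valid permutation."""
        chromosome = chromosome[:]
        if random.random() < rate:
            a, b = random.sample(range(len(chromosome)), 2)
            chromosome[a], chromosome[b] = chromosome[b], chromosome[a]
        return chromosome

    random.seed(0)
    assemblies = list(range(8))
    p1, p2 = assemblies[:], random.sample(assemblies, len(assemblies))
    print(swap_mutation(order_crossover(p1, p2)))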

  19. SRAC: JAERI thermal reactor standard code system for reactor design and analysis

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Takano, Hideki; Horikami, Kunihiko; Ishiguro, Yukio; Kaneko, Kunio; Hara, Toshiharu.

    1983-01-01

    The SRAC (Standard Reactor Analysis Code) is a code system for nuclear reactor analysis and design. It is composed of neutron cross section libraries and auxiliary processing codes, neutron spectrum routines, a variety of transport and 1-, 2- and 3-D diffusion routines, and dynamic parameter and cell burn-up routines. By making the best use of the individual code functions in the SRAC system, the user can select either the exact method for an accurate estimate of reactor characteristics or the economical method aiming at a shorter computing time, depending on the purpose of the study. The user can select cell or core calculation; fixed source or eigenvalue problem; transport (collision probability or Sn) theory or diffusion theory. Moreover, smearing and collapsing of macroscopic cross sections are done separately at the user's selection, and special attention is paid to double heterogeneity. Various techniques are employed to access the data storage and to optimize the internal data transfer. Benchmark calculations using the SRAC system have been made extensively for the Keff values of various types of critical assemblies (light water, heavy water and graphite moderated systems, and fast reactor systems). The calculated results show good prediction of the experimental Keff values. (author)

  20. Digital Signature Using QR Code with the Advanced Encryption Standard Method

    Directory of Open Access Journals (Sweden)

    Abdul Gani Putra Suratma

    2017-04-01

    A digital signature is a mathematical scheme that uniquely identifies a sender and proves the authenticity of the owner of a message or digital document, so that an authentic (valid) digital signature is sufficient reason for the recipient to believe that a received message or document originates from a known sender. Advances in technology make it possible to use digital signatures for mathematical verification, so that information obtained by one party from another can be identified to ensure the authenticity of the information received. A digital signature is an authentication mechanism that allows the creator of a message to attach a code that acts as their signature. The aim of this research is to apply QR Codes (Quick Response codes) together with the AES (Advanced Encryption Standard) algorithm as a digital signature, so that the result of applying QR Codes with the Advanced Encryption Standard algorithm as a digital signature can serve to authenticate the director's signature and to verify valid goods-collection documents. In this research, the classification accuracy of the QR Codes using a naïve Bayes classifier was 90%, with a positive precision of 80% and a negative precision of 100%.
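    A rough sketch of the general scheme described above, i.e. AES-encrypting a signing payload and embedding the result in a QR code (the pycryptodome and qrcode packages, the EAX mode, and the payload fields are assumptions for illustration, not the authors' implementation):

    import json
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes
    import qrcode

    def make_signature_qr(document_id, signer, key, filename="signature_qr.png"):
        payload = json.dumps({"doc": document_id, "signer": signer}).encode("utf-8")
        cipher = AES.new(key, AES.MODE_EAX)              # EAX gives confidentiality + integrity
        ciphertext, tag = cipher.encrypt_and_digest(payload)
        token = (cipher.nonce + tag + ciphertext).hex()  # what actually goes into the QR code
        qrcode.make(token).save(filename)
        return token

    def verify_signature_token(token, key):
        raw = bytes.fromhex(token)
        nonce, tag, ciphertext = raw[:16], raw[16:32], raw[32:]
        cipher = AES.new(key, AES.MODE_EAX, nonce=nonce)
        return json.loads(cipher.decrypt_and_verify(ciphertext, tag))  # raises ValueError if tampered

    key = get_random_bytes(16)
    token = make_signature_qr("PO-2017-0042", "warehouse director", key)
    print(verify_signature_token(token, key))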

  1. International pressure vessels and piping codes and standards. Volume 2: Current perspectives; PVP-Volume 313-2

    International Nuclear Information System (INIS)

    Rao, K.R.; Asada, Yasuhide; Adams, T.M.

    1995-01-01

    The topics in this volume include: (1) Recent or imminent changes to Section 3 design sections; (2) Select perspectives of ASME Codes -- Section 3; (3) Select perspectives of Boiler and Pressure Vessel Codes -- an international outlook; (4) Select perspectives of Boiler and Pressure Vessel Codes -- ASME Code Sections 3, 8 and 11; (5) Codes and Standards Perspectives for Analysis; (6) Selected design perspectives on flow-accelerated corrosion and pressure vessel design and qualification; (7) Select Codes and Standards perspectives for design and operability; (8) Codes and Standards perspectives for operability; (9) What's new in the ASME Boiler and Pressure Vessel Code?; (10) A look at ongoing activities of ASME Sections 2 and 3; (11) A look at current activities of ASME Section 11; (12) A look at current activities of ASME Codes and Standards; (13) Simplified design methodology and design allowable stresses -- 1 and 2; (14) Introduction to Power Boilers, Section 1 of the ASME Code -- Part 1 and 2. Separate abstracts were prepared for most of the individual papers

  2. International symposium on standards and codes of practice in medical radiation dosimetry. Book of extended synopses

    International Nuclear Information System (INIS)

    2002-01-01

    The development of radiation measurement standards by National Metrology Institutes (NMIs) and their dissemination to Secondary Standard Dosimetry Laboratories (SSDLs), cancer therapy centres and hospitals represent essential aspects of the radiation dosimetry measurement chain. Although the demands for accuracy in radiotherapy initiated the establishment of such measurement chains, similar traceable dosimetry procedures have been implemented, or are being developed, in other areas of radiation medicine (e.g. diagnostic radiology and nuclear medicine), in radiation protection and in industrial applications of radiation. In the past few years the development of primary standards of absorbed dose to water in 60 Co for radiotherapy dosimetry has made direct calibrations in terms of absorbed dose to water available in many countries for the first time. Some laboratories have extended the development of these standards to high energy photon and electron beams and to low and medium energy x-ray beams. Other countries, however, still base their dosimetry for radiotherapy on air kerma standards. Dosimetry for conventional external beam radiotherapy was probably the field where standardized procedures adopted by medical physicists at hospitals were developed first. Those were related to exposure and air kerma standards. The recent development of Codes of Practice (or protocols) based on the concept of absorbed dose to water has led to changes in calibration procedures at hospitals. The International Code of Practice for Dosimetry Based on Standards of Absorbed Dose to Water (TRS 398) was sponsored by the International Atomic Energy Agency (IAEA), World Health Organization (WHO), Pan-American Health Organization (PAHO) and the European Society for Therapeutic Radiology and Oncology (ESTRO) and is expected to be adopted in many countries worldwide. It provides recommendations for the dosimetry of all types of beams (except neutrons) used in external radiotherapy and satisfies

  3. International symposium on standards and codes of practice in medical radiation dosimetry. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    The development of radiation measurement standards by National Metrology Institutes (NMIs) and their dissemination to Secondary Standard Dosimetry Laboratories (SSDLs), cancer therapy centres and hospitals represent essential aspects of the radiation dosimetry measurement chain. Although the demands for accuracy in radiotherapy initiated the establishment of such measurement chains, similar traceable dosimetry procedures have been implemented, or are being developed, in other areas of radiation medicine (e.g. diagnostic radiology and nuclear medicine), in radiation protection and in industrial applications of radiation. In the past few years the development of primary standards of absorbed dose to water in {sup 60}Co for radiotherapy dosimetry has made direct calibrations in terms of absorbed dose to water available in many countries for the first time. Some laboratories have extended the development of these standards to high energy photon and electron beams and to low and medium energy x-ray beams. Other countries, however, still base their dosimetry for radiotherapy on air kerma standards. Dosimetry for conventional external beam radiotherapy was probably the field where standardized procedures adopted by medical physicists at hospitals were developed first. Those were related to exposure and air kerma standards. The recent development of Codes of Practice (or protocols) based on the concept of absorbed dose to water has led to changes in calibration procedures at hospitals. The International Code of Practice for Dosimetry Based on Standards of Absorbed Dose to Water (TRS 398) was sponsored by the International Atomic Energy Agency (IAEA), World Health Organization (WHO), Pan-American Health Organization (PAHO) and the European Society for Therapeutic Radiology and Oncology (ESTRO) and is expected to be adopted in many countries worldwide. It provides recommendations for the dosimetry of all types of beams (except neutrons) used in external radiotherapy and

  4. Simulation of international standard problem no. 44 open tests using Melcor computer code

    International Nuclear Information System (INIS)

    Song, Y.M.; Cho, S.W.

    2001-01-01

    The MELCOR 1.8.4 code has been employed to simulate the KAEVER test series K123/K148/K186/K188, which were proposed as the open experiments of International Standard Problem No. 44 by OECD-CSNI. The main purpose of this study is to evaluate the accuracy of the MELCOR aerosol model, which calculates the aerosol distribution and settlement in a containment. For this, the thermal-hydraulic conditions are simulated first for the whole test period, and then the behavior of hygroscopic CsOH/CsI and insoluble Ag aerosols, which are the predominant activity carriers in a release into the containment, is compared between the experimental results and the code predictions. The calculated vessel atmospheric concentrations show a good simulation for dry aerosol but show large differences for wet aerosol, due to a data mismatch in vessel humidity and the hygroscopicity. (authors)

  5. ISO 639-1 and ISO 639-2: International Standards for Language Codes. ISO 15924: International Standard for Names of Scripts.

    Science.gov (United States)

    Byrum, John D.

    This paper describes two international standards for the representation of the names of languages. The first (ISO 639-1), published in 1988, provides two-letter codes for 136 languages and was produced primarily to meet terminological needs. The second (ISO 639-2) appeared in late 1998 and includes three-letter codes for 460 languages. This list…

  6. Review and evaluation of technology, equipment, codes and standards for digitization of industrial radiographic film

    International Nuclear Information System (INIS)

    1992-05-01

    This report contains a review and evaluation of the technology, equipment, and codes and standards related to the digitization of industrial radiographic film. The report presents recommendations and equipment-performance specifications that will allow the digitization of radiographic film from nuclear power plant components in order to produce faithful reproductions of flaw images of interest on the films. Justification for the specifications selected is provided. Performance demonstration tests for the digitization process are required, and criteria for such tests are presented. Several comments related to implementation of the technology are also presented and discussed.

  7. Mapping the Plasticity of the E. coli Genetic Code with Orthogonal Pair Directed Sense Codon Reassignment.

    Science.gov (United States)

    Schmitt, Margaret A; Biddle, Wil; Fisk, John Domenic

    2018-04-18

    The relative quantitative importance of the factors that determine the fidelity of translation is largely unknown, which makes predicting the extent to which the degeneracy of the genetic code can be broken challenging. Our strategy of using orthogonal tRNA/aminoacyl tRNA synthetase pairs to precisely direct the incorporation of a single amino acid in response to individual sense and nonsense codons provides a suite of related data with which to examine the plasticity of the code. Each directed sense codon reassignment measurement is an in vivo competition experiment between the introduced orthogonal translation machinery and the natural machinery in E. coli. This report discusses 20 new, related genetic codes, in which a targeted E. coli wobble codon is reassigned to tyrosine utilizing the orthogonal tyrosine tRNA/aminoacyl tRNA synthetase pair from Methanocaldococcus jannaschii. One at a time, reassignment of each targeted sense codon to tyrosine is quantified in cells by measuring the fluorescence of GFP variants in which the essential tyrosine residue is encoded by a non-tyrosine codon. Significantly, every wobble codon analyzed may be partially reassigned with efficiencies ranging from 0.8% to 41%. The accumulation of the suite of data enables a qualitative dissection of the relative importance of the factors affecting the fidelity of translation. While some correlation was observed between sense codon reassignment and either competing endogenous tRNA abundance or changes in aminoacylation efficiency of the altered orthogonal system, no single factor appears to predominately drive translational fidelity. Evaluation of relative cellular fitness in each of the 20 quantitatively-characterized proteome-wide tyrosine substitution systems suggests that at a systems level, E. coli is robust to missense mutations.

  8. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    The fuel reload of a pressurized water reactor is performed whenever the burnup of the fuel assemblies in the reactor core reaches a value such that it is no longer possible to maintain a critical reactor producing energy at nominal power. The fuel reload optimization problem consists of determining the positioning of the fuel assemblies within the core in an optimized way, minimizing the ratio of fuel assembly cost to maximum burnup while satisfying symmetry and safety restrictions. The difficulty of the problem grows exponentially with the number of fuel assemblies in the core. For decades the fuel reload optimization problem was solved manually by experts who used their knowledge and experience to build core configurations and test them to verify that the safety restrictions of the plant were satisfied. To reduce this burden, several optimization techniques have been used, including the binary-coded genetic algorithm. In this work we show the use of a real-valued coding approach for the genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys that transforms the real values of the genes of each chromosome into a combination of discrete fuel assemblies for evaluation of the reload. Four different recombination methods were tested: discrete recombination, intermediate recombination, linear recombination and extended linear recombination. For each of the four recombination methods, 10 tests using different seeds for the random number generator were conducted, totaling 40 tests. The results of applying the real-coded genetic algorithm to the fuel reload problem of the Angra 1 PWR plant are shown. Since the best results in the literature for this problem were found by the parallel PSO, it is used for comparison.
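    A minimal sketch of the random-keys decoding step (illustrative only; the chromosome length and recombination weights below are assumptions, not the authors' settings):

    # Random keys: a real-valued chromosome is decoded into a permutation by
    # sorting the keys, so ordinary real-valued recombination always yields a
    # valid loading order.
    import numpy as np

    def decode_random_keys(keys):
        """Sort positions by their real-valued keys; the argsort is the permutation."""
        return np.argsort(keys)

    rng = np.random.default_rng(seed=1)
    chromosome = rng.random(8)                 # one key per fuel assembly
    print(chromosome.round(2))
    print(decode_random_keys(chromosome))      # assembly indices in placement order

    # Intermediate (arithmetic) recombination stays real-valued, and the decoded
    # child is still a valid permutation -- the main appeal of random keys.
    parent_a, parent_b = rng.random(8), rng.random(8)
    child = 0.5 * parent_a + 0.5 * parent_b
    print(decode_random_keys(child))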

  9. Error detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3

    Science.gov (United States)

    Fujiwara, Toru; Kasami, Tadao; Lin, Shu

    1989-09-01

    The error-detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3 are investigated. These codes are also used for error detection in the data link layer of the Ethernet, a local area network. The weight distributions for various code lengths are calculated to obtain the probability of undetectable error and that of detectable error for a binary symmetric channel with bit-error rate between 0.00001 and 1/2.
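    A short sketch of the quantity being computed: on a binary symmetric channel with bit-error rate eps, an error pattern goes undetected exactly when it coincides with a nonzero codeword, so the undetected-error probability follows directly from the weight distribution. The small Hamming (7,4) example below is only for illustration; the 802.3 codes are much longer shortened codes whose distributions must be computed numerically.

    def undetected_error_probability(weight_distribution, n, eps):
        """P_ud(eps) = sum_i A_i * eps**i * (1 - eps)**(n - i) over nonzero weights i."""
        return sum(A_i * (eps ** i) * ((1.0 - eps) ** (n - i))
                   for i, A_i in weight_distribution.items() if i > 0)

    # Weight distribution of the (7,4) Hamming code: A_0=1, A_3=7, A_4=7, A_7=1.
    hamming_7_4 = {0: 1, 3: 7, 4: 7, 7: 1}
    for eps in (1e-5, 1e-3, 0.5):
        print(eps, undetected_error_probability(hamming_7_4, n=7, eps=eps))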

  10. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of U.S. patents 5,084,232, 5,848,377 and 6,430,516 (which can be obtained by typing the patent numbers into the box at http://164.195.100.11/netahtml/srchnum.htm) and their associated technical papers, presented and published at international conferences in the last three years and sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at the US-NRC RIC2003 Session W4, 2:15-3:15 PM, at the Washington D.C. Capital Hilton Hotel, Presidential Ballroom, on April 16, 2003, in front of more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate all the current computer codes being used in their own countries, by means of comparison of numerical data from these three specific, openly challenging fundamental problems, in order to set up a global safety standard for all nuclear power plants in the world. (authors)

  11. HCPB TBM thermo-mechanical design: Assessment with respect to codes and standards and DEMO relevancy

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Aiello, G.

    2011-01-01

    In the frame of the activities of the European TBM Consortium of Associates, the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) is developed at the Karlsruhe Institute of Technology (KIT). After performing detailed thermal and fluid dynamic analyses of the preliminary HCPB TBM design, the thermo-mechanical behaviour of the TBM under typical ITER loads has to be assessed. A synthesis of the different design options proposed has been realized by building two different assemblies of the HCPB-TBM: these two assemblies and the analyses performed on them are presented in this paper. Finite element thermo-mechanical analyses of two detailed 1/4-scale models of the proposed HCPB-TBM assemblies have been performed, with the aim of verifying that the mechanical behaviour complies with the criteria of the design codes and standards. The structural design limits specified in the codes and standards are discussed in relation to the available EUROFER data and possible damage modes. Solutions to improve the weak structural points of the present design are identified, and the DEMO relevancy of the present thermal and structural design parameters is discussed.

  12. Genetic coding and united-hypercomplex systems in the models of algebraic biology.

    Science.gov (United States)

    Petoukhov, Sergey V

    2017-08-01

    Structured alphabets of DNA and RNA, in their matrix form of representation, are connected with Walsh functions and with a new type of system of multidimensional numbers. This type generalizes systems of complex numbers and hypercomplex numbers, which serve as the basis of mathematical natural sciences and many technologies. The new systems of multidimensional numbers have interesting mathematical properties and are called, in the general case, "systems of united-hypercomplex numbers" (or briefly "U-hypercomplex numbers"). They can be widely used in models of multi-parametric systems in the fields of algebraic biology, artificial life, devices of biologically inspired artificial intelligence, etc. In particular, the application of U-hypercomplex numbers reveals hidden properties of genetic alphabets under cyclic permutations of their doublets and triplets. Special attention is devoted to the author's hypothesis of multiple languages in DNA sequences, related to an ensemble of U-numerical sub-alphabets. This genetic multilingualism is considered an important factor in providing the noise-immunity properties of multi-channel genetic coding. Our results attest to the conformity of the algebraic properties of the U-numerical systems with phenomenological properties of the DNA alphabets and with the complementary arrangement of the double DNA helix. It seems that, in the field of algebraic biology, the genetic-informational organization of living bodies can be modeled as a set of united-hypercomplex numbers, in some association with the famous slogan of Pythagoras, "the numbers rule the world". Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Remediating Viking Origins: Genetic Code as Archival Memory of the Remote Past.

    Science.gov (United States)

    Scully, Marc; King, Turi; Brown, Steven D

    2013-10-01

    This article introduces some early data from the Leverhulme Trust-funded research programme, 'The Impact of the Diasporas on the Making of Britain: evidence, memories, inventions'. One of the interdisciplinary foci of the programme, which incorporates insights from genetics, history, archaeology, linguistics and social psychology, is to investigate how genetic evidence of ancestry is incorporated into identity narratives. In particular, we investigate how 'applied genetic history' shapes individual and familial narratives, which are then situated within macro-narratives of the nation and collective memories of immigration and indigenism. It is argued that the construction of genetic evidence as a 'gold standard' about 'where you really come from' involves a remediation of cultural and archival memory, in the construction of a 'usable past'. This article is based on initial questionnaire data from a preliminary study of those attending DNA collection sessions in northern England. It presents some early indicators of the perceived importance of being of Viking descent among participants, notes some emerging patterns and considers the implications for contemporary debates on migration, belonging and local and national identity.

  14. Battelle integrity of nuclear piping program. Summary of results and implications for codes/standards

    International Nuclear Information System (INIS)

    Miura, Naoki

    2005-01-01

    The BINP (Battelle Integrity of Nuclear Piping) program was proposed by Battelle to elaborate pipe fracture evaluation methods and to improve LBB and in-service flaw evaluation criteria. The program was conducted from October 1998 to September 2003. In Japan, CRIEPI participated in the program on behalf of electric utilities and fabricators in order to keep abreast of the technical background for possible future revision of LBB and in-service flaw evaluation standards and to investigate the issues that need to be reflected in current domestic standards. A series of results obtained from the program has been utilized for the new LBB Regulatory Guide program of the USNRC and for the proposal of revised in-service flaw evaluation criteria to the ASME Code Committee. The results were assessed for their implications for existing or future domestic standards. As a result, the impact of many of the issues that were thought to adversely affect LBB approval or allowable flaw sizes in flaw evaluation criteria was found to be relatively minor under actual plant conditions. At the same time, some issues that need to be resolved in order to establish advanced and rational standards in the future were identified. (author)

  15. Video coding standards AVS China, H.264/MPEG-4 PART 10, HEVC, VP6, DIRAC and VC-1

    CERN Document Server

    Rao, K R; Hwang, Jae Jeong

    2014-01-01

    Review by Ashraf A. Kassim, Professor, Department of Electrical & Computer Engineering, and Associate Dean, School of Engineering, National University of Singapore. The book consists of eight chapters, of which the first two provide an overview of various video and image coding standards and video formats. The next four chapters present in detail the Audio Video Standard (AVS) of China, the H.264/MPEG-4 Advanced Video Coding (AVC) standard, the High Efficiency Video Coding (HEVC) standard and the VP6 video coding standard (now VP10), respectively. The performance of the wavelet-based Dirac video codec is compared with H.264/MPEG-4 AVC in chapter 7. Finally, in chapter 8, the VC-1 video coding standard is presented together with VC-2, which is based on the intra-frame coding of Dirac, and an outline of an H.264/AVC to VC-1 transcoder.   The authors also present and discuss relevant research literature, such as work documenting improved methods and techniques, and also point to other related reso...

  16. Study of relationship between radioactivity distribution, contamination burden and quality standard, accommodate energy of code river Yogyakarta

    International Nuclear Information System (INIS)

    Agus Taftazani and Muzakky

    2009-01-01

    A study of the relationship between the distribution and contamination burden of gross β radioactivity and natural radionuclides in water and sediment samples from 11 observation stations along the Code river, and the corresponding quality standard and maximum capacity of the Code river, has been carried out. Natural radionuclide identification and gross β radioactivity measurements of condensed water samples and of dry, homogeneous sediment powder (passed through a 100 mesh sieve) were performed using a spectrometer and a GM counter. The radioactivity data were analyzed descriptively with histograms to show the spatial distribution pattern. The contamination burden data, quality standard and maximum capacity of the Code river were analyzed descriptively with line diagrams to establish the relationship between contamination burden, quality standard and maximum capacity. The observations of water and sediment at the 11 stations show that the natural radionuclides 210Pb, 212Pb, 214Pb, 226Ra, 208Tl, 214Bi, 228Ac and 40K were detected. The analysis leads to the conclusion that the average gross β activity increases from upstream to downstream of the Code river. The contamination burdens of 210Pb, 212Pb, 226Ra and 228Ac were much smaller than the quality standard for river water according to the regulation of the Nuclear Energy Regulatory Agency 02/Ka-BAPETEN/V-99 concerning radioactivity quality standards. This means that the Code river is still within an acceptable contamination burden for these four radionuclides. (author)

  17. Photoactivatable Mussel-Based Underwater Adhesive Proteins by an Expanded Genetic Code.

    Science.gov (United States)

    Hauf, Matthias; Richter, Florian; Schneider, Tobias; Faidt, Thomas; Martins, Berta M; Baumann, Tobias; Durkin, Patrick; Dobbek, Holger; Jacobs, Karin; Möglich, Andreas; Budisa, Nediljko

    2017-09-19

    Marine mussels exhibit potent underwater adhesion abilities under hostile conditions by employing 3,4-dihydroxyphenylalanine (DOPA)-rich mussel adhesive proteins (MAPs). However, their recombinant production is a major biotechnological challenge. Herein, a novel strategy based on genetic code expansion has been developed by engineering efficient aminoacyl-transfer RNA synthetases (aaRSs) for the photocaged noncanonical amino acid ortho-nitrobenzyl DOPA (ONB-DOPA). The engineered ONB-DOPARS enables in vivo production of MAP type 5 site-specifically equipped with multiple instances of ONB-DOPA to yield photocaged, spatiotemporally controlled underwater adhesives. Upon exposure to UV light, these proteins feature elevated wet adhesion properties. This concept offers new perspectives for the production of recombinant bioadhesives. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. Safety analyses for licensing purposes are inherently deterministic; therefore, conservative BIC and assumptions, such as the single-failure criterion, must be employed in the analyses. However, the use of conservative analytical codes is not considered essential. The standards committee of the Atomic Energy Society of Japan (AESJ) drew up a standard for using best-estimate codes in safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)

  19. From chemical metabolism to life: the origin of the genetic coding process

    Directory of Open Access Journals (Sweden)

    Antoine Danchin

    2017-06-01

    Full Text Available Looking for origins is so much rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question in accounting for life is to understand how a chemical metabolism that began with amino acids progressively took shape as a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces as carriers of the basic metabolic pathways that drive the pursuit of life.

  20. Chromatin remodeling: the interface between extrinsic cues and the genetic code?

    Science.gov (United States)

    Ezzat, Shereen

    2008-10-01

    The successful completion of the human genome project ushered in a new era of hope and skepticism. However, the promise of finding the fundamental basis of human traits and diseases appears less than fulfilled. The original premise was that the DNA sequence of every gene would allow precise characterization of the critical differences responsible for altered cellular functions. The characterization of intragenic mutations in cancers paved the way for early screening and the design of targeted therapies. However, it has also become evident that unmasking genetic codes alone cannot explain the diversity of disease phenotypes within a population. Further, classic genetics has not been able to explain the differences observed among identical twins or even cloned animals. This new reality has re-ignited interest in the field of epigenetics. While epigenetics is traditionally defined as heritable changes that can alter gene expression without affecting the corresponding DNA sequence, this definition has come into question. The extent to which epigenetic change can also be acquired in response to chemical stimuli represents an exciting dimension in the "nature vs nurture" debate. In this review I describe a series of studies in my laboratory that illustrate the significance of epigenetics and its potential clinical implications.

  1. SECOND ATLAS DOMESTIC STANDARD PROBLEM (DSP-02) FOR A CODE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    YEON-SIK KIM

    2013-12-01

    Full Text Available KAERI (Korea Atomic Energy Research Institute) has been operating an integral effect test facility, the Advanced Thermal-Hydraulic Test Loop for Accident Simulation (ATLAS), for transient and accident simulations of advanced pressurized water reactors (PWRs). Using ATLAS, a high-quality integral effect test database has been established for major design basis accidents of the APR1400 plant. A Domestic Standard Problem (DSP) exercise using the ATLAS database was promoted to transfer the database to domestic nuclear industries and contribute to improving a safety analysis methodology for PWRs. This 2nd ATLAS DSP (DSP-02) exercise aims at an effective utilization of an integral effect database obtained from ATLAS, the establishment of a cooperation framework among the domestic nuclear industry, a better understanding of the thermal hydraulic phenomena, and an investigation into the possible limitation of the existing best-estimate safety analysis codes. A small break loss of coolant accident with a 6-inch break at the cold leg was determined as a target scenario by considering its technical importance and by incorporating interests from participants. This DSP exercise was performed in an open calculation environment where the integral effect test data was open to participants prior to the code calculations. This paper includes major information of the DSP-02 exercise as well as comparison results between the calculations and the experimental data.

  2. The Ontario Energy Board's draft standard supply service code: effects on air quality

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, J.; Bjorkquist, S. [Ontario Clean Air Alliance, Toronto, ON (Canada)

    1999-06-29

    The Ontario Clean Air Alliance (OCAA), a coalition of 67 organizations, takes issue with the Ontario Energy Board's draft document 'Standard Supply Service Code', particularly sections 2.2.2 and 2.5.2, which they claim are not in the public interest unless the Ontario government implements the OCAA's recommended emission caps. The alliance is of the view that without strict new environmental regulations the proposed Code would encourage the use of coal for electricity generation. Public health, the environment, consumer interests, job creation and promotion of a competitive electricity market would all be jeopardized by this development, the alliance states. The argument is supported by extensive reference to the Final Report of the Ontario Market Design Committee (MDC), which also emphasized the importance of combining the introduction of competition with appropriate environmental regulations, singling out the emission cap and trade program, and recommending that it be launched concurrently with the electricity market opening for competition. The view of the MDC was that public support for restructuring would not be forthcoming in the absence of regulatory measures to control power plant emissions. 25 refs.

  3. Second ATLAS Domestic Standard Problem (DSP-02) For A Code Assessment

    International Nuclear Information System (INIS)

    Kim, Yeonsik; Choi, Kiyong; Cho, Seok; Park, Hyunsik; Kang, Kyungho; Song, Chulhwa; Baek, Wonpil

    2013-01-01

    KAERI (Korea Atomic Energy Research Institute) has been operating an integral effect test facility, the Advanced Thermal-Hydraulic Test Loop for Accident Simulation (ATLAS), for transient and accident simulations of advanced pressurized water reactors (PWRs). Using ATLAS, a high-quality integral effect test database has been established for major design basis accidents of the APR1400 plant. A Domestic Standard Problem (DSP) exercise using the ATLAS database was promoted to transfer the database to domestic nuclear industries and contribute to improving a safety analysis methodology for PWRs. This 2nd ATLAS DSP (DSP-02) exercise aims at an effective utilization of an integral effect database obtained from ATLAS, the establishment of a cooperation framework among the domestic nuclear industry, a better understanding of the thermal hydraulic phenomena, and an investigation into the possible limitation of the existing best-estimate safety analysis codes. A small break loss of coolant accident with a 6-inch break at the cold leg was determined as a target scenario by considering its technical importance and by incorporating interests from participants. This DSP exercise was performed in an open calculation environment where the integral effect test data was open to participants prior to the code calculations. This paper includes major information of the DSP-02 exercise as well as comparison results between the calculations and the experimental data.

  4. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Full Text Available Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  5. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation.

    Science.gov (United States)

    Pujar, Shashikant; O'Leary, Nuala A; Farrell, Catherine M; Loveland, Jane E; Mudge, Jonathan M; Wallin, Craig; Girón, Carlos G; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; Martin, Fergal J; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Suner, Marie-Marthe; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bruford, Elspeth A; Bult, Carol J; Frankish, Adam; Murphy, Terence; Pruitt, Kim D

    2018-01-04

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. Published by Oxford University Press on behalf of Nucleic Acids Research 2017.

  6. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnoses, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the Extensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard processing tools that help in developing interoperable applications.
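    To make the idea of an XML representation of a hierarchical classification concrete, here is a minimal sketch in Python. The element and attribute names are hypothetical illustrations (real-world work in this area typically uses schemas such as ClaML), not the markup actually used by the authors; only the ICD-10 codes and labels shown are standard.

        import xml.etree.ElementTree as ET

        # Build a tiny, hypothetical fragment of a hierarchical diagnosis classification.
        root = ET.Element("classification", name="ICD-10", language="en", version="illustrative")

        chapter = ET.SubElement(root, "class", code="A00-B99", kind="chapter")
        ET.SubElement(chapter, "label").text = "Certain infectious and parasitic diseases"

        category = ET.SubElement(chapter, "class", code="A00", kind="category")
        ET.SubElement(category, "label").text = "Cholera"

        sub = ET.SubElement(category, "class", code="A00.0", kind="subcategory")
        ET.SubElement(sub, "label").text = "Cholera due to Vibrio cholerae 01, biovar cholerae"

        print(ET.tostring(root, encoding="unicode"))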

  7. ZAKI: a windows-based k0 standardization code for in-core INAA

    CERN Document Server

    Ojo, J O

    2002-01-01

    A new computer code ZAKI, for k0-based INAA standardization, written in Visual Basic for the WINDOWS environment is described. The parameter alpha, measuring the deviation of the epithermal neutron spectrum shape from the ideal 1/E shape, and the thermal-to-epithermal flux ratio f, are monitored at each irradiation position for each irradiation using the "triple bare monitor with k0" technique. Stability of the irradiation position with respect to alpha and f is therefore assumed only for the duration of the irradiation. This now makes it possible to use k0 standardization even for in-core reactor irradiation channels without an a priori knowledge of alpha and f values as required by existing commercial software. ZAKI is considerably versatile and contains features which allow for use of several detectors at different counting geometries, direct inputting of peak search output from GeniePc, and automatic nuclide identification of all gamma lines using an in-built library. Sample results for ...

  8. ZAKI: a windows-based k0 standardization code for in-core INAA

    International Nuclear Information System (INIS)

    Ojo, J.O.; Filby, R.H.

    2002-01-01

    A new computer code ZAKI, for k0-based INAA standardization, written in Visual Basic for the WINDOWS environment is described. The parameter α, measuring the deviation of the epithermal neutron spectrum shape from the ideal 1/E shape, and the thermal-to-epithermal flux ratio f, are monitored at each irradiation position for each irradiation using the "triple bare monitor with k0" technique. Stability of the irradiation position with respect to α and f is therefore assumed only for the duration of the irradiation. This now makes it possible to use k0 standardization even for in-core reactor irradiation channels without an a priori knowledge of α and f values as required by existing commercial software. ZAKI is considerably versatile and contains features which allow for use of several detectors at different counting geometries, direct inputting of peak search output from GeniePc, and automatic nuclide identification of all gamma lines using an in-built library. Sample results for two certified reference materials are presented.
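    For orientation, the quantities α and f enter k0-based standardization through the epithermal correction Q0(α) and the usual concentration ratio of the De Corte formalism. The sketch below encodes that textbook convention as a reference point only; it is not the internal implementation of ZAKI, and all numerical inputs are placeholders.

        def q0_alpha(q0, e_r_bar, alpha):
            """Resonance-integral-to-thermal cross-section ratio corrected for a
            non-ideal 1/E^(1+alpha) epithermal spectrum, per the usual k0 convention."""
            return (q0 - 0.429) / (e_r_bar ** alpha) + 0.429 / ((2.0 * alpha + 1.0) * 0.55 ** alpha)

        def k0_concentration(asp_sample, asp_monitor, k0, f, q0_a, q0_m, eff_a, eff_m,
                             e_r_bar_a, e_r_bar_m, alpha):
            """Analyte concentration relative to the co-irradiated monitor (arbitrary mass
            units), using the standard k0 relation with flux ratio f and parameter alpha."""
            spectrum_term = (f + q0_alpha(q0_m, e_r_bar_m, alpha)) / (f + q0_alpha(q0_a, e_r_bar_a, alpha))
            return (asp_sample / asp_monitor) * (1.0 / k0) * spectrum_term * (eff_m / eff_a)

        # Placeholder numbers only, to show the call signature:
        print(k0_concentration(asp_sample=1.2e3, asp_monitor=4.5e4, k0=0.85, f=30.0,
                               q0_a=15.0, q0_m=15.7, eff_a=3.1e-3, eff_m=3.3e-3,
                               e_r_bar_a=85.0, e_r_bar_m=5.65, alpha=0.05))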

  9. Role of horizontal gene transfer as a control on the coevolution of ribosomal proteins and the genetic code

    Energy Technology Data Exchange (ETDEWEB)

    Woese, Carl R.; Goldenfeld, Nigel; Luthey-Schulten, Zaida

    2011-03-31

    Our main goal is to develop the conceptual and computational tools necessary to understand the evolution of the universal processes of translation and replication and to identify events of horizontal gene transfer that occurred within the components. We will attempt to uncover the major evolutionary transitions that accompanied the development of protein synthesis by the ribosome and associated components of the translation apparatus. Our project goes beyond standard genomic approaches to explore homologs that are represented at both the structure and sequence level. Accordingly, use of structural phylogenetic analysis allows us to probe further back into deep evolutionary time than competing approaches, permitting greater resolution of primitive folds and structures. Specifically, our work focuses on the elements of translation, ranging from the emergence of the canonical genetic code to the evolution of specific protein folds, mediated by the predominance of horizontal gene transfer in early life. A unique element of this study is the explicit accounting for the impact of phenotype selection on translation, through a coevolutionary control mechanism. Our work contributes to DOE mission objectives through: (1) sophisticated computer simulation of protein dynamics and evolution, and the further refinement of techniques for structural phylogeny, which complement sequence information, leading to improved annotation of genomic databases; (2) development of evolutionary approaches to exploring cellular function and machinery in an integrated way; and (3) documentation of the phenotype interaction with translation over evolutionary time, reflecting the system response to changing selection pressures through horizontal gene transfer.

  10. Use of fluorescent proteins and color-coded imaging to visualize cancer cells with different genetic properties.

    Science.gov (United States)

    Hoffman, Robert M

    2016-03-01

    Fluorescent proteins are very bright and available in spectrally distinct colors, which enables the imaging of color-coded cancer cells growing in vivo and therefore the distinction of cancer cells with different genetic properties. Non-invasive and intravital imaging of cancer cells with fluorescent proteins allows the visualization of distinct genetic variants of cancer cells down to the cellular level in vivo. Cancer cells with increased or decreased ability to metastasize can be distinguished in vivo. Gene exchange in vivo, which enables low-metastatic cancer cells to convert to high-metastatic ones, can be visualized with color-coded imaging in vivo. Cancer stem-like and non-stem cells can be distinguished in vivo by color-coded imaging. These properties also demonstrate the vast superiority of imaging cancer cells in vivo with fluorescent proteins over photon counting of luciferase-labeled cancer cells.

  11. VALIDATION OF SIMBAT-PWR USING STANDARD CODE OF COBRA-EN ON REACTOR TRANSIENT CONDITION

    Directory of Open Access Journals (Sweden)

    Muhammad Darwis Isnaini

    2016-03-01

    Full Text Available The validation of the PWR-type nuclear power plant simulator developed by BATAN (SIMBAT-PWR), using the standard code COBRA-EN, under reactor transient conditions has been performed. The development of SIMBAT-PWR has accomplished several neutronic and thermal-hydraulic calculation modules; therefore, validation of the simulator is needed, especially for transient reactor operating conditions. The purpose of this research is to characterize the thermal-hydraulic parameters of the PWR1000 core, to be applied to, or used as a comparison in, the development of SIMBAT-PWR. The validation involves calculation of the thermal-hydraulic parameters using the COBRA-EN code. The calculation schemes are based on COBRA-EN with fixed material properties and with dynamic properties calculated by the MATPRO subroutine (COBRA-EN+MATPRO), for the reactor conditions of startup, power rise and power fluctuation from nominal to over power. The comparison of the temperature distributions at 100% nominal power shows that the fuel centerline temperature calculated by SIMBAT-PWR is 8.76% higher than the COBRA-EN result and 7.70% lower than the COBRA-EN+MATPRO result. In general, the SIMBAT-PWR fuel temperature distributions lie mostly between the COBRA-EN and COBRA-EN+MATPRO results. The deviations of the fuel centerline, fuel surface, inner and outer cladding and coolant bulk temperatures between SIMBAT-PWR and COBRA-EN are due to differences in the values of the gap heat transfer coefficient and the cladding thermal conductivity.
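    The sensitivity to the gap heat transfer coefficient and cladding conductivity mentioned above can be illustrated with the standard one-dimensional radial heat conduction relations for a cylindrical fuel rod. This is a generic textbook sketch with placeholder numbers, not the SIMBAT-PWR or COBRA-EN model.

        import math

        def fuel_centerline_temperature(q_lin, t_coolant, h_cool, h_gap,
                                        k_fuel, k_clad, r_fuel, r_clad_in, r_clad_out):
            """Steady-state 1-D radial temperature rise from coolant bulk to fuel centerline
            for a cylindrical rod with linear heat rate q_lin [W/m]."""
            dt_film = q_lin / (2.0 * math.pi * r_clad_out * h_cool)                   # coolant film
            dt_clad = q_lin * math.log(r_clad_out / r_clad_in) / (2.0 * math.pi * k_clad)
            dt_gap  = q_lin / (2.0 * math.pi * r_fuel * h_gap)                        # pellet-clad gap
            dt_fuel = q_lin / (4.0 * math.pi * k_fuel)                                # pellet centre-to-surface
            return t_coolant + dt_film + dt_clad + dt_gap + dt_fuel

        # Placeholder PWR-like values (illustrative only, degrees Celsius and SI units).
        print(fuel_centerline_temperature(q_lin=20e3, t_coolant=310.0, h_cool=30e3,
                                          h_gap=6e3, k_fuel=3.0, k_clad=16.0,
                                          r_fuel=4.1e-3, r_clad_in=4.2e-3, r_clad_out=4.75e-3))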

  12. Authorization request for potential non-compliance with the American Standard Safety Code for Elevators Dumbwaiters and Escalators

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, J.E.

    1964-09-28

    A Third Party inspection of the reactor work platforms was conducted by representatives of the Travelers Insurance Company in 1958. An inspection report submitted by these representatives described the hazardous conditions noted and presented a series of recommendations to improve the operational safety of the systems. Project CGI-960, 'C' and 'D' Work Platform Safety Improvements -- All Reactors, was initiated to modify the platforms in compliance with the Third Party recommendations. The American Standard Safety Code for Elevators Dumbwaiters and Escalators (A-17.1) is used as a guide by the Third Party in formulating their recommendations. This code is used because there is no other applicable code for this type of equipment. While the work platforms do not, and in some cases cannot, comply with this code because of operational use, every effort is made to comply with the intent of the code.

  13. Inventory of Safety-related Codes and Standards for Energy Storage Systems with some Experiences related to Approval and Acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.

    2014-09-11

    The purpose of this document is to identify laws, rules, model codes, codes, standards, regulations and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), along with experiences to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, or revisions to existing CSR, and the necessary supporting research and documentation that can foster the deployment of safe ESS.

  14. Maximization Network Throughput Based on Improved Genetic Algorithm and Network Coding for Optical Multicast Networks

    Science.gov (United States)

    Wei, Chengying; Xiong, Cuilian; Liu, Huanlin

    2017-12-01

    A maximal multicast stream algorithm based on network coding (NC) can improve network throughput for wavelength-division multiplexing (WDM) networks, but the result is still far below the theoretical maximum throughput. Moreover, existing multicast stream algorithms do not determine the information distribution pattern and the routing at the same time. In this paper, an improved genetic algorithm is proposed to maximize optical multicast throughput via NC and to determine the multicast stream distribution, using a hybrid chromosome construction for multicast with a single source and multiple destinations. The proposed hybrid chromosomes are constructed from binary chromosomes and integer chromosomes, where the binary chromosomes represent the optical multicast routing and the integer chromosomes indicate the multicast stream distribution. A fitness function is designed to guarantee that each destination can receive the maximum number of decodable multicast streams. The simulation results show that the proposed method is far superior to typical maximal multicast stream algorithms based on NC in terms of network throughput in WDM networks.

  15. Physicochemical basis for the origin of the genetic code - Lecture 3

    International Nuclear Information System (INIS)

    Ponnamperuma, C.

    1992-01-01

    A study of the association of homocodonic amino acids and selected heterocodonic amino acids with selected nucleotides in aqueous solution was undertaken to examine a possible physical basis for the origin of codon assignments. These interactions were studied using 1H nuclear magnetic resonance spectroscopy (NMR). Association constants for the various interactions were determined by fitting the changes in the chemical shifts of the anomeric and ring protons of the nucleoside moieties as a function of amino acid concentration to an isotherm which described the binding interaction. The strongest associations of all homocodonic amino acids were with their respective anticodonic nucleotide sequences. The strength of association was seen to increase with increasing chain length of the anticodonic nucleotide. The association of these amino acids with different phosphate esters of nucleotides suggests that a definite isomeric structure is required for association with a specified amino acid; the 5'-mononucleotides and (3'-5')-linked dinucleotides are the favored geometries for strong associations. Use of heterocodonic amino acids and nonprotein amino acids supports these findings. We conclude that there is at least a physicochemical, anticodonic contribution to the origin of the genetic code. (author)

  16. Enhancement of combined heat and power economic dispatch using self adaptive real-coded genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Subbaraj, P. [Kalasalingam University, Srivilliputhur, Tamilnadu 626 190 (India); Rengaraj, R. [Electrical and Electronics Engineering, S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India); Salivahanan, S. [S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India)

    2009-06-15

    In this paper, a self-adaptive real-coded genetic algorithm (SARGA) is implemented to solve the combined heat and power economic dispatch (CHPED) problem. The self-adaptation is achieved by means of tournament selection along with simulated binary crossover (SBX). The selection process has a powerful exploration capability created by the tournaments between two solutions: the better solution is chosen and placed in the mating pool, leading to better convergence and a reduced computational burden. The SARGA integrates a parameterless penalty constraint-handling strategy and simultaneously handles equality and inequality constraints. Population diversity is introduced by making use of the distribution index in the SBX operator to create better offspring. This leads to high diversity in the population, which increases the probability of approaching the global optimum and prevents premature convergence. The SARGA is applied to solve the CHPED problem with bounded feasible operating regions, which has a large number of local minima. The numerical results demonstrate that the proposed method can find a solution close to the global optimum and compares favourably with other recent methods in terms of solution quality, constraint handling and computation time. (author)
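    For readers unfamiliar with the operators named above, the following is a minimal, self-contained sketch of binary tournament selection and simulated binary crossover (SBX) for real-coded chromosomes. It follows the standard textbook form of SBX with a distribution index eta; it is not the authors' SARGA implementation, and the fitness function shown is a placeholder.

        import random

        def tournament_select(population, fitness, k=2):
            """Binary tournament: pick k candidates at random, return the fittest (minimization)."""
            contenders = random.sample(population, k)
            return min(contenders, key=fitness)

        def sbx_crossover(p1, p2, eta=2.0):
            """Simulated binary crossover on two real-valued parent vectors."""
            c1, c2 = [], []
            for x1, x2 in zip(p1, p2):
                u = random.random()
                beta = (2.0 * u) ** (1.0 / (eta + 1.0)) if u <= 0.5 else (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
                c1.append(0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2))
                c2.append(0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2))
            return c1, c2

        # Placeholder fitness (sphere function) and a tiny population, for illustration only.
        fitness = lambda x: sum(v * v for v in x)
        population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
        parent1 = tournament_select(population, fitness)
        parent2 = tournament_select(population, fitness)
        child1, child2 = sbx_crossover(parent1, parent2)
        print(child1, child2)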

  17. A Stress-Induced Bias in the Reading of the Genetic Code in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Adi Oron-Gottesman

    2016-11-01

    Full Text Available Escherichia coli mazEF is an extensively studied stress-induced toxin-antitoxin (TA) system. The toxin MazF is an endoribonuclease that cleaves RNAs at ACA sites. Thereby, under stress, the induced MazF generates a stress-induced translation machinery (STM), composed of MazF-processed mRNAs and selective ribosomes that specifically translate the processed mRNAs. Here, we further characterized the STM system, finding that MazF cleaves only ACA sites located in the open reading frames of processed mRNAs, while out-of-frame ACAs are resistant. This in-frame ACA cleavage of MazF seems to depend on MazF binding to an extracellular-death-factor (EDF)-like element in ribosomal protein bS1 (bacterial S1), apparently causing MazF to be part of STM ribosomes. Furthermore, due to the in-frame MazF cleavage of ACAs under stress, a bias occurs in the reading of the genetic code causing the amino acid threonine to be encoded only by its synonym codon ACC, ACU, or ACG, instead of by ACA.
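    The distinction between in-frame and out-of-frame ACA sites described in this record can be illustrated with a short sketch that scans an open reading frame and classifies each ACA occurrence by its offset from the start codon. The example sequence is invented for illustration; it is not data from the study.

        def classify_aca_sites(orf):
            """Return lists of in-frame and out-of-frame ACA start positions (0-based)
            within an open reading frame given as an RNA string starting at its start codon."""
            in_frame, out_of_frame = [], []
            pos = orf.find("ACA")
            while pos != -1:
                (in_frame if pos % 3 == 0 else out_of_frame).append(pos)
                pos = orf.find("ACA", pos + 1)
            return in_frame, out_of_frame

        # Hypothetical mini-ORF: AUG GCU ACA GAC AUU ACC UAA
        orf = "AUGGCUACAGACAUUACCUAA"
        codon_aligned, shifted = classify_aca_sites(orf)
        print("in-frame ACA (cleavable, encode Thr):", codon_aligned)
        print("out-of-frame ACA (resistant):        ", shifted)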

  18. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    Science.gov (United States)

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

    One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EA) seem to be one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators have dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the costs of amino acid replacements with regard to their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure
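    The kind of fitness function mentioned here, a replacement cost weighted by amino acid polarity summed over single-nucleotide changes, can be sketched as follows. The code table and polarity values below are deliberately tiny, illustrative placeholders just to show the shape of the computation; published studies use the full 64-codon table and measured polarity scales such as Woese's polar requirement.

        from itertools import product

        BASES = "UCAG"

        def single_substitution_neighbors(codon):
            """All codons differing from `codon` at exactly one position."""
            for i, b in product(range(3), BASES):
                if b != codon[i]:
                    yield codon[:i] + b + codon[i + 1:]

        def code_cost(code, polarity):
            """Sum of squared polarity differences over all single-nucleotide codon pairs
            that are both assigned to amino acids in `code` (stops and unassigned skipped)."""
            cost = 0.0
            for codon, aa in code.items():
                for neighbor in single_substitution_neighbors(codon):
                    other = code.get(neighbor)
                    if other is not None:
                        cost += (polarity[aa] - polarity[other]) ** 2
            return cost

        # Toy fragment of a code and illustrative polarity values (not a full analysis).
        toy_code = {"UUU": "Phe", "UUC": "Phe", "UUA": "Leu", "UCU": "Ser", "UAU": "Tyr"}
        toy_polarity = {"Phe": 5.0, "Leu": 4.9, "Ser": 7.5, "Tyr": 5.4}
        print(code_cost(toy_code, toy_polarity))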

  19. Orion: Detecting regions of the human non-coding genome that are intolerant to variation using population genetics.

    Science.gov (United States)

    Gussow, Ayal B; Copeland, Brett R; Dhindsa, Ryan S; Wang, Quanli; Petrovski, Slavé; Majoros, William H; Allen, Andrew S; Goldstein, David B

    2017-01-01

    There is broad agreement that genetic mutations occurring outside of the protein-coding regions play a key role in human disease. Despite this consensus, we are not yet capable of discerning which portions of non-coding sequence are important in the context of human disease. Here, we present Orion, an approach that detects regions of the non-coding genome that are depleted of variation, suggesting that the regions are intolerant of mutations and subject to purifying selection in the human lineage. We show that Orion is highly correlated with known intolerant regions as well as regions that harbor putatively pathogenic variation. This approach provides a mechanism to identify pathogenic variation in the human non-coding genome and will have immediate utility in the diagnostic interpretation of patient genomes and in large case control studies using whole-genome sequences.

  20. Quantum Genetics in terms of Quantum Reversible Automata and Quantum Computation of Genetic Codes and Reverse Transcription

    CERN Document Server

    Baianu,I C

    2004-01-01

    The concepts of quantum automata and quantum computation are studied in the context of quantum genetics and genetic networks with nonlinear dynamics. In previous publications (Baianu,1971a, b) the formal concept of quantum automaton and quantum computation, respectively, were introduced and their possible implications for genetic processes and metabolic activities in living cells and organisms were considered. This was followed by a report on quantum and abstract, symbolic computation based on the theory of categories, functors and natural transformations (Baianu,1971b; 1977; 1987; 2004; Baianu et al, 2004). The notions of topological semigroup, quantum automaton, or quantum computer, were then suggested with a view to their potential applications to the analogous simulation of biological systems, and especially genetic activities and nonlinear dynamics in genetic networks. Further, detailed studies of nonlinear dynamics in genetic networks were carried out in categories of n-valued, Lukasiewicz Logic Algebra...

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  3. A Comprehensive Analysis of High School Genetics Standards: Are States Keeping Pace with Modern Genetics?

    Science.gov (United States)

    Dougherty, M. J.; Pleasants, C.; Solow, L.; Wong, A.; Zhang, H.

    2011-01-01

    Science education in the United States will increasingly be driven by testing and accountability requirements, such as those mandated by the No Child Left Behind Act, which rely heavily on learning outcomes, or "standards," that are currently developed on a state-by-state basis. Those standards, in turn, drive curriculum and instruction.…

  4. The aminoacyl-tRNA synthetases had only a marginal role in the origin of the organization of the genetic code: Evidence in favor of the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2017-11-07

    The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules, on which, according to this theory, the biosynthetic transformations between amino acids would have occurred. This mechanism leads to a prediction about the role the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules, because the amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. Thus, although the ARSs implement the genetic code, being responsible for the first interaction between a component of the nucleic acids and one of the proteins, for the coevolution theory the role of the ARSs in the origin of the genetic code should have been entirely marginal. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs, and of their subclasses, in the genetic code table, in order to perform a falsification test of the coevolution theory. Indeed, if the distribution of the ARSs within the genetic code were highly significant, then the coevolution theory would be falsified, since the mechanism on which it is based does not predict a fundamental role of the ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal

  5. French codes and standards for design, construction and in-service inspection of nuclear power plants

    International Nuclear Information System (INIS)

    Hugot, G.; Grandemange, J. M.

    1995-01-01

    In 1970, France decided that its future power plants would be of the Pressurized Water Reactor type. This choice proved to be successful, since it resulted in more than 60 PWR units in operation or under construction in France and abroad. At the beginning of such a program, the French engineering and manufacturing industry, the national electrical utility and the Safety Authorities had to face the many challenges imposed by the implementation of an imported technology. The government reorganised the licensing process. FRAMATOME, the NSSS vendor, and EDF (Electricite de France), the national utility, decided to create 'AFCEN', the French Association for Design and Construction Rules for Nuclear Island Components. These rules, the RCCs (Regles de Conception et de Construction), which are approved by the French Safety Authorities, deal with mechanical and electrical equipment as well as with nuclear fuel and civil works. They are now being supplemented by in-service inspection rules, the RSEs (Regles d'Inspection en Service). The paper presents these Codes and their main updates following application experience, technical progress and the evolution of standards. The status of discussions concerning reference to European standardisation and the development of rules applicable to the EPR project is also discussed.

  6. Hospital Standardized Mortality Ratios: Sensitivity Analyses on the Impact of Coding

    Science.gov (United States)

    Bottle, Alex; Jarman, Brian; Aylin, Paul

    2011-01-01

    Introduction Hospital standardized mortality ratios (HSMRs) are derived from administrative databases and cover 80 percent of in-hospital deaths with adjustment for available case mix variables. They have been criticized for being sensitive to issues such as clinical coding but on the basis of limited quantitative evidence. Methods In a set of sensitivity analyses, we compared regular HSMRs with HSMRs resulting from a variety of changes, such as a patient-based measure, not adjusting for comorbidity, not adjusting for palliative care, excluding unplanned zero-day stays ending in live discharge, and using more or fewer diagnoses. Results Overall, regular and variant HSMRs were highly correlated (ρ > 0.8), but differences of up to 10 points were common. Two hospitals were particularly affected when palliative care was excluded from the risk models. Excluding unplanned stays ending in same-day live discharge had the least impact despite their high frequency. The largest impacts were seen when capturing postdischarge deaths and using just five high-mortality diagnosis groups. Conclusions HSMRs in most hospitals changed by only small amounts from the various adjustment methods tried here, though small-to-medium changes were not uncommon. However, the position relative to funnel plot control limits could move in a significant minority even with modest changes in the HSMR. PMID:21790587
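    As background for the metric itself (not specific to this paper's risk models), a hospital standardized mortality ratio is typically computed as 100 times observed deaths divided by the deaths expected from a case-mix adjustment model. A minimal sketch, assuming each admission already carries a predicted death probability from some logistic risk-adjustment model:

        def hsmr(admissions):
            """HSMR = 100 * observed deaths / expected deaths, where 'expected' is the sum of
            per-admission predicted death probabilities from a risk-adjustment model."""
            observed = sum(1 for a in admissions if a["died"])
            expected = sum(a["predicted_risk"] for a in admissions)
            return 100.0 * observed / expected

        # Hypothetical admissions with model-predicted risks (illustration only).
        admissions = [
            {"died": True,  "predicted_risk": 0.30},
            {"died": False, "predicted_risk": 0.05},
            {"died": False, "predicted_risk": 0.10},
            {"died": True,  "predicted_risk": 0.40},
        ]
        print(round(hsmr(admissions), 1))  # > 100 means more deaths than the case mix predicts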

  7. PHITS code improvements by Regulatory Standard and Research Department Secretariat of Nuclear Regulation Authority

    International Nuclear Information System (INIS)

    Goko, Shinji

    2017-01-01

    As for the safety analyses carried out when a nuclear power company applies for facility or equipment installation permits, business licenses, design approvals, etc., the Regulatory Standard and Research Department Secretariat of the Nuclear Regulation Authority continuously conducts safety research on the introduction of various technologies and their improvement in order to evaluate the adequacy of these safety analyses. In the field of shielding analysis of nuclear fuel transport packages, this group improved the PHITS code to make it applicable to this field, and has been promoting the improvement as a tool for regulatory use since FY2013. This paper introduces the history and progress of this safety research. PHITS 2.88, the latest version as of November 2016, is equipped with an automatic generation function for variance reduction parameters [T-WWG] and other functions effective for practical application to nuclear power regulation. In addition, the group conducted verification analyses of nuclear fuel packages, which showed good agreement with analyses by MCNP, a code that is used extensively worldwide and has an extensive record of application. The results also show relatively good agreement with measured values when differences between the analysis and the measurement are taken into account. (A.O.)

  8. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    Science.gov (United States)

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since the usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This work provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute a perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra and inter prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using the subjective impairments blockiness, blur and jerkiness, in contrast to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.

  9. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    International Nuclear Information System (INIS)

    Binh, Do Quang; Huy, Ngo Quang; Hai, Nguyen Hoang

    2014-01-01

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.
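    To make the representation described above concrete, here is a minimal sketch of a chromosome that carries both a binary segment and an integer segment, with a separate crossover operator for each part. The gene lengths, value ranges, interpretations and operator choices are illustrative assumptions, not the authors' actual encoding of loading patterns.

        import random

        def random_chromosome(n_bin, n_int, int_low, int_high):
            """A hybrid chromosome: a binary segment and an integer segment."""
            return {
                "binary":  [random.randint(0, 1) for _ in range(n_bin)],
                "integer": [random.randint(int_low, int_high) for _ in range(n_int)],
            }

        def crossover(parent_a, parent_b):
            """Two operators, one per gene type: uniform crossover on the binary segment,
            single-point crossover on the integer segment."""
            child_bin = [random.choice(pair) for pair in zip(parent_a["binary"], parent_b["binary"])]
            cut = random.randint(1, len(parent_a["integer"]) - 1)
            child_int = parent_a["integer"][:cut] + parent_b["integer"][cut:]
            return {"binary": child_bin, "integer": child_int}

        # Illustrative sizes and interpretations only (hypothetical, not the paper's encoding):
        # 8 binary genes (e.g. weighting-factor selection) and 6 integer genes
        # (e.g. fuel-assembly identifiers for core positions).
        a = random_chromosome(8, 6, int_low=1, int_high=20)
        b = random_chromosome(8, 6, int_low=1, int_high=20)
        print(crossover(a, b))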

  10. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    Energy Technology Data Exchange (ETDEWEB)

    Binh, Do Quang [University of Technical Education Ho Chi Minh City (Viet Nam); Huy, Ngo Quang [University of Industry Ho Chi Minh City (Viet Nam); Hai, Nguyen Hoang [Centre for Research and Development of Radiation Technology, Ho Chi Minh City (Viet Nam)

    2014-12-15

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.

  11. Views of Evidence-Based Practice: Social Workers' Code of Ethics and Accreditation Standards as Guides for Choice

    Science.gov (United States)

    Gambrill, Eileen

    2007-01-01

    Different views of evidence-based practice (EBP) include defining it as the use of empirically-validated treatments and practice guidelines (i.e., the EBPs approach) in contrast to the broad philosophy and related evolving process described by the originators. Social workers can draw on their code of ethics and accreditation standards both to…

  12. NODC Standard Product: NODC Taxonomic Code on CD-ROM (NODC Accession 0050418)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The content of the NODC Taxonomic Code, Version 8 CD-ROM (CD-ROM NODC-68) distributed by NODC is archived in this accession. Version 7 of the NODC Taxonomic Code...

  13. Innovation and Standardization in School Building: A Proposal for the National Code in Italy.

    Science.gov (United States)

    Ridolfi, Giuseppe

    This document discusses the University of Florence's experience and concepts as it developed the research to define a proposal for designing a new national school building code. Section 1 examines the current school building code and the Italian Reform Process in Education between 1960 and 2000. Section 2 details and explains the new school…

  14. Network Coding to Enhance Standard Routing Protocols in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Fitzek, Frank

    2013-01-01

    This paper introduces the design and simulation of a locally optimized network coding protocol, called PlayNCool, for wireless mesh networks. PlayNCool is easy to implement and compatible with existing routing protocols and devices. This allows the system to gain from network coding capabilities i...

  15. WIAMan Technology Demonstrator Sensor Codes Conforming to International Organization for Standardization/Technical Standard (ISO/TS) 13499

    Science.gov (United States)

    2016-03-01

    The International Organization for Standardization (ISO) Multimedia Exchange task force is responsible for maintaining the specification for the multimedia data exchange format for impact tests outlined... Keywords: channel codes, ATD, multimedia exchange format, ISO/TS 13499.

  16. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    Science.gov (United States)

    De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.

    2013-02-01

    We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need for human intervention and the associated bias. The high level of automation makes it an ideal tool to use on larger sets of observed data.
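
    In the same spirit as wrapping a radiative transfer model in a genetic algorithm, the sketch below fits the two parameters of a simple exponential profile to noisy mock data by minimizing chi-square with a small GA. The model, data and GA settings are illustrative assumptions; neither SKIRT nor GAlib is involved.

```python
# Sketch: fitting model parameters to (noisy) data with a simple genetic
# algorithm, in the spirit of wrapping a forward model in a GA optimizer.
# The model (a 1-D exponential profile) and the GA settings are placeholders.
import math
import random

RADII = [0.5 * i for i in range(1, 41)]          # mock "pixels"
TRUE = (10.0, 4.0)                                # amplitude, scale length
DATA = [10.0 * math.exp(-r / 4.0) + random.gauss(0, 0.2) for r in RADII]

def chi2(params):
    a, h = params
    return sum((d - a * math.exp(-r / h)) ** 2 for r, d in zip(RADII, DATA))

def ga_fit(bounds=((1.0, 20.0), (1.0, 10.0)), pop_size=40, generations=80):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=chi2)
        parents = pop[: pop_size // 4]            # keep the best quarter
        children = []
        while len(parents) + len(children) < pop_size:
            p1, p2 = random.sample(parents, 2)
            child = [(a + b) / 2.0 for a, b in zip(p1, p2)]       # blend crossover
            child = [min(max(g + random.gauss(0, 0.3), lo), hi)   # Gaussian mutation
                     for g, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = parents + children
    return min(pop, key=chi2)

if __name__ == "__main__":
    print("recovered parameters:", [round(p, 2) for p in ga_fit()], "true:", TRUE)
```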

  17. A Code of Ethics and Standards for Outer-Space Commerce

    Science.gov (United States)

    Livingston, David M.

    2002-01-01

    Now is the time to put forth an effective code of ethics for businesses in outer space. A successful code would be voluntary and would actually promote the growth of individual companies, not hinder their efforts to provide products and services. A properly designed code of ethics would ensure the development of space commerce unfettered by government-created barriers. Indeed, if the commercial space industry does not develop its own professional code of ethics, government-imposed regulations would probably be instituted. Should this occur, there is a risk that the development of off-Earth commerce would become more restricted. The code presented in this paper seeks to avoid the imposition of new barriers to space commerce as well as make new commercial space ventures easier to develop. The proposed code consists of a preamble, which underscores basic values, followed by a number of specific principles. For the most part, these principles set forth broad commitments to fairness and integrity with respect to employees, consumers, business transactions, political contributions, natural resources, off-Earth development, designated environmental protection zones, as well as relevant national and international laws. As acceptance of this code of ethics grows within the industry, general modifications will be necessary to accommodate the different types of businesses entering space commerce. This uniform applicability will help to assure that the code will not be perceived as foreign in nature, potentially restrictive, or threatening. Companies adopting this code of ethics will find less resistance to their space development plans, not only in the United States but also from nonspacefaring nations. Commercial space companies accepting and refining this code would demonstrate industry leadership and an understanding that will serve future generations living, working, and playing in space. Implementation of the code would also provide an off-Earth precedent for a modified

  18. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca; Ehrhart, Brian David

    2018-03-01

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include whether the ventilation rate should be based on floor area or room volume, as well as a ceiling offset that appears ineffective at protecting against flammable gas concentrations. ACKNOWLEDGEMENTS The authors gratefully acknowledge Bill Houf (SNL -- Retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.

  19. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved monte carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved monte carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  2. Population-genetic approach to standardization of radiation and non-radiation factors

    International Nuclear Information System (INIS)

    Telnov, I.

    2006-01-01

    population level. Of 65 analyses of the association between diseases and unfavorable effects and various genetic polymorphic systems, 27 had negative results. The other 38 had significant, i.e. positive, results. The respective G.S.R.R. values varied in the range from 1.2 to 2.5. Averaged G.S.R.R. values for some genetic systems ranged from 1.4 to 1.9. More stable and closer values of the averaged G.S.R.R. were calculated for various categories of effects: pathologies due to radiation and non-radiation factors - 1.51; non-tumor (1.47) and tumor (1.54) diseases; average life expectancy - 1.34. The population-averaged, or integral, value of the G.S.R.R. was about 1.5. This value can be used as a genetic predisposition coefficient (C.G.P.) for correcting population-level environmental standards. Such a correction can be made by decreasing the permissible standard value by the C.G.P. to obtain a population-genetic standard. It should be noted that population-genetic standards reduce the risk of unfavorable consequences of environmental factors in individuals with a genetic predisposition down to the general population level. An important advantage of this approach is that there is no need to account for all existing variations of genetic predisposition to the many different unfavorable environmental factors

  3. Recommendations for codes and standards to be used for design and fabrication of high level waste canister

    International Nuclear Information System (INIS)

    Bermingham, A.J.; Booker, R.J.; Booth, H.R.; Ruehle, W.G.; Shevekov, S.; Silvester, A.G.; Tagart, S.W.; Thomas, J.A.; West, R.G.

    1978-01-01

    This study identifies codes, standards, and regulatory requirements for developing design criteria for high-level waste (HLW) canisters for commercial operation. It has been determined that the canister should be designed as a pressure vessel without provision for any overpressure protection devices. It is recommended that the HLW canister be designed and fabricated to the requirements of the ASME Section III Code, Division 1 rules, for Code Class 3 components. Identification of other applicable industry and regulatory guides and standards is provided in this report. Requirements for the Design Specification are found in the ASME Section III Code. It is recommended that design verification be conducted principally with prototype testing which will encompass normal and accident service conditions during all phases of the canister life. The adequacy of existing quality assurance and licensing standards for the canister was investigated. One of the recommendations derived from this study is a requirement that the canister be N-stamped. In addition, acceptance standards for the HLW should be established and the waste qualified to those standards before the canister is sealed. A preliminary investigation of the use of an overpack for the canister has been made, and it is concluded that the use of an overpack, as an integral part of the overall canister design, is undesirable, from both a design and an economics standpoint. However, use of shipping cask liners and overpack-type containers at the Federal repository may make the canister and HLW management safer and more cost effective. There are several possible concepts for canister closure design. These concepts can be adapted to the canister with or without an overpack. A remote seal weld closure is considered to be one of the most suitable closure methods; however, mechanical seals should also be investigated

  4. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  5. Overview of Development and Deployment of Codes, Standards and Regulations Affecting Energy Storage System Safety in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.

    2014-08-22

    This report acquaints stakeholders and interested parties involved in the development and/or deployment of energy storage systems (ESS) with the subject of safety-related codes, standards and regulations (CSRs). It is hoped that users of this document gain a more in-depth and uniform understanding of safety-related CSR development and deployment, which can foster improved communication among all ESS stakeholders and the collaboration needed to realize more timely acceptance and approval of safe ESS technology through appropriate CSRs.

  6. Advocacy and Accessibility Standards in the New "Code of Professional Ethics for Rehabilitation Counselors"

    Science.gov (United States)

    Waldmann, Ashley K.; Blackwell, Terry L.

    2010-01-01

    This article addresses the changes in the Commission on Rehabilitation Counselor Certification's 2010 "Code of Professional Ethics for Rehabilitation Counselors" as they relate to Section C: Advocacy and Accessibility. Ethical issues are identified and discussed in relation to advocacy skills and to advocacy with, and on behalf of, the client; to…

  7. The Teaching of the Code of Ethics and Standard Practices for Texas Educator Preparation Programs

    Science.gov (United States)

    Davenport, Marvin; Thompson, J. Ray; Templeton, Nathan R.

    2015-01-01

    The purpose of this descriptive quantitative research study was to answer three basic informational questions: (1) To what extent ethics training, as stipulated in Texas Administrative Code Chapter 247, was included in the EPP curriculum; (2) To what extent Texas public universities with approved EPP programs provided faculty opportunities for…

  8. MILSTAMP TACs: Military Standard Transportation and Movement Procedures Transportation Account Codes. Volume 2

    Science.gov (United States)

    1987-02-15

    Fragment of a transportation account code table listing Coast Guard patrol boats and their assigned codes: PT VERDE (WPB 82311), PT SWIFT (WPB 82312), PT THATCHER (WPB 82314), PT HERRON (WPB 82318), PT ROBERTS (WPB 82332)... Identifies DOT, FAA Logistics Center, Oklahoma City, as an organization to be billed. 4th position code assigned by DOT, FAA. Identifies appropriation

  9. Prototype of a Standards-Based EHR and Genetic Test Reporting Tool Coupled with HL7-Compliant Infobuttons

    Science.gov (United States)

    Crump, Jacob K.; Del Fiol, Guilherme; Williams, Marc S.; Freimuth, Robert R.

    2018-01-01

    Integration of genetic information is becoming increasingly important in clinical practice. However, genetic information is often ambiguous and difficult to understand, and clinicians have reported low self-efficacy in integrating genetics into their care routine. The Health Level Seven (HL7) Infobutton standard helps to integrate online knowledge resources within Electronic Health Records (EHRs) and is required for EHR certification in the US. We implemented a prototype of a standards-based genetic reporting application coupled with infobuttons leveraging the Infobutton and Fast Healthcare Interoperability Resources (FHIR) Standards. Infobutton capabilities were provided by Open Infobutton, an open-source package compliant with the HL7 Infobutton Standard. The resulting prototype demonstrates how standards-based reporting of genetic results, coupled with curated knowledge resources, can provide dynamic access to clinical knowledge on demand at the point of care. The proposed functionality can be enabled within any EHR system that has been certified through the US Meaningful Use program.
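
    The HL7 Infobutton standard referenced above is, in its URL-based form, essentially a parameterized knowledge request. The sketch below assembles such a request for a genetic finding; the endpoint is fictional, and the parameter names and codes reflect one reading of the URL-based implementation guide, so they should be treated as assumptions rather than the prototype's actual interface.

```python
# Sketch: building an HL7 Infobutton-style knowledge request as a URL, the
# mechanism used to link a result in the EHR to online knowledge resources.
# The endpoint is fictional; parameter names follow my reading of the HL7
# URL-based Infobutton guide and should be treated as assumptions.
from urllib.parse import urlencode

def infobutton_url(base, concept_code, code_system, display_name):
    params = {
        "taskContext.c.c": "MLREV",              # clinician review task (assumed)
        "mainSearchCriteria.v.c": concept_code,  # concept of interest
        "mainSearchCriteria.v.cs": code_system,  # code system identifier
        "mainSearchCriteria.v.dn": display_name, # human-readable name
        "knowledgeResponseType": "application/json",
    }
    return f"{base}?{urlencode(params)}"

if __name__ == "__main__":
    # Hypothetical example: look up knowledge for a BRCA1 genetic finding.
    print(infobutton_url("https://openinfobutton.example.org/infobutton",
                         "1100",                     # placeholder concept code
                         "2.16.840.1.113883.6.281",  # assumed HGNC OID, placeholder
                         "BRCA1 gene"))
```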

  10. Standards for the Reporting of Genetic Counseling Interventions in Research and Other Studies (GCIRS): an NSGC Task Force Report.

    Science.gov (United States)

    Hooker, Gillian W; Babu, D; Myers, M F; Zierhut, H; McAllister, M

    2017-06-01

    As the demand for evidence to support the value of genetic counseling increases, it is critical that reporting of genetic counseling interventions in research and other types of studies (e.g. process improvement or service evaluation studies) adopt greater rigor. As in other areas of healthcare, the appraisal, synthesis, and translation of research findings into genetic counseling practice are likely to be improved if clear specifications of genetic counseling interventions are reported when studies involving genetic counseling are published. To help improve reporting practices, the National Society of Genetic Counselors (NSGC) convened a task force in 2015 to develop consensus standards for the reporting of genetic counseling interventions. Following review by the NSGC Board of Directors, the NSGC Practice Guidelines Committee and the editorial board of the Journal of Genetic Counseling, 23 items across 8 domains were proposed as standards for the reporting of genetic counseling interventions in the published literature (GCIRS: Genetic Counseling Intervention Reporting Standards). The authors recommend adoption of these standards by authors and journals when reporting studies involving genetic counseling interventions.

  11. Breaking the code: Statistical methods and methodological issues in psychiatric genetics

    NARCIS (Netherlands)

    Stringer, S.

    2015-01-01

    The genome-wide association (GWA) era has confirmed the heritability of many psychiatric disorders, most notably schizophrenia. Thousands of genetic variants with individually small effect sizes cumulatively constitute a large contribution to the heritability of psychiatric disorders. This thesis

  12. [Assisted reproduction and artificial insemination and genetic manipulation in the Criminal Code of the Federal District, Mexico].

    Science.gov (United States)

    Brena Sesma, Ingrid

    2004-01-01

    The purpose of this article is to outline and comment on the recent modifications to the Penal Code for the Federal District of México, which establish, for the first time, crimes related to assisted procreation and genetic manipulation. The article also refers to the interaction of the new legal texts with the country's health legislation. As will be shown, in some cases there are conflicts between the penal and the health regulations, and some points related to the lawfulness or unlawfulness of a given conduct remain insufficiently developed. These gaps will complicate the application of the new rules of the Penal Code of the Federal District.

  13. Genetic variants in long non-coding RNA MIAT contribute to risk of paranoid schizophrenia in a Chinese Han population.

    Science.gov (United States)

    Rao, Shu-Quan; Hu, Hui-Ling; Ye, Ning; Shen, Yan; Xu, Qi

    2015-08-01

    The heritability of schizophrenia has been reported to be as high as ~80%, but the contribution of the genetic variants identified so far to this heritability remains to be estimated. Long non-coding RNAs (lncRNAs) are involved in multiple processes critical to normal cellular function, and dysfunction of the lncRNA MIAT may contribute to the pathophysiology of schizophrenia. However, genetic evidence for the involvement of lncRNAs in schizophrenia has not been documented. Here, we conducted a two-stage association analysis of 8 tag SNPs that cover the whole MIAT locus in two independent Han Chinese schizophrenia case-control cohorts (discovery sample from Shanxi Province: 1093 patients with paranoid schizophrenia and 1180 control subjects; replication cohort from Jilin Province: 1255 cases and 1209 healthy controls). In the discovery stage, a significant genetic association with paranoid schizophrenia was observed for rs1894720 (χ(2)=74.20, P=7.1E-18), whose minor allele (T) had an OR of 1.70 (95% CI=1.50-1.91). This association was confirmed in the replication cohort (χ(2)=22.66, P=1.9E-06, OR=1.32, 95% CI=1.18-1.49). In addition, a weak genotypic association was detected for rs4274 (χ(2)=4.96, df=2, P=0.03); AA carriers showed increased disease risk (OR=1.30, 95% CI=1.03-1.64). No significant association was found between any haplotype and paranoid schizophrenia. The present study showed that the lncRNA MIAT is a novel susceptibility gene for paranoid schizophrenia in the Han Chinese population. Considering that most lncRNAs are located in non-coding regions, our result may explain why most susceptibility loci for schizophrenia identified by genome-wide association studies lie outside coding regions. Copyright © 2015 Elsevier B.V. All rights reserved.
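
    The association statistics quoted above (allelic odds ratios with 95% confidence intervals and chi-square values) follow standard 2x2-table formulas. The sketch below computes them from allele counts; the counts used in the example are made-up placeholders, not data from the cited study.

```python
# Sketch: allele-based association statistics of the kind reported above
# (odds ratio with 95% CI and a 2x2 chi-square). The counts below are
# made-up placeholders, not the genotype data from the cited study.
import math

def allelic_association(case_minor, case_major, ctrl_minor, ctrl_major):
    a, b, c, d = case_minor, case_major, ctrl_minor, ctrl_major
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)     # SE of ln(OR)
    ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
          math.exp(math.log(odds_ratio) + 1.96 * se_log_or))
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return odds_ratio, ci, chi2

if __name__ == "__main__":
    or_, ci, chi2 = allelic_association(900, 1286, 760, 1600)  # placeholder counts
    print(f"OR={or_:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f}), chi2={chi2:.1f}")
```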

  14. The development and standardization of testing methods for genetically modified organisms and their derived products.

    Science.gov (United States)

    Zhang, Dabing; Guo, Jinchao

    2011-07-01

    As the worldwide commercialization of genetically modified organisms (GMOs) increases and consumers are concerned about the safety of GMOs, many countries and regions are issuing labeling regulations on GMOs and their products. Analytical methods and their standardization for GM ingredients in foods and feed are essential for the implementation of labeling regulations. To date, GMO testing methods have mainly been based on the inserted DNA sequences and the newly produced proteins in GMOs. This paper presents an overview of GMO testing methods as well as their standardization. © 2011 Institute of Botany, Chinese Academy of Sciences.

  15. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with its performance and complexity co-evaluation. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor 2 for the decoder and larger than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework allowing for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  16. An enhancement of selection and crossover operations in real-coded genetic algorithm for large-dimensionality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Noh Sung; Lee, Jongsoo [Yonsei University, Seoul (Korea, Republic of)

    2016-01-15

    The present study aims to implement a new selection method and a novel crossover operation in a real-coded genetic algorithm. The proposed selection method facilitates the establishment of a successively evolved population by combining several subpopulations: an elitist subpopulation, an offspring subpopulation and a mutated subpopulation. A probabilistic crossover is performed based on the measure of probabilistic distance between the individuals. The concept of 'allowance' is suggested to describe the level of variance in the crossover operation. A number of nonlinear/non-convex functions and engineering optimization problems are explored to verify the capabilities of the proposed strategies. The results are compared with those obtained from other genetic and nature-inspired algorithms.
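
    A minimal sketch of the two ideas described above, building the next generation from elitist, offspring and mutated subpopulations, and a blend-style crossover whose spread is controlled by an 'allowance' term, is given below on a toy test function. Parameter values and the test function are assumptions for illustration, not the paper's benchmark setup.

```python
# Sketch of selection that builds the next generation from three
# subpopulations (elites, blended offspring, mutated copies) and a
# blend crossover whose spread is controlled by an "allowance" term.
# Test function and parameters are illustrative, not those of the paper.
import random

def sphere(x):                      # simple test objective (minimize)
    return sum(v * v for v in x)

def blend_crossover(p1, p2, allowance=0.3):
    """Child gene drawn around the parents, widened by `allowance`."""
    child = []
    for a, b in zip(p1, p2):
        lo, hi = min(a, b), max(a, b)
        span = (hi - lo) * allowance
        child.append(random.uniform(lo - span, hi + span))
    return child

def next_generation(pop, n_elite=5, n_offspring=15, n_mutant=10, sigma=0.2):
    pop = sorted(pop, key=sphere)
    elites = pop[:n_elite]
    offspring = [blend_crossover(*random.sample(elites, 2))
                 for _ in range(n_offspring)]
    mutants = [[g + random.gauss(0, sigma) for g in random.choice(elites)]
               for _ in range(n_mutant)]
    return elites + offspring + mutants

if __name__ == "__main__":
    dim = 5
    population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(30)]
    for _ in range(100):
        population = next_generation(population)
    print("best value:", round(sphere(min(population, key=sphere)), 6))
```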

  17. 75 FR 66735 - National Fire Protection Association (NFPA): Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2010-10-29

    ... 59A Standard for the Production, Storage, and Handling of Liquefied Natural Gas (LNG). NFPA 75... Horizontally in Fire Resistance-Rated Floor Systems. NFPA 385 Standard for Tank Vehicles for Flammable and Combustible Liquids. NFPA 497 Recommended Practice for the Classification of Flammable Liquids, Gases, or...

  18. Croatia - Report on the Observance of Standards and Codes : Accounting and Auditing

    OpenAIRE

    World Bank

    2007-01-01

    This report provides an updated assessment of accounting, financial reporting, and auditing requirements and practices within the enterprise and financial sectors in Croatia. It uses International Financial Reporting Standards (IFRS), International Standards on Auditing (ISA), and the relevant portions of European Union (EU) law (also known as the acquis communautaire). Croatia has made co...

  19. Revision of AESJ standard 'the code of implementation of periodic safety review of nuclear power plants'

    International Nuclear Information System (INIS)

    Hirano, Masashi; Narumiya, Yoshiyuki

    2010-01-01

    The Periodic Safety Review (PSR) was launched in June 1992, when the Agency for Natural Resources and Energy issued a notification that required licensees to conduct a comprehensive review of the safety of each existing nuclear power plant (NPP) once approximately every ten years, based on the latest technical findings, for the purpose of improving the safety of the NPP. In 2006, the Standard Committee of the Atomic Energy Society of Japan established the first version of 'The Standard of Implementation for Periodic Safety Review of Nuclear Power Plants: 2006'. Taking into account developments in the safety regulation of PSR after the issuance of the first version, the Standard Committee has revised the Standard. This paper summarizes the background of PSR and these regulatory developments, the major contents of the Standard, and the focal points of the revision. (author)

  20. Mongolia; Report on the Observance of Standards and Codes-Fiscal Transparency

    OpenAIRE

    International Monetary Fund

    2001-01-01

    This report provides an assessment of fiscal transparency practices in Mongolia against the requirements of the IMF Code of Good Practices on Fiscal Transparency. This paper analyzes the government's participation in the financial and nonfinancial sectors of the economy. Executive Directors appreciated the achievements, and stressed the need for improvements in the areas of fiscal transparency. They emphasized the need for addressing weaknesses of fiscal data, maintaining a legal framework fo...

  1. Quality assurance of the French nuclear market - IAEA code and standardization

    International Nuclear Information System (INIS)

    Pavaux, F.

    1980-06-01

    The fact that Quality Assurance was imported from abroad and our reluctance to reach agreement on single and accurate texts explain, if not excuse, the abundance of reference requirements existing on the French nuclear market with respect to Quality Assurance Programmes. But all is not lost, since the IAEA Good Practice Code is perhaps the solution that, in a few years' time, will enable all French industrialists to work and be assessed by their customers according to the same reference text [fr

  2. The Canadian approach to nuclear codes and standards. A CSA forum for development of standards for CANDU: radioactive waste management and decommissioning

    International Nuclear Information System (INIS)

    Shin, T.; Azeez, S.; Dua, S.

    2006-01-01

    Together with the Canadian Standards Association (CSA), industry stakeholders, governments, and the public have developed a suite of standards for CANDU nuclear power plants that generate electricity in Canada and abroad. In this paper, we will describe: CSA's role in national and international nuclear standards development; the key issues and priority projects that the nuclear standards program has addressed; the new CSA nuclear committees and projects being established, particularly those related to waste management and decommissioning; the hierarchy of nuclear regulations, nuclear, and other standards in Canada, and how they are applied by AECL; the standards management activities; and the future trends and challenges for CSA and the nuclear community. CSA is an accredited Standards Development Organization (SDO) and part of the international standards system. CSA's Nuclear Strategic Steering Committee (NSSC) provides leadership, direction, and support for a standards committee hierarchy comprised of members from a balanced matrix of interests. The NSSC strategically focuses on industry challenges: a new nuclear regulatory system, deregulated energy markets, and industry restructuring. As the first phase of priority projects is nearing completion, the next phase of priorities is being identified. These priorities address radioactive waste management, environmental radiation management, decommissioning, structural, and seismic issues. As the CSA committees get established in the coming year, members and input will be solicited for the technical committees, subcommittees, and task forces for the following related subjects: 1. Radioactive Waste Management: a) Dry Storage of Irradiated Fuel; b) Short-Term Radioactive Waste Management; c) Long-Term Storage and Disposal of Radioactive Waste; 2. Decommissioning. Nuclear power is highly regulated, and public scrutiny has focused Codes and Standards on public and worker safety. Licensing and regulation serve to control

  3. MISTRA facility for containment lumped parameter and CFD codes validation. Example of the International Standard Problem ISP47

    International Nuclear Information System (INIS)

    Tkatschenko, I.; Studer, E.; Paillere, H.

    2005-01-01

    During a severe accident in a Pressurized Water Reactor (PWR), the formation of a combustible gas mixture in the complex geometry of the reactor depends on the understanding of hydrogen production, the complex 3D thermal-hydraulics flow due to gas/steam injection, natural convection, heat transfer by condensation on walls and effect of mitigation devices. Numerical simulation of such flows may be performed either by Lumped Parameter (LP) or by Computational Fluid Dynamics (CFD) codes. Advantages and drawbacks of LP and CFD codes are well-known. LP codes are mainly developed for full size containment analysis but they need improvements, especially since they are not able to accurately predict the local gas mixing within the containment. CFD codes require a process of validation on well-instrumented experimental data before they can be used with a high degree of confidence. The MISTRA coupled effect test facility has been built at CEA to fulfil this validation objective: with numerous measurement points in the gaseous volume - temperature, gas concentration, velocity and turbulence - and with well controlled boundary conditions. As illustration of both experimental and simulation areas of this topic, a recent example in the use of MISTRA test data is presented for the case of the International Standard Problem ISP47. The proposed experimental work in the MISTRA facility provides essential data to fill the gaps in the modelling/validation of computational tools. (author)

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, L.M.; Jordon, W.C. [Oak Ridge National Lab., TN (United States); Edwards, A.L. [Oak Ridge National Lab., TN (United States)]|[Lawrence Livermore National Lab., CA (United States)] [and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice; (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Landers, N.F.; Petrie, L.M.; Knight, J.R. [Oak Ridge National Lab., TN (United States)] [and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries.

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    International Nuclear Information System (INIS)

    Petrie, L.M.; Jordon, W.C.; Edwards, A.L.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice; (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.; Knight, J.R.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries

  8. What is expected of academic societies. Activities of the code and standard divisions in JSME, AESJ and JEA

    International Nuclear Information System (INIS)

    Miyano, Hiroshi

    2005-01-01

    The present state of the standards-development activities of the three academic societies named in the title is summarized from the viewpoint of 'explanation accountability'. Their roles and the directions in which they should move forward are also summarized. Specifically, the trends of these standards-developing societies in relation to the nuclear industry are reviewed. Focusing on the three indispensable conditions (neutrality, impartiality and transparency) for a standards-developing body and their implementation, the verification of these conditions and of 'explanation accountability' in the three societies is described. (K. Kato)

  9. Czech Republic; Report on Observance of Standards and Codes-Fiscal Transparency Module-Update

    OpenAIRE

    International Monetary Fund

    2003-01-01

    The Czech government has made further progress in improving fiscal transparency that was already high by international standards. The measures implemented to broaden the coverage of general government data have been commended. Improved reporting on fiscal risks, including those arising from contingent liabilities, has been welcomed. However, greater effort is needed to improve the public availability of fiscal data and to maintain regular tax expenditure reports. Ensuring appropriate standard...

  10. Physician involvement enhances coding accuracy to ensure national standards: an initiative to improve awareness among new junior trainees.

    Science.gov (United States)

    Nallasivan, S; Gillott, T; Kamath, S; Blow, L; Goddard, V

    2011-06-01

    Record Keeping Standards is a development led by the Royal College of Physicians of London (RCP) Health Informatics Unit and funded by the National Health Service (NHS) Connecting for Health. A supplementary report produced by the RCP makes a number of recommendations based on a study held at an acute hospital trust. We audited the medical notes and coding to assess the accuracy, documentation by the junior doctors and also to correlate our findings with the RCP audit. Northern Lincolnshire & Goole Hospitals NHS Foundation Trust has 114,000 'finished consultant episodes' per year. A total of 100 consecutive medical (50) and rheumatology (50) discharges from Diana Princess of Wales Hospital from August-October 2009 were reviewed. The results showed an improvement in coding accuracy (10% errors), comparable to the RCP audit but with 5% documentation errors. Physician involvement needs enhancing to improve the effectiveness and to ensure clinical safety.

  11. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code

    Science.gov (United States)

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-01

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNALysUUU with hypermodified 5-methylaminomethyl-2-thiouridine (mnm5s2U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine-pyrimidine mismatches. We show that mnm5s2U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism.

  12. Innovation of genetic algorithm code GenA for WWER fuel loading optimization

    International Nuclear Information System (INIS)

    Sustek, J.

    2005-01-01

    One of the stochastic search techniques, genetic algorithms, was recently used for optimization of the arrangement of fuel assemblies (FA) in the cores of WWER-440 and WWER-1000 reactors. The basic algorithm was modified by incorporating the SPEA scheme. Both were enhanced and some results are presented. (Authors)
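
    The SPEA scheme mentioned above maintains an external archive of non-dominated solutions. The sketch below shows the Pareto-dominance bookkeeping behind such an archive for a generic two-objective minimization; the sample points are arbitrary and the reactor objectives are not modeled.

```python
# Sketch of the Pareto-dominance bookkeeping behind an SPEA-style scheme:
# an external archive that keeps only non-dominated solutions. Objectives
# here are generic (both minimized); the reactor objectives are not modeled.
def dominates(f_a, f_b):
    """True if solution a is at least as good in every objective and
    strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(f_a, f_b)) and any(
        x < y for x, y in zip(f_a, f_b))

def update_archive(archive, candidate):
    """Insert candidate unless dominated; drop archive members it dominates."""
    if any(dominates(member, candidate) for member in archive):
        return archive
    return [m for m in archive if not dominates(candidate, m)] + [candidate]

if __name__ == "__main__":
    points = [(1.0, 5.0), (2.0, 2.0), (3.0, 3.0), (0.5, 6.0), (2.5, 1.5)]
    archive = []
    for p in points:
        archive = update_archive(archive, p)
    print(sorted(archive))  # non-dominated front: (3.0, 3.0) is excluded
```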

  13. Common and Rare Coding Genetic Variation Underlying the Electrocardiographic PR Interval

    DEFF Research Database (Denmark)

    Lin, Honghuang; van Setten, Jessica; Smith, Albert V

    2018-01-01

    BACKGROUND: Electrical conduction from the cardiac sinoatrial node to the ventricles is critical for normal heart function. Genome-wide association studies have identified more than a dozen common genetic loci that are associated with PR interval. However, it is unclear whether rare and low-frequ...

  14. Edible safety requirements and assessment standards for agricultural genetically modified organisms.

    Science.gov (United States)

    Deng, Pingjian; Zhou, Xiangyang; Zhou, Peng; Du, Zhong; Hou, Hongli; Yang, Dongyan; Tan, Jianjun; Wu, Xiaojin; Zhang, Jinzhou; Yang, Yongcun; Liu, Jin; Liu, Guihua; Li, Yonghong; Liu, Jianjun; Yu, Lei; Fang, Shisong; Yang, Xiaoke

    2008-05-01

    This paper describes the background, principles, concepts and methods of framing the technical regulation for edible safety requirement and assessment of agricultural genetically modified organisms (agri-GMOs) for Shenzhen Special Economic Zone in the People's Republic of China. It provides a set of systematic criteria for edible safety requirements and the assessment process for agri-GMOs. First, focusing on the degree of risk and impact of different agri-GMOs, we developed hazard grades for toxicity, allergenicity, anti-nutrition effects, and unintended effects and standards for the impact type of genetic manipulation. Second, for assessing edible safety, we developed indexes and standards for different hazard grades of recipient organisms, for the influence of types of genetic manipulation and hazard grades of agri-GMOs. To evaluate the applicability of these criteria and their congruency with other safety assessment systems for GMOs applied by related organizations all over the world, we selected some agri-GMOs (soybean, maize, potato, capsicum and yeast) as cases to put through our new assessment system, and compared our results with the previous assessments. It turned out that the result of each of the cases was congruent with the original assessment.

  15. Preimplantation genetic diagnosis: International standards and the law of the republic of Serbia

    Directory of Open Access Journals (Sweden)

    Rajić Nataša

    2014-01-01

    The process of biomedically assisted reproduction, in addition to the treatment of infertility, can also be used to prevent the transmission of serious hereditary diseases to offspring. This is possible thanks to preimplantation genetic diagnosis, which involves genetic testing of a few cells of the embryo at an early stage of development before implantation in a woman's body, and elimination of the embryo if a genetic anomaly is found. Preimplantation genetic diagnosis touches on several constitutional values and raises a series of questions. Some of them were answered by the European Court of Human Rights in the case Costa and Pavan v. Italy. The subject of this paper is the analysis of that decision, which is important from a constitutional point of view because it establishes guidelines for the interpretation of rules of domestic law. The second task of the paper is the analysis of the normative solutions of the Serbian legal system in this area, in order to test their compliance with the standards set in the Court's decision.

  16. Contribution to the panel discussion on 'International developments in standards, rules and codes for pressure vessels'

    International Nuclear Information System (INIS)

    Puell, K.

    1992-01-01

    This contribution to the discussion describes the legal system for technical installations in its existing form in the Federal Republic of Germany. The standardization of technical requirements to meet EC Directives and European Standards requires an adjustment, to a limited extent, of the appurtenant legal prerequisites. Given the trend away from tried and tested control mechanisms in the form of third party inspection, there is imminent danger of a reduction in quality. This compels us to consider how to maintain nonetheless the level of safety that has already been reached. (orig.)

  17. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
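
    As a small data-level illustration of what 'global standards-based bar coding' entails, the sketch below computes a GS1-style check digit (alternating weights of 3 and 1 from the rightmost data digit, with the check digit completing a multiple of 10); the sample number is arbitrary and the function name is mine, not part of the systems piloted above.

```python
# Sketch: GS1 check-digit computation, the arithmetic underlying GTIN bar codes
# (EAN-13 / GTIN-14 style). Shown only to illustrate the kind of data-level
# standardization involved; the sample GTIN is arbitrary.
def gs1_check_digit(digits_without_check: str) -> int:
    """Weights of 3 and 1 are applied alternately, starting with 3 at the
    rightmost data digit; the check digit makes the total a multiple of 10."""
    total = 0
    for i, ch in enumerate(reversed(digits_without_check)):
        weight = 3 if i % 2 == 0 else 1
        total += int(ch) * weight
    return (10 - total % 10) % 10

if __name__ == "__main__":
    body = "400638133393"                       # 12-digit EAN-13 body
    print(body + str(gs1_check_digit(body)))    # -> 4006381333931
```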

  18. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools that are used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such ISP exercise. The ISP No. 41 exercise was born of a recommendation made at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes'. [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementarity with other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling and (3) good-quality data. A simple RTF experiment performed under controlled

  19. An RNA Phage Lab: MS2 in Walter Fiers' laboratory of molecular biology in Ghent, from genetic code to gene and genome, 1963-1976.

    Science.gov (United States)

    Pierrel, Jérôme

    2012-01-01

    The importance of viruses as model organisms is well-established in molecular biology and Max Delbrück's phage group set standards in the DNA phage field. In this paper, I argue that RNA phages, discovered in the 1960s, were also instrumental in the making of molecular biology. As part of experimental systems, RNA phages stood for messenger RNA (mRNA), genes and genome. RNA was thought to mediate information transfers between DNA and proteins. Furthermore, RNA was more manageable at the bench than DNA due to the availability of specific RNases, enzymes used as chemical tools to analyse RNA. Finally, RNA phages provided scientists with a pure source of mRNA to investigate the genetic code, genes and even a genome sequence. This paper focuses on Walter Fiers' laboratory at Ghent University (Belgium) and their work on the RNA phage MS2. When setting up his Laboratory of Molecular Biology, Fiers planned a comprehensive study of the virus with a strong emphasis on the issue of structure. In his lab, RNA sequencing, now a little-known technique, evolved gradually from a means to solve the genetic code, to a tool for completing the first genome sequence. Thus, I follow the research pathway of Fiers and his 'RNA phage lab' with their evolving experimental system from 1960 to the late 1970s. This study illuminates two decisive shifts in post-war biology: the emergence of molecular biology as a discipline in the 1960s in Europe and of genomics in the 1990s.

  20. The development of speech coding and the first standard coder for public mobile telephony

    NARCIS (Netherlands)

    Sluijter, R.J.

    2005-01-01

    This thesis describes in its core chapter (Chapter 4) the original algorithmic and design features of the first coder for public mobile telephony, the GSM full-rate speech coder, as standardized in 1988. It has never been described in so much detail as presented here. The coder is put in a ...

  1. Corps et culture: les codes de savoir-vivre (Body and Culture: The Standards of Etiquette).

    Science.gov (United States)

    Picard, Dominique

    1983-01-01

    The evolution of values and standards of behavior as they relate to the body in culture are examined, especially in light of recent trends toward recognition of the natural and the spontaneous, the positive value placed on sexuality, and at the same time, narcissism and emphasis on youth. (MSE)

  2. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter Lewinski

    2015-09-01

    Little is known about people's accuracy of recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores of 100 typical, neutral front-up facial images with scores of an arguably objective judge – automated facial coding (AFC) software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provided the first-ever evidence that computer software (90%) was more accurate in recognizing neutral faces than people were (59%). I posited two theoretical mechanisms, i.e. smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  3. The safety relief valve handbook design and use of process safety valves to ASME and International codes and standards

    CERN Document Server

    Hellemans, Marc

    2009-01-01

    The Safety Valve Handbook is a professional reference for design, process, instrumentation, plant and maintenance engineers who work with fluid flow and transportation systems in the process industries, which cover the chemical, oil and gas, water, paper and pulp, food and bio products and energy sectors. It meets the needs of engineers who have responsibilities for specifying, installing, inspecting or maintaining safety valves and flow control systems. It will also be an important reference for process safety and loss prevention engineers, environmental engineers, and plant and process designers who need to understand the operation of safety valves in a wider equipment or plant design context. No other publication is dedicated to safety valves or to the extensive codes and standards that govern their installation and use. A single source means users save time in searching for specific information about safety valves. The Safety Valve Handbook contains all of the vital technical and standards informat...

  4. Benefit using reasonable regulations in USA, how to skill up on professional engineers, apply international code, standard, and regulation

    International Nuclear Information System (INIS)

    Turner, S.L.; Morokuzu, Muneo; Amano, Osamu

    2005-01-01

    Reasonable regulation in the USA consists of a graduated approach and a risk-informed approach (RIA). RIA rationalizes the regulations on the basis of operational data and similar evidence. PSA (Probabilistic Safety Assessment), a general method of RIA, is explained in detail. The benefits for a nuclear power plant of using RIA are an increased rate of operation, visualization of risk, more rational application of design standards and designs, cost reductions in the nuclear fuel cycle, waste management, production and operation, and improved safety. RIA is supported by field data, codes, standards, regulations and professional engineers. The effects of introducing RIA are explained. In order to introduce RIA in Japan, all the parties concerned, such as the regulatory authorities, the electric power industries, manufacturers and universities, have to understand it and work together. The role of a part of the scientific community is also described. (S.Y.)

  5. Calculation of low-cycle fatigue in accordance with the national standard and strength codes

    Science.gov (United States)

    Kontorovich, T. S.; Radin, Yu. A.

    2017-08-01

    Over the most recent 15 years, the Russian power industry has relied largely on imported equipment manufactured in compliance with foreign standards and procedures. This inevitably necessitates their harmonization with the regulatory documents of the Russian Federation, which include calculations of strength and low-cycle fatigue and assessment of the equipment service life. An important regulatory document providing the engineering foundation for cyclic strength and life assessment of high-load components of the boiler and steam lines of a water/steam circuit is RD 10-249-98:2000, Standard Method of Strength Estimation in Stationary Boilers and Steam and Water Piping. In January 2015, the National Standard of the Russian Federation GOST R 55682.3-2013/EN 12952-3:2001 was introduced, regulating the design and calculation of the pressure parts of water-tube boilers and auxiliary installations. Thus, two documents are simultaneously valid in the same field, using different methods for calculating low-cycle fatigue strength and leading to different results. This situation can create incorrect ideas about the cyclic strength and the service life of high-temperature boiler parts. The article shows that the results of calculations performed in accordance with GOST R 55682.3-2013/EN 12952-3:2001 are less conservative than the results of the standard RD 10-249-98. Since the calculation of the expected service life of boiler parts should use GOST R 55682.3-2013/EN 12952-3:2001, it becomes necessary to establish the applicability scope of each of the above documents.

  6. GoldenBraid: An Iterative Cloning System for Standardized Assembly of Reusable Genetic Modules

    Science.gov (United States)

    Sarrion-Perdigones, Alejandro; Falconi, Erica Elvira; Zandalinas, Sara I.; Juárez, Paloma; Fernández-del-Carmen, Asun; Granell, Antonio; Orzaez, Diego

    2011-01-01

    Synthetic Biology requires efficient and versatile DNA assembly systems to facilitate the building of new genetic modules/pathways from basic DNA parts in a standardized way. Here we present GoldenBraid (GB), a standardized assembly system based on type IIS restriction enzymes that allows the indefinite growth of reusable gene modules made of standardized DNA pieces. The GB system consists of a set of four destination plasmids (pDGBs) designed to incorporate multipartite assemblies made of standard DNA parts and to combine them binarily to build increasingly complex multigene constructs. The relative position of type IIS restriction sites inside pDGB vectors introduces a double loop (“braid”) topology in the cloning strategy that allows the indefinite growth of composite parts through the succession of iterative assembling steps, while the overall simplicity of the system is maintained. We propose the use of GoldenBraid as an assembly standard for Plant Synthetic Biology. For this purpose we have GB-adapted a set of binary plasmids for A. tumefaciens-mediated plant transformation. Fast GB-engineering of several multigene T-DNAs, including two alternative modules made of five reusable devices each, and comprising a total of 19 basic parts are also described. PMID:21750718
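
    The iterative, binary assembly logic described above can be sketched abstractly: any two parts, basic or composite, combine into a new composite that itself remains a reusable input for the next round, which is what allows indefinite growth. The toy model below (nested tuples standing in for DNA parts) is a conceptual sketch only and does not represent the pDGB plasmids or the type IIS restriction chemistry.

    # Conceptual sketch of GoldenBraid-style iterative binary assembly:
    # any two existing parts (basic or composite) can be combined into a new
    # composite, which can itself be combined again.

    def assemble(part_a, part_b):
        """Combine two parts into a composite; composites remain reusable parts."""
        return (part_a, part_b)

    # basic standardized parts (placeholders for promoter/CDS/terminator modules)
    p1, cds1, t1 = "Pro1", "CDS1", "Term1"
    p2, cds2, t2 = "Pro2", "CDS2", "Term2"

    tu1 = assemble(assemble(p1, cds1), t1)   # transcriptional unit 1
    tu2 = assemble(assemble(p2, cds2), t2)   # transcriptional unit 2
    multigene = assemble(tu1, tu2)           # binary combination of composites
    print(multigene)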

  7. Dynamics of genetic variation at gliadin-coding loci in bread wheat cultivars developed in the Small Grains Research Center (Kragujevac) during the last 35 years

    Directory of Open Access Journals (Sweden)

    Novosljska-Dragovič Aleksandra

    2005-01-01

    Multiple alleles of gliadin-coding loci are well-known genetic markers of common wheat genotypes. Based on an analysis of gliadin patterns in common wheat cultivars developed at the Small Grains Research Center in Kragujevac, the dynamics of genetic variability at gliadin-coding loci has been surveyed over a period of 35 years. It was shown that long-term breeding of the wheat cultivars involved gradual replacement of ancient alleles by alleles widely spread in some regions of the world, which belong to well-known donor cultivars of important traits. Developing cultivars whose pedigrees involved much new foreign genetic material has increased genetic diversity and has changed the frequency of alleles at gliadin-coding loci. We can therefore conclude that the genetic profile of modern Serbian cultivars has changed considerably. A gliadin genetic formula was constructed for each cultivar studied. The alleles of gliadin-coding loci most frequent among modern cultivars should be of great interest to breeders, because these alleles are probably linked with genes that confer an advantage to their carriers at present.

  8. Overview of the U.S. DOE Hydrogen Safety, Codes and Standards Program. Part 4: Hydrogen Sensors; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J.; Rivkin, Carl; Burgess, Robert; Brosha, Eric; Mukundan, Rangachary; James, C. Will; Keller, Jay

    2016-12-01

    Hydrogen sensors are recognized as a critical element in the safety design for any hydrogen system. In this role, sensors can perform several important functions, including indication of unintended hydrogen releases, activation of mitigation strategies to preclude the development of dangerous situations, activation of alarm systems and communication to first responders, and initiation of system shutdown. The functionality of hydrogen sensors in this capacity is decoupled from the system being monitored, thereby providing an independent safety component that is not affected by the system itself. The importance of hydrogen sensors has been recognized by DOE and by the Fuel Cell Technologies Office's Safety, Codes and Standards (SCS) program in particular, which has for several years supported hydrogen safety sensor research and development. The SCS hydrogen sensor programs are currently led by the National Renewable Energy Laboratory, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory. The current SCS sensor program encompasses the full range of issues related to safety sensors, including development of advanced sensor platforms with exemplary performance, development of sensor-related codes and standards, outreach to stakeholders on the role sensors play in facilitating deployment, technology evaluation, and support on the proper selection and use of sensors.

  9. Guided waves dispersion equations for orthotropic multilayered pipes solved using standard finite elements code.

    Science.gov (United States)

    Predoi, Mihai Valentin

    2014-09-01

    The dispersion curves for hollow multilayered cylinders are prerequisites for any practical guided waves application on such structures. The equations for homogeneous isotropic materials were established more than 120 years ago. The difficulties in finding numerical solutions to the analytic expressions remain considerable, especially if the materials are orthotropic and visco-elastic, as in the composites used for pipes in recent decades. Among other numerical techniques, the semi-analytical finite elements method has proven its capability of solving this problem. Two possibilities exist for modelling a finite elements eigenvalue problem: a two-dimensional cross-section model of the pipe or a radial segment model intersecting the layers between the inner and the outer radius of the pipe. The latter possibility is adopted here, and distinct differential problems are deduced for longitudinal L(0,n), torsional T(0,n) and flexural F(m,n) modes. Eigenvalue problems are deduced for the three mode classes, offering explicit forms of each coefficient for the matrices used in an available general purpose finite elements code. Comparisons with existing solutions for pipes filled with non-linear viscoelastic fluid or with visco-elastic coatings, as well as for a fully orthotropic hollow cylinder, all prove the reliability and ease of use of this method. Copyright © 2014 Elsevier B.V. All rights reserved.
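
    As a generic sketch of the semi-analytical finite element idea (not the paper's actual formulation, coefficients or element matrices), one can assemble stiffness and mass matrices for the radial segment model and, for each wavenumber, solve a generalized eigenvalue problem for the angular frequencies; sweeping the wavenumber then traces the dispersion curves.

    # Generic sketch of a semi-analytical FE dispersion calculation:
    # for each wavenumber k, solve (K(k) - omega^2 M) u = 0 for omega.
    # K0, K2 and M below are placeholders; real matrices would come from the
    # radial-segment finite-element discretization described in the record.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    n = 6                                   # tiny placeholder model size
    A = rng.standard_normal((n, n))
    K0 = A @ A.T + n * np.eye(n)            # symmetric positive definite stand-in
    K2 = np.eye(n)
    M = np.eye(n)

    for k in np.linspace(0.1, 5.0, 5):      # sweep the wavenumber
        K = K0 + (k ** 2) * K2              # k-dependent stiffness (schematic)
        omega2, _ = eigh(K, M)              # generalized symmetric eigenproblem
        omega = np.sqrt(omega2[:3])         # lowest three modes
        print(f"k = {k:4.2f}  omega = {omega.round(3)}")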

  10. New Standard Evaluated Neutron Cross Section Libraries for the GEANT4 Code and First Verification

    CERN Document Server

    Mendoza, Emilio; Koi, Tatsumi; Guerrero, Carlos

    2014-01-01

    The Monte Carlo simulation of the interaction of neutrons with matter relies on evaluated nuclear data libraries and models. The evaluated libraries are compilations of measured physical parameters (such as cross sections) combined with predictions of nuclear model calculations which have been adjusted to reproduce the experimental data. The results obtained from the simulations depend largely on the accuracy of the underlying nuclear data used, and thus it is important to have access to the nuclear data libraries available, either of general use or compiled for specific applications, and to perform exhaustive validations which cover the wide scope of application of the simulation code. In this paper we describe the work performed in order to extend the capabilities of the GEANT4 toolkit for the simulation of the interaction of neutrons with matter at neutron energies up to 20 MeV and a first verification of the results obtained. Such a work is of relevance for applications as diverse as the simulation of a n...

  11. Theoretical Characterization of the H-Bonding and Stacking Potential of Two Non-Standard Nucleobases Expanding the Genetic Alphabet

    KAUST Repository

    Chawla, Mohit; Credendino, Raffaele; Chermak, Edrisse; Oliva, Romina; Cavallo, Luigi

    2016-01-01

    … :C base pair, has been introduced in DNA molecules for expanding the genetic code. Our results indicate that the Z:P base pair closely mimics the G:C base pair both in terms of structure and stability. To clarify the role of the NO2 group on the C5 …

  12. Subjective Video Quality Assessment in H.264/AVC Video Coding Standard

    Directory of Open Access Journals (Sweden)

    Z. Miličević

    2012-11-01

    This paper seeks to provide an approach for subjective video quality assessment in the H.264/AVC standard. For this purpose, a special software program for the subjective assessment of the quality of all the tested video sequences is developed. It was developed in accordance with Recommendation ITU-T P.910, since that recommendation is suitable for the testing of multimedia applications. The obtained results show that in the proposed selective intra prediction and optimized inter prediction algorithm there is a small difference in picture quality (signal-to-noise ratio) between decoded original and modified video sequences.
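
    The objective quantity referred to above, the signal-to-noise ratio between the original and the modified decoded sequences, is conventionally reported per frame as peak signal-to-noise ratio (PSNR). The snippet below is a generic PSNR computation on synthetic frames, not the authors' test software.

    # Minimal sketch: per-frame PSNR between a reference and a decoded frame,
    # the objective counterpart to the subjective scores discussed in the record.
    import numpy as np

    def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
        mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")           # identical frames
        return 10.0 * np.log10(peak ** 2 / mse)

    ref = np.random.default_rng(1).integers(0, 256, size=(144, 176), dtype=np.uint8)
    dec = np.clip(ref.astype(int) + np.random.default_rng(2).integers(-3, 4, ref.shape), 0, 255)
    print(f"PSNR = {psnr(ref, dec):.2f} dB")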

  13. Discovery of coding genetic variants influencing diabetes-related serum biomarkers and their impact on risk of type 2 diabetes

    DEFF Research Database (Denmark)

    Ahluwalia, Tarun Veer Singh; Allin, Kristine Højgaard; Sandholt, Camilla Helene

    2015-01-01

    CONTEXT: Type 2 diabetes (T2D) prevalence is spiraling globally, and knowledge of its pathophysiological signatures is crucial for a better understanding and treatment of the disease. OBJECTIVE: We aimed to discover underlying coding genetic variants influencing fasting serum levels of nine......-nucleotide polymorphisms and were tested for association with each biomarker. Identified loci were tested for association with T2D through a large-scale meta-analysis involving up to 17 024 T2D cases and up to 64 186 controls. RESULTS: We discovered 11 associations between single-nucleotide polymorphisms and five distinct......, of which the association with the CELSR2 locus has not been shown previously. CONCLUSION: The identified loci influence processes related to insulin signaling, cell communication, immune function, apoptosis, DNA repair, and oxidative stress, all of which could provide a rationale for novel diabetes...

  14. Numeral series hidden in the distribution of atomic mass of amino acids to codon domains in the genetic code.

    Science.gov (United States)

    Wohlin, Åsa

    2015-03-21

    The distribution of codons in the nearly universal genetic code is a long discussed issue. At the atomic level, the numeral series 2x² (x = 5-0) lies behind electron shells and orbitals. Numeral series appear in formulas for spectral lines of hydrogen. The question here was whether some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3, times 10², revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division into atom kinds, further with main 3rd base groups, backbone chains and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series in a dynamic way may have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x²-series times a factor 16 appeared as a conceivable underlying level, both for the atomic mass and charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent 2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
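
    The two series named in the abstract are easy to reproduce numerically: the electron-shell series 2x² for x = 5..0 and, under one reading of "a numeral series 5 to 0 with exponent 2/3 times 10²", the values x^(2/3)·100. The snippet only lists these numbers; it does not reproduce the codon-domain analysis.

    # The two numeral series referred to in the abstract, computed explicitly.
    shells = [2 * x ** 2 for x in range(5, -1, -1)]                      # 2x^2, x = 5..0
    proposed = [round(x ** (2 / 3) * 100, 1) for x in range(5, -1, -1)]  # x^(2/3) * 10^2

    print("2x^2 series:       ", shells)    # [50, 32, 18, 8, 2, 0]
    print("x^(2/3)*100 series:", proposed)  # [292.4, 252.0, 208.0, 158.7, 100.0, 0.0]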

  15. Aminotryptophan-containing barstar: structure--function tradeoff in protein design and engineering with an expanded genetic code.

    Science.gov (United States)

    Rubini, Marina; Lepthien, Sandra; Golbik, Ralph; Budisa, Nediljko

    2006-07-01

    The indole ring of the canonical amino acid tryptophan (Trp) possesses distinguished features, such as steric bulk, hydrophobicity and a nitrogen atom capable of acting as a hydrogen bond donor. The introduction of an amino group into the indole moiety of Trp yields the structural analogs 4-aminotryptophan ((4-NH2)Trp) and 5-aminotryptophan ((5-NH2)Trp). Their hydrophobicity and spectral properties are substantially different from those of Trp. They resemble the purine bases of DNA and share their capacity for pH-sensitive intramolecular charge transfer. The Trp → aminotryptophan substitution in proteins during ribosomal translation is expected to result in related protein variants that acquire these features. These expectations have been fulfilled by incorporating (4-NH2)Trp and (5-NH2)Trp into barstar, an intracellular inhibitor of the ribonuclease barnase from Bacillus amyloliquefaciens. The crystal structure of (4-NH2)Trp-barstar is similar to that of the parent protein, whereas its spectral and thermodynamic behavior is remarkably different. The Tm value of (4-NH2)Trp- and (5-NH2)Trp-barstar is lowered by about 20 degrees Celsius, and they exhibit strongly reduced unfolding cooperativity and a substantial loss of free energy of folding. Furthermore, a folding kinetics study of (4-NH2)Trp-barstar revealed that the denatured state is even preferred over the native one. The combination of structural and thermodynamic analyses clearly shows how the structures of substituted barstar display a typical structure-function tradeoff: the acquisition of unique pH-sensitive charge transfer as a novel function is achieved at the expense of protein stability. These findings provide new insight into the evolution of the amino acid repertoire of the universal genetic code and highlight possible problems regarding protein engineering and design using an expanded genetic code.

  16. Review of provisions on corrosion fatigue and stress corrosion in WWER and Western LWR Codes and Standards

    International Nuclear Information System (INIS)

    Buckthorpe, D.; Filatov, V.; Tashkinov, A.; Evropin, S.V.; Matocha, K.; Guinovart, J.

    2003-01-01

    Results are presented from a collaborative project performed on behalf of the European Commission Working Group on Codes and Standards. The work covered the contents of current codes and standards, plant experience, and R and D results. Current fatigue design rules use S-N curves based on tests in air. Although WWER and LWR design curves are often similar, they are derived, presented and used in different ways, and it is neither convenient nor appropriate to harmonise them. Similarly, the fatigue crack growth laws used in the various design and in-service inspection rules differ significantly with respect to both growth rates in air and the effects of water reactor environments. Harmonised approaches to the effects of WWER and LWR environments are possible based on results from R and D programmes carried out over the last decade. For carbon and low alloy steels, a consistent approach to both crack initiation and growth can be formulated based on the superposition of environmentally assisted cracking effects on the fatigue crack development. The approach indicates that effects of the water environment are minimal given appropriate control of the oxygen content of the water and/or the sulphur content of the steel. For austenitic stainless steels, different mechanisms may apply, and a harmonised approach is possible at present only for S-N curves. Although substantial progress has been made with respect to corrosion fatigue, more data and a clearer understanding are required in order to write code provisions, particularly in the area of high cycle fatigue. Reactor operation experience shows that stress corrosion cracking of austenitic steels is the most common cause of failure. These failures are associated with high residual stresses combined with high levels of dissolved oxygen or the presence of contaminants. For primary circuit internals there is a potential threat to integrity from irradiation-assisted stress corrosion cracking. Design and in-service inspection rules do not at ...

  17. Origin of an alternative genetic code in the extremely small and GC-rich genome of a bacterial symbiont.

    Directory of Open Access Journals (Sweden)

    John P McCutcheon

    2009-07-01

    The genetic code relates nucleotide sequence to amino acid sequence and is shared across all organisms, with the rare exceptions of lineages in which one or a few codons have acquired novel assignments. Recoding of UGA from stop to tryptophan has evolved independently in certain reduced bacterial genomes, including those of the mycoplasmas and some mitochondria. Small genomes typically exhibit low guanine plus cytosine (GC) content, and this bias in base composition has been proposed to drive UGA Stop to Tryptophan (Stop→Trp) recoding. Using a combination of genome sequencing and high-throughput proteomics, we show that an alpha-Proteobacterial symbiont of cicadas has the unprecedented combination of an extremely small genome (144 kb), a GC-biased base composition (58.4%), and a coding reassignment of UGA Stop→Trp. Although it is not clear why this tiny genome lacks the low GC content typical of other small bacterial genomes, these observations support a role of genome reduction rather than base composition as a driver of codon reassignment.
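
    The reassignment reported here, UGA read as tryptophan rather than as stop, can be illustrated with a toy translation routine that overrides a single entry of the standard codon table. This is a didactic sketch only; the paper's evidence comes from genome sequencing and proteomics, not from software like this.

    # Toy illustration of UGA Stop->Trp recoding: translate the same RNA with the
    # standard table and with a single reassigned codon. Only the codons needed
    # for this example are included in the table.
    STANDARD = {
        "AUG": "M", "UUU": "F", "GGC": "G", "UGG": "W",
        "UGA": "*",  # stop in the standard genetic code
    }
    RECODED = dict(STANDARD, UGA="W")  # UGA reassigned to tryptophan

    def translate(rna: str, table: dict) -> str:
        protein = []
        for i in range(0, len(rna) - 2, 3):
            aa = table[rna[i:i + 3]]
            if aa == "*":
                break                  # stop codon terminates translation
            protein.append(aa)
        return "".join(protein)

    rna = "AUGUUUUGAGGCUGG"
    print(translate(rna, STANDARD))  # 'MF'    - UGA read as stop
    print(translate(rna, RECODED))   # 'MFWGW' - UGA read as tryptophan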

  18. MDEP Technical Report TR-CSWG-01. Technical Report: Regulatory Frameworks for the Use of Nuclear Pressure Boundary Codes and Standards in MDEP Countries

    International Nuclear Information System (INIS)

    2013-01-01

    The Codes and Standards Working Group (CSWG) is one of the issue-specific working groups that the MDEP members are undertaking; its long-term goal is harmonisation of regulatory and code requirements for the design and construction of pressure-retaining components in order to improve the effectiveness and efficiency of regulatory design reviews, increase the quality of safety assessments, and enable each regulator to become stronger in its ability to make safety decisions. The CSWG has interacted closely with the Standards Development Organisations (SDOs) and CORDEL on code comparison and code convergence. The Code Comparison Report STP-NU-051 has been issued by SDO members to identify the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. Besides the differences in codes and standards, the way the codes and standards are applied to systems, structures and components also affects the design and construction of a nuclear power plant. Therefore, to accomplish the goal of potential harmonisation, it is also vital that the regulators learn about each other's procedures, processes, and regulations. To facilitate the learning process, the CSWG meets regularly to discuss issues relevant to licensing new reactors and using codes and standards in licensing safety reviews. The CSWG communicates very frequently with the SDOs to discuss similarities and differences among the various codes and how to proceed with potential harmonisation. It should be noted that the IAEA is invited to all of the issue-specific working groups within MDEP to ensure consistency with IAEA standards. The primary focus of this technical report is to consolidate information shared and accomplishments achieved by the member countries. This report seeks to document how each MDEP regulator utilises national or regional mechanical codes and standards in its safety reviews and licensing of new reactors. The preparation of this report ...

  19. Validation of the ASSERT subchannel code: Prediction of critical heat flux in standard and nonstandard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1995-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of Canada deuterium uranium (CANDU) pressurized heavy water reactor fuel channels and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF in situations outside the existing experimental database. In particular, ASSERT is an appropriate tool for systematically investigating CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. The numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology are discussed. The evolutionary validation plan is also discussed, and early validation exercises are summarized. More recent validation exercises in standard and nonstandard geometries are emphasized.

  20. The Authoritarian Personality in Emerging Adulthood: Longitudinal Analysis Using Standardized Scales, Observer Ratings, and Content Coding of the Life Story.

    Science.gov (United States)

    Peterson, Bill E; Pratt, Michael W; Olsen, Janelle R; Alisat, Susan

    2016-04-01

    Three different methods (a standardized scale, an observer-based Q-sort, and content coding of narratives) were used to study the continuity of authoritarianism longitudinally in emerging and young adults. Authoritarianism was assessed in a Canadian sample (N = 92) of men and women at ages 19 and 32 with Altemeyer's (1996) Right-Wing Authoritarianism (RWA) Scale. In addition, components of the authoritarian personality were assessed at age 26 through Q-sort observer methods (Block, 2008) and at age 32 through content coding of life stories. Age 19 authoritarianism predicted the Q-sort and life story measures of authoritarianism. Two hierarchical regression analyses showed that the Q-sort and life story measures of authoritarianism also predicted the RWA scale at age 32 beyond educational level and parental status, and even after the inclusion of age 19 RWA. Differences and similarities in the pattern of correlates for the Q-sort and life story measures are discussed, including the overall lack of results for authoritarian aggression. Content in narratives may be the result of emerging adult authoritarianism and may serve to maintain levels of authoritarianism in young adulthood. © 2014 Wiley Periodicals, Inc.

  1. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    In the Performance Based Plastic Design method, the failure mechanism is predetermined, which has made the method well known throughout the world. However, due to the lack of proper guidelines and a simple stepwise methodology, it is not yet popular in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned to the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Total collapse of the frame is also prevented in the PBPD frame.

  2. Stochastic optimization of GeantV code by use of genetic algorithms

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
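
    The black-box tuning idea can be summarized in a few lines: a genetic algorithm evolves a population of parameter vectors using nothing but point-wise evaluations of a fitness function. The sketch below uses a cheap placeholder fitness standing in for simulation throughput; it is not the GeantV tuning code, and all parameter names and bounds are assumptions.

    # Schematic genetic algorithm for black-box parameter tuning: only point-wise
    # fitness evaluations are used. The fitness below is a cheap stand-in for an
    # expensive quantity such as simulation throughput.
    import random
    random.seed(42)

    N_PARAMS, POP, GENS = 4, 20, 30
    BOUNDS = (0.0, 10.0)

    def fitness(params):                      # placeholder objective
        return -sum((p - 3.7) ** 2 for p in params)

    def random_individual():
        return [random.uniform(*BOUNDS) for _ in range(N_PARAMS)]

    def crossover(a, b):
        return [random.choice(pair) for pair in zip(a, b)]

    def mutate(ind, rate=0.2, sigma=0.5):
        return [min(max(x + random.gauss(0, sigma), BOUNDS[0]), BOUNDS[1])
                if random.random() < rate else x for x in ind]

    population = [random_individual() for _ in range(POP)]
    for _ in range(GENS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]                     # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    best = max(population, key=fitness)
    print("best parameters:", [round(x, 2) for x in best])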

  3. AECL international standard problem ISP-41 FU/1 follow-up exercise (Phase 1): Containment Iodine Computer Code Exercise: Parametric Studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-06-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I- concentration. The codes used in this exercise were IODE (IPSN), IODE (NRIR), IMPAIR (GRS), INSPECT (AEAT), IMOD (AECL) and LIRIC (AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN facility (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (authors)

  4. Preparation of a standardized, efficacious agricultural H5N3 vaccine by reverse genetics

    International Nuclear Information System (INIS)

    Liu Ming; Wood, John M.; Ellis, Trevor; Krauss, Scott; Seiler, Patrick; Johnson, Christie; Hoffmann, Erich; Humberd, Jennifer; Hulse, Diane; Zhang Yun; Webster, Robert G.; Perez, Daniel R.

    2003-01-01

    Options for the control of emerging and reemerging H5N1 influenza viruses include improvements in biosecurity and the use of inactivated vaccines. Commercially available H5N2 influenza vaccine prevents disease signs and reduces virus load but does not completely prevent virus shedding after challenge with H5N1 virus. By using reverse genetics, we prepared an H5N3 vaccine whose hemagglutinin is 99.6% homologous to that of A/CK/HK/86.3/02 (H5N1). We used the internal genes of A/PR/8/34 and the H5 of A/Goose/HK/437.4/99 (H5N1) after deletion of basic amino acids from its connecting peptide region. The resulting virus was not lethal to chicken embryos and grew to high HA titers in eggs, allowing preparation of HA protein-standardized vaccine in unconcentrated allantoic fluid. The N3 neuraminidase, derived from A/Duck/Germany/1215/73 (H2N3), permitted discrimination between vaccinated and naturally infected birds. The virus construct failed to replicate in quail and chickens. Similar to parental A/PR/8/34 (H1N1), it replicated in mice and ferrets and spread to the brains of mice; therefore, it should not be used as a live-attenuated vaccine. The H5N3 vaccine, at doses of 1.2 μg HA, induced HI antibodies in chickens and prevented death, signs of disease, and markedly reduced virus shedding after challenge with A/CK/HK/86.3/02 (H5N1) but did not provide sterilizing immunity. Thus, reverse genetics allows the inexpensive preparation of standardized, efficacious H5N3 poultry vaccines that may also reduce the reemergence of H5N1 genotypes

  5. Changing priorities of codes and standards: An A/E's perspective for operating units and new generation

    International Nuclear Information System (INIS)

    Meyers, B.L.; Jackson, R.W.; Morowski, B.D.

    1994-01-01

    As the nuclear power industry has shifted emphasis from the construction of new plants to the reliability and maintenance of operating units, the industry's commitment to safety has been well guarded and maintained. Many other important indicators of nuclear industry performance are also positive. Unfortunately, by some projections, as many as 25 operating nuclear units could shut down prematurely because of increasing O&M and total operating costs. The immediate impact of higher generating costs on the nuclear industry is evident. However, when viewed over the longer term, high generating costs will also affect license renewals, progress in the development of advanced light water reactor designs, and prospects for a return to the building of new plants. Today's challenge is to leverage the expertise and contribution of the nuclear industry partner organizations to steadily improve the work processes and methods necessary to reduce operating costs, to achieve higher levels of performance of operating units, and to maintain high standards of technical excellence and safety. From the experience and perspective of an A/E and partner in the nuclear industry, this paper discusses the changing priorities of codes and standards as they relate to opportunities for the communication of lessons learned and improving the responsiveness to industry needs.

  6. MDEP Technical Report TR-CSWG-02. Technical Report on Lessons Learnt on Achieving Harmonisation of Codes and Standards for Pressure Boundary Components in Nuclear Power Plants

    International Nuclear Information System (INIS)

    2013-01-01

    This report was prepared by the Multinational Design Evaluation Program's (MDEP's) Codes and Standards Working Group (CSWG). The primary, long-term goal of MDEP's CSWG is to achieve international harmonisation of codes and standards for pressure-boundary components in nuclear power plants. The CSWG recognised early on that the first step to achieving harmonisation is to understand the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. To assist the CSWG in its long-term goals, several standards developing organisations (SDOs) from various countries performed a comparison of their pressure-boundary codes and standards to identify the extent of similarities and differences in code requirements and the reasons for their differences. The results of the code-comparison project provided the CSWG with valuable insights in developing the subsequent actions to take with SDOs and the nuclear industry to pursue harmonisation of codes and standards. The results enabled the CSWG to understand from a global perspective how each country's pressure-boundary code or standard evolved into its current form and content. The CSWG recognised the important fact that each country's pressure-boundary code or standard is a comprehensive, living document that is continually being updated and improved to reflect changing technology and common industry practices unique to each country. The rules in the pressure-boundary codes and standards include comprehensive requirements for the design and construction of nuclear power plant components including design, materials selection, fabrication, examination, testing and overpressure protection. The rules also contain programmatic and administrative requirements such as quality assurance; conformity assessment (e.g., third-party inspection); qualification of welders, welding equipment and welding procedures; non-destructive examination (NDE) practices and

  7. Improved Transient Performance of a Fuzzy Modified Model Reference Adaptive Controller for an Interacting Coupled Tank System Using Real-Coded Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Asan Mohideen Khansadurai

    2014-01-01

    The main objective of the paper is to design a model reference adaptive controller (MRAC) with improved transient performance. A modification of the standard direct MRAC, called fuzzy modified MRAC (FMRAC), is used in the paper. The FMRAC uses a proportional-control-based Mamdani-type fuzzy logic controller (MFLC) to improve the transient performance of a direct MRAC. The paper proposes the application of a real-coded genetic algorithm (RGA) to tune the membership function parameters of the proposed FMRAC offline so that the transient performance of the FMRAC is improved further. In this study, a GA-based modified MRAC (GAMMRAC), an FMRAC, and a GA-based FMRAC (GAFMRAC) are designed for a coupled tank setup in a hybrid tank process, and their transient performances are compared. The results show that the proposed GAFMRAC gives a better transient performance than the GAMMRAC or the FMRAC. It is concluded that the proposed controller can be used to obtain very good transient performance for the control of nonlinear processes.
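
    What makes the GA "real-coded" is that the chromosome is a vector of real numbers, here the membership-function parameters, manipulated by real-valued operators. The sketch below shows one common pair of such operators, blend (BLX-alpha) crossover and Gaussian mutation, applied to a placeholder parameter vector; the operator choice and the numbers are assumptions, not the authors' implementation.

    # Real-coded GA operators (BLX-alpha crossover and Gaussian mutation) acting
    # on a real-valued chromosome, here standing in for fuzzy membership-function
    # parameters (e.g. triangle centres/widths). Illustrative only.
    import random
    random.seed(7)

    def blx_alpha(parent1, parent2, alpha=0.5):
        """Blend crossover: each child gene is drawn uniformly from an interval
        extended by alpha times the parents' gene range on both sides."""
        child = []
        for g1, g2 in zip(parent1, parent2):
            lo, hi = min(g1, g2), max(g1, g2)
            span = hi - lo
            child.append(random.uniform(lo - alpha * span, hi + alpha * span))
        return child

    def gaussian_mutation(chromosome, rate=0.1, sigma=0.05):
        return [g + random.gauss(0.0, sigma) if random.random() < rate else g
                for g in chromosome]

    # placeholder membership-function parameters for two fuzzy sets
    mf_a = [0.10, 0.30, 0.50, 0.70]
    mf_b = [0.15, 0.35, 0.55, 0.75]

    child = gaussian_mutation(blx_alpha(mf_a, mf_b))
    print([round(g, 3) for g in child])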

  8. Human growth hormone-related iatrogenic Creutzfeldt-Jakob disease: Search for a genetic susceptibility by analysis of the PRNP coding region

    Energy Technology Data Exchange (ETDEWEB)

    Jaegly, A.; Boussin, F.; Deslys, J.P. [CEA/CRSSA/DSV/DPTE, Fontenay-aux-Roses (France)] [and others]

    1995-05-20

    The human PRNP gene encoding PrP is located on chromosome 20 and consists of two exons and a single intron. The open reading frame is entirely fitted into the second exon. Genetic studies indicate that all of the familial and several sporadic forms of TSSEs are associated with mutations in the PRNP 759-bp coding region. Moreover, homozygosity at codon 129, a locus harboring a polymorphism among the general population, was proposed as a genetic susceptibility marker for both sporadic and iatrogenic CJD. To assess whether additional genetic predisposition markers exist in the PRNP gene, the authors sequenced the PRNP coding region of 17 of the 32 French patients who developed a hGH-related CJD.

  9. Nuclear codes and standards

    International Nuclear Information System (INIS)

    Raisic, N.

    1980-01-01

    The present paper deals with quality assurance regulations and analyses the difference between documents that can be used in any country and applied to any industrial organization, and documents some aspects of which are bound to American organizations. (orig./RW)

  10. International standard problem (ISP) No. 43 Rapid boron-dilution transient tests for code verification. Comparison report

    International Nuclear Information System (INIS)

    2001-03-01

    International Standard Problem No. 43 (ISP 43) addresses the nuclear industry's present capabilities for simulating the fluid dynamics aspects of a subset of rapid boron dilution transients. Specifically, the exercise focuses on the sequence involving the transport of a boron-dilute slug through the actuation of a pump. The slug is formed on the primary side of the steam generator as a consequence of an interfacing system leak from the secondary, un-borated coolant. Experimental data were collected using the University of Maryland 2 x 4 Thermalhydraulic Loop (UM 2 x 4 Loop) and the Boron-mixing Visualization Facility. Two blind test series were proposed during the first workshop (October 1998) and refined using participant input. The first series, test series A, deals with the injection of a front, i.e., a single interface between borated and dilute fluids. The second blind series, test series B, is the more realistic injection of a slug, i.e., a dilute fluid volume preceded and followed by the borated coolant of the primary system. Data are collected in the UM 2 x 4 Loop and refined details are obtained from the Visualization Facility, which is a replica of the Loop's vessel downcomer. In the Loop experimental program, the dilute volume is simulated by cold water and the borated primary coolant is simulated by hot water. The Visualization Facility uses dye to mark the diluted front or slug. The measured boundary conditions for both test series include the initial temperature of the primary system, the front/slug injection flowrate and temperature, and the pressure drop across the core. Temperature data are collected at 185 thermocouple positions in the downcomer and 38 positions in the lower plenum. The advancement of the front/slug through the system is monitored at discrete horizontal levels that contain the thermocouples. The performance of codes is measured relative to a set of figures of merit. During the first workshop, the principal figure of merit was ...

  11. Partitioning of genetic variation between regulatory and coding gene segments: the predominance of software variation in genes encoding introvert proteins.

    Science.gov (United States)

    Mitchison, A

    1997-01-01

    In considering genetic variation in eukaryotes, a fundamental distinction can be made between variation in regulatory (software) and coding (hardware) gene segments. For quantitative traits the bulk of variation, particularly that near the population mean, appears to reside in regulatory segments. The main exceptions to this rule concern proteins which handle extrinsic substances, here termed extrovert proteins. The immune system includes an unusually large proportion of this exceptional category, but even so its chief source of variation may well be polymorphism in regulatory gene segments. The main evidence for this view emerges from genome scanning for quantitative trait loci (QTL), which in the case of the immune system points to a major contribution of pro-inflammatory cytokine genes. Further support comes from sequencing of major histocompatibility complex (Mhc) class II promoters, where a high level of polymorphism has been detected. These Mhc promoters appear to act, in part at least, by gating the back-signal from T cells into antigen-presenting cells. Both these forms of polymorphism are likely to be sustained by the need for flexibility in the immune response. Future work on promoter polymorphism is likely to benefit from the input from genome informatics.

  12. Genetic Recombination Between Stromal and Cancer Cells Results in Highly Malignant Cells Identified by Color-Coded Imaging in a Mouse Lymphoma Model.

    Science.gov (United States)

    Nakamura, Miki; Suetsugu, Atsushi; Hasegawa, Kousuke; Matsumoto, Takuro; Aoki, Hitomi; Kunisada, Takahiro; Shimizu, Masahito; Saji, Shigetoyo; Moriwaki, Hisataka; Hoffman, Robert M

    2017-12-01

    The tumor microenvironment (TME) promotes tumor growth and metastasis. We previously established the color-coded EL4 lymphoma TME model with red fluorescent protein (RFP) expressing EL4 implanted in transgenic C57BL/6 green fluorescent protein (GFP) mice. Color-coded imaging of the lymphoma TME suggested an important role of stromal cells in lymphoma progression and metastasis. In the present study, we used color-coded imaging of RFP-lymphoma cells and GFP stromal cells to identify yellow-fluorescent genetically recombinant cells appearing only during metastasis. The EL4-RFP lymphoma cells were injected subcutaneously in C57BL/6-GFP transgenic mice and formed subcutaneous tumors 14 days after cell transplantation. The subcutaneous tumors were harvested and transplanted to the abdominal cavity of nude mice. Metastases to the liver, perigastric lymph node, ascites, bone marrow, and primary tumor were imaged. In addition to EL4-RFP cells and GFP-host cells, genetically recombinant yellow-fluorescent cells were observed only in the ascites and bone marrow. These results indicate genetic exchange between the stromal and cancer cells. Possible mechanisms of genetic exchange are discussed as well as its ramifications for metastasis. J. Cell. Biochem. 118: 4216-4221, 2017. © 2017 Wiley Periodicals, Inc.

  13. Journalistic Ethics and Standards in the Spanish Constitution and National Codes of Conduct from the Perspective of Andalusian Journalism Students

    Directory of Open Access Journals (Sweden)

    María Ángeles López-Hernández

    2013-12-01

    This paper focuses on the opinions held by journalism students at the Faculty of Communication of Seville University about the ethical standards set out in the Spanish Constitution (Article 20.1.d) and the country's codes of conduct. The aim of this paper is to identify the ethical system of values of today's university students, who have not yet been "contaminated" by the profession and on whom the future of journalism in Spain will ultimately depend. Although the results show that journalism students (both 1st-year students and those in their final year) have embraced a fairly solid ethical system of values, they nevertheless believe that the strong influence that economic and political powers currently exert on Spanish media corporations makes it impossible for journalists to cultivate their own work ethic, consequently obliging them to conform to the "unscrupulous" demands of their bosses. Faced with this reality, the authors reflect on the need to reinforce ethical values in the lecture hall as a way of curbing, as soon as possible, the deterioration of journalism that has been detected in Spain.

  14. A symmetry model for genetic coding via a wallpaper group composed of the traditional four bases and an imaginary base E: towards category theory-like systematization of molecular/genetic biology.

    Science.gov (United States)

    Sawamura, Jitsuki; Morishita, Shigeru; Ishigooka, Jun

    2014-05-07

    Previously, we suggested prototypal models that describe some clinical states based on group postulates. Here, we demonstrate a group/category theory-like model for molecular/genetic biology as an alternative application of our previous model. Specifically, we focus on deoxyribonucleic acid (DNA) base sequences. We construct a wallpaper pattern based on a five-letter cruciform motif with letters C, A, T, G, and E. Whereas the first four letters represent the standard DNA bases, the fifth is introduced for ease in formulating group operations that reproduce insertions and deletions of DNA base sequences. A basic group Z5 = {r, u, d, l, n} of operations is defined for the wallpaper pattern, with which a sequence of points can be generated corresponding to changes of a base in a DNA sequence by following the orbit of a point of the pattern under operations in group Z5. Other manipulations of DNA sequence can be treated using a vector-like notation 'Dj' corresponding to a DNA sequence but based on the five-letter base set; also, 'Dj's are expressed graphically. Insertions and deletions of a series of letters 'E' are admitted to assist in describing DNA recombination. Likewise, a vector-like notation Rj can be constructed for sequences of ribonucleic acid (RNA). The wallpaper group B = {Z5×∞, ●} (an ∞-fold Cartesian product of Z5) acts on Dj (or Rj) yielding changes to Dj (or Rj) denoted by 'Dj◦B(j→k) = Dk' (or 'Rj◦B(j→k) = Rk'). Based on the operations of this group, two types of groups-a modulo 5 linear group and a rotational group over the Gaussian plane, acting on the five bases-are linked as parts of the wallpaper group for broader applications. As a result, changes, insertions/deletions and DNA (RNA) recombination (partial/total conversion) are described. As an exploratory study, a notation for the canonical "central dogma" via a category theory-like way is presented for future developments. Despite the large incompleteness of our
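
    The basic ingredient, a cyclic group of order five acting on the base set {C, A, T, G, E} so that successive operations trace an orbit of base changes, can be written down directly. In the sketch below the assignment of the operations {r, u, d, l, n} to shifts modulo 5 is an assumption made purely for illustration; the wallpaper pattern, the Dj notation and the recombination operators are beyond this toy example.

    # Minimal sketch of a Z5 action on the five-letter base set {C, A, T, G, E}.
    # The mapping of the operations {r, u, d, l, n} to shifts mod 5 is assumed
    # purely for illustration.
    BASES = ["C", "A", "T", "G", "E"]
    INDEX = {b: i for i, b in enumerate(BASES)}

    # assumed encoding: n = identity, and r, u, d, l shift by 1..4 (mod 5)
    OPS = {"n": 0, "r": 1, "u": 2, "d": 3, "l": 4}

    def act(op: str, base: str) -> str:
        """Apply one group operation to one base (index shift modulo 5)."""
        return BASES[(INDEX[base] + OPS[op]) % 5]

    def orbit(base: str, ops: str) -> list:
        """Follow a base through a word of operations, e.g. 'rrudn'."""
        path = [base]
        for op in ops:
            path.append(act(op, path[-1]))
        return path

    print(orbit("C", "rrudn"))  # ['C', 'A', 'T', 'E', 'T', 'T'] under the assumed shifts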

  15. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Illinois Jurisdictions

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.; Friedrich, Michele

    2002-05-01

    ASHRAE Standard 90.1-1999 was developed in an effort to set minimum requirements for the energy-efficient design and construction of new commercial buildings. This report assesses the benefits and costs of adopting this standard as the building energy code in Illinois. Energy and economic impacts are estimated using BLAST simulations combined with a life-cycle cost approach to assess the corresponding economic costs and benefits.
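
    The economic side of such an assessment typically reduces to a life-cycle cost comparison: incremental first cost against discounted annual energy savings. The sketch below is a generic net-present-value calculation under assumed numbers; the figures are illustrative and are not taken from the report or from the BLAST simulations.

    # Generic life-cycle cost sketch: does the discounted stream of annual energy
    # savings exceed the incremental first cost of complying with the standard?
    # All numbers are assumptions for illustration, not values from the report.

    def npv_of_savings(annual_saving: float, years: int, discount_rate: float) -> float:
        return sum(annual_saving / (1 + discount_rate) ** t for t in range(1, years + 1))

    incremental_first_cost = 12_000.0   # assumed extra construction cost ($)
    annual_energy_saving = 1_500.0      # assumed annual energy cost saving ($/yr)
    study_period_years = 20
    discount_rate = 0.05

    savings_npv = npv_of_savings(annual_energy_saving, study_period_years, discount_rate)
    net_benefit = savings_npv - incremental_first_cost
    print(f"NPV of savings: ${savings_npv:,.0f};  net benefit: ${net_benefit:,.0f}")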

  16. Cameroonian fruit bats harbor divergent viruses, including rotavirus H, bastroviruses, and picobirnaviruses using an alternative genetic code.

    Science.gov (United States)

    Yinda, Claude Kwe; Ghogomu, Stephen Mbigha; Conceição-Neto, Nádia; Beller, Leen; Deboutte, Ward; Vanhulle, Emiel; Maes, Piet; Van Ranst, Marc; Matthijnssens, Jelle

    2018-01-01

    Most human emerging infectious diseases originate from wildlife and bats are a major reservoir of viruses, a few of which have been highly pathogenic to humans. In some regions of Cameroon, bats are hunted and eaten as a delicacy. This close proximity between human and bats provides ample opportunity for zoonotic events. To elucidate the viral diversity of Cameroonian fruit bats, we collected and metagenomically screened eighty-seven fecal samples of Eidolon helvum and Epomophorus gambianus fruit bats. The results showed a plethora of known and novel viruses. Phylogenetic analyses of the eleven gene segments of the first complete bat rotavirus H genome, showed clearly separated clusters of human, porcine, and bat rotavirus H strains, not indicating any recent interspecies transmission events. Additionally, we identified and analyzed a bat bastrovirus genome (a novel group of recently described viruses, related to astroviruses and hepatitis E viruses), confirming their recombinant nature, and provide further evidence of additional recombination events among bat bastroviruses. Interestingly, picobirnavirus-like RNA-dependent RNA polymerase gene segments were identified using an alternative mitochondrial genetic code, and further principal component analyses suggested that they may have a similar lifestyle to mitoviruses, a group of virus-like elements known to infect the mitochondria of fungi. Although identified bat coronavirus, parvovirus, and cyclovirus strains belong to established genera, most of the identified partitiviruses and densoviruses constitute putative novel genera in their respective families. Finally, the results of the phage community analyses of these bats indicate a very diverse geographically distinct bat phage population, probably reflecting different diets and gut bacterial ecosystems.

  17. An efficient genetic algorithm for structural RNA pairwise alignment and its application to non-coding RNA discovery in yeast

    Directory of Open Access Journals (Sweden)

    Taneda Akito

    2008-12-01

    Background: Aligning RNA sequences with low sequence identity has been a challenging problem, since such a computation essentially needs an algorithm of high complexity to take structural conservation into account. Although many sophisticated algorithms for the purpose have been proposed to date, further improvement in efficiency is necessary to accelerate large-scale applications, including non-coding RNA (ncRNA) discovery. Results: We developed a new genetic algorithm, Cofolga2, for simultaneously computing pairwise RNA sequence alignment and consensus folding, and benchmarked it using BRAliBase 2.1. The benchmark results showed that our new algorithm is accurate and efficient in both time and memory usage. Then, combining it with the originally trained SVM, we applied the new algorithm to novel ncRNA discovery, comparing the S. cerevisiae genome with six related genomes in a pairwise manner. By focusing our search on the relatively short regions (50 bp to 2,000 bp) sandwiched by conserved sequences, we successfully predicted 714 intergenic and 1,311 sense or antisense ncRNA candidates, which were found in pairwise alignments with a stable consensus secondary structure and low sequence identity (≤ 50%). By comparing with previous predictions, we found that > 92% of the candidates are novel. The estimated rate of false positives in the predicted candidates is 51%. Twenty-five percent of the intergenic candidates have support for expression in the cell, i.e. their genomic positions overlap those of the experimentally determined transcripts in the literature. By manual inspection of the results, moreover, we obtained four multiple alignments with low sequence identity which reveal consensus structures shared by three species/sequences. Conclusion: The present method gives an efficient tool complementary to sequence-alignment-based ncRNA finders.

  18. Simple, standardized incorporation of genetic risk into non-genetic risk prediction tools for complex traits: coronary heart disease as an example

    Directory of Open Access Journals (Sweden)

    Benjamin A Goldstein

    2014-08-01

    Full Text Available Purpose: Genetic risk assessment is becoming an important component of clinical decision-making. Genetic Risk Scores (GRSs) allow the composite assessment of genetic risk in complex traits. A technically and clinically pertinent question is how to most easily and effectively combine a GRS with an assessment of clinical risk derived from established non-genetic risk factors, as well as how to clearly present this information to patients and health care providers. Materials & Methods: We illustrate a means to combine a GRS with an independent assessment of clinical risk using a log-link function. We apply the method to the prediction of coronary heart disease (CHD) in the Atherosclerosis Risk in Communities (ARIC) cohort. We evaluate different constructions based on metrics of effect change, discrimination, and calibration. Results: The addition of a GRS to a clinical risk score (CRS) improves both discrimination and calibration for CHD in ARIC. Results are similar regardless of whether external vs. internal coefficients are used for the CRS, risk factor SNPs are included in the GRS, or subjects with diabetes at baseline are excluded. We outline how to report the construction and the performance of a GRS using our method and illustrate a means to present genetic risk information to subjects and/or their health care provider. Conclusion: The proposed method facilitates the standardized incorporation of a GRS in risk assessment.
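
    A minimal sketch of the kind of log-link combination described above is given below, assuming the GRS enters as an additive term on the log-risk scale and is centered on its population mean; the function name, coefficient values and the capping at 1.0 are illustrative assumptions, not necessarily the construction used in the record above.

      import math

      def combine_risk_log_link(clinical_risk, grs, beta_grs, mean_grs):
          # Hypothetical sketch: log(combined risk) = log(clinical risk)
          # + beta_grs * (GRS - population mean GRS), so the combined risk
          # equals the clinical risk for a subject with an average GRS.
          log_risk = math.log(clinical_risk) + beta_grs * (grs - mean_grs)
          return min(math.exp(log_risk), 1.0)  # cap at 1.0 for an absolute risk

      # Illustrative call with made-up numbers (not ARIC estimates)
      print(combine_risk_log_link(clinical_risk=0.08, grs=12.0, beta_grs=0.05, mean_grs=10.0))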

  19. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  20. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  1. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  2. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-2001 as the Commercial Building Energy Code in Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Winiarski, David W.; Belzer, David B.; Richman, Eric E.

    2004-09-30

    ASHRAE Standard 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings (hereafter referred to as ASHRAE 90.1-2001 or 90.1-2001) was developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The State of Tennessee is considering adopting ASHRAE 90.1-2001 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered in this report. Both qualitative and quantitative benefits and costs are assessed. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST) simulations combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits. Tennessee currently has ASHRAE Standard 90A-1980 as the statewide voluntary/recommended commercial energy standard; however, it is up to the local jurisdiction to adopt this code. Because 90A-1980 is the recommended standard, many of the requirements of ASHRAE 90A-1980 were used as a baseline for simulations.

  3. Report on the Observance of Standards and Codes, Accounting and Auditing : Module B - Institutional Framework for Corporate Financial Reporting, B.9 Auditing Standard-setting

    OpenAIRE

    World Bank

    2017-01-01

    The purpose of this report is to gain an understanding of the governance arrangements, procedures, and capacity for setting auditing standards in a jurisdiction, covering: (a) the adoption of International Standards on Auditing (ISA) where applicable, and (b) national auditing standards. The questions are based on examples of good practice followed by international standard-setting bodies....

  4. Ethics Standards Impacting Test Development and Use: A Review of 31 Ethics Codes Impacting Practices in 35 Countries

    Science.gov (United States)

    Leach, Mark M.; Oakland, Thomas

    2007-01-01

    Ethics codes are designed to protect the public by prescribing behaviors professionals are expected to exhibit. Although test use is universal, albeit reflecting strong Western influences, previous studies examined the degree to which issues pertaining to test development and use are addressed in ethics codes of national psychological…

  5. Application of NEA/CSNI standard problem 3 (blowdown and flow reversal in the IETA-1 rig) to the validation of the RELAP-UK Mk IV code

    International Nuclear Information System (INIS)

    Bryce, W.M.

    1977-10-01

    NEA/CSNI Standard Problem 3 consists of the modelling of an experiment on the IETI-1 rig, in which there is initially flow upwards through a feeder, heated section and riser. The inlet and outlet are then closed and a breach opened at the bottom so that the flow reverses and the rig depressurises. Calculations of this problem by many countries using several computer codes have been reported and show a wide spread of results. The purpose of the study reported here was threefold: first, to show the sensitivity of the calculation of Standard Problem 3; second, to perform an ab initio best-estimate calculation using the RELAP-UK Mark IV code with the standard recommended options; and third, to use the results of the sensitivity study to show where tuning of the RELAP-UK Mark IV recommended model options was required. This study has shown that the calculation of Standard Problem 3 is sensitive to model assumptions and that the use of the loss-of-coolant accident code RELAP-UK Mk IV with the standard recommended model options predicts the experimental results very well over most of the transient. (U.K.)

  6. Test Anxiety and a High-Stakes Standardized Reading Comprehension Test: A Behavioral Genetics Perspective

    Science.gov (United States)

    Wood, Sarah G.; Hart, Sara A.; Little, Callie W.; Phillips, Beth M.

    2016-01-01

    Past research suggests that reading comprehension test performance does not rely solely on targeted cognitive processes such as word reading, but also on other nontarget aspects such as test anxiety. Using a genetically sensitive design, we sought to understand the genetic and environmental etiology of the association between test anxiety and…

  7. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  8. Revision of Ethical Standard 3.04 of the "Ethical Principles of Psychologists and Code of Conduct" (2002, as amended 2010).

    Science.gov (United States)

    2016-12-01

    The following amendment to Ethical Standard 3.04 of the 2002 "Ethical Principles of Psychologists and Code of Conduct" as amended, 2010 (the Ethics Code; American Psychological Association, 2002, 2010) was adopted by the APA Council of Representatives at its August 2016 meeting. The amendment will become effective January 1, 2017. Following is an explanation of the change, a clean version of the revision, and a version indicating changes from the 2002 language (inserted text is in italics). (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Michigan

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Belzer, David B.; Halverson, Mark A.; Richman, Eric E.; Winiarski, David W.

    2002-09-30

    The state of Michigan is considering adopting ASHRAE 90.1-1999 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered. Both qualitative and quantitative benefits are assessed. The energy simulation and economic results suggest that adopting ASHRAE 90.1-1999 would provide positive net benefits to the state relative to the building and design requirements currently in place.

  10. Instance-based Policy Learning by Real-coded Genetic Algorithms and Its Application to Control of Nonholonomic Systems

    Science.gov (United States)

    Miyamae, Atsushi; Sakuma, Jun; Ono, Isao; Kobayashi, Shigenobu

    The stabilization control of nonholonomic systems has been extensively studied because it is essential for nonholonomic robot control problems. The difficulty in this problem is that a theoretical derivation of the control policy is not necessarily guaranteed to be achievable. In this paper, we present a reinforcement learning (RL) method with instance-based policy (IBP) representation, in which control policies for this class are optimized with respect to user-defined cost functions. Direct policy search (DPS) is an approach for RL; the policy is represented by parametric models and the model parameters are directly searched by optimization techniques including genetic algorithms (GAs). In IBP representation an instance consists of a state and an action pair; a policy consists of a set of instances. Several DPSs with IBP have been previously proposed. These methods, however, sometimes fail to obtain optimal control policies when the state-action variables are continuous. In this paper, we present a real-coded GA for DPSs with IBP. Our method is specifically designed for continuous domains. Optimization of an IBP has three difficulties: high dimensionality, epistasis, and multi-modality. Our solution is designed to overcome these difficulties. The policy search with IBP representation appears to be a high-dimensional optimization; however, the instances which can improve the fitness are often limited to active instances (instances used for the evaluation). In fact, the number of active instances is small. Therefore, we treat the search problem as a low-dimensional problem by restricting the search variables to active instances only. It is commonly known that functions with epistasis can be efficiently optimized with crossovers which satisfy the inheritance of statistics. For efficient search of an IBP, we propose an extended crossover-like mutation (extended XLM) which generates a new instance around an existing instance while satisfying the inheritance of statistics. For overcoming multi-modality, we
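
    The restriction of the search to "active" instances can be illustrated with the short sketch below; the nearest-neighbour action selection, the Gaussian perturbation and all names are assumptions made for illustration and do not reproduce the authors' extended XLM operator.

      import random

      def nearest_instance(state, policy):
          # Index of the stored instance whose state is closest to `state`
          # (squared Euclidean distance); that instance supplies the action.
          dists = [sum((s - x) ** 2 for s, x in zip(inst_state, state))
                   for inst_state, _ in policy]
          return dists.index(min(dists))

      def evaluate(policy, episode_states):
          # Hypothetical rollout that records which instances were actually
          # used ("active") while producing actions for a sequence of states.
          active = set()
          for state in episode_states:
              idx = nearest_instance(state, policy)
              active.add(idx)
              # policy[idx][1] would be applied to the controlled system here
          return active

      def mutate_active_only(policy, active, sigma=0.1):
          # Perturb only the active instances, so the search behaves as a
          # low-dimensional problem, as suggested in the abstract.
          new_policy = [(list(s), list(a)) for s, a in policy]
          for idx in active:
              s, a = new_policy[idx]
              new_policy[idx] = ([x + random.gauss(0, sigma) for x in s],
                                 [x + random.gauss(0, sigma) for x in a])
          return new_policy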

  11. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  12. Enhancing the power of genetic association studies through the use of silver standard cases derived from electronic medical records.

    Directory of Open Access Journals (Sweden)

    Andrew McDavid

    Full Text Available The feasibility of using imperfectly phenotyped "silver standard" samples identified from electronic medical record diagnoses is considered in genetic association studies when these samples might be combined with an existing set of samples phenotyped with a gold standard technique. An analytic expression is derived for the power of a chi-square test of independence using either research-quality case/control samples alone, or augmented with silver standard data. The subset of the parameter space where inclusion of silver standard samples increases statistical power is identified. A case study of dementia subjects identified from electronic medical records from the Electronic Medical Records and Genomics (eMERGE network, combined with subjects from two studies specifically targeting dementia, verifies these results.
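
    The paper derives an analytic power expression that is not reproduced in the abstract; the sketch below only illustrates a generic noncentral chi-square power calculation for a test of independence, with an assumed attenuation factor standing in for the phenotyping error of the silver-standard samples. All numbers are invented.

      from scipy.stats import chi2, ncx2

      def chisq_power(effect_w, n, df=1, alpha=5e-8):
          # Power of a chi-square test of independence with Cohen's effect
          # size w, using the noncentrality parameter lambda = n * w**2.
          crit = chi2.ppf(1 - alpha, df)
          return ncx2.sf(crit, df, n * effect_w ** 2)

      # Gold-standard samples alone vs. augmentation with silver-standard
      # samples whose misclassification attenuates the apparent effect size
      # (attenuation factor assumed purely for illustration).
      w_gold, n_gold = 0.02, 5000
      n_silver, attenuation = 8000, 0.7
      w_mixed = w_gold * (n_gold + attenuation * n_silver) / (n_gold + n_silver)
      print(chisq_power(w_gold, n_gold), chisq_power(w_mixed, n_gold + n_silver))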

  13. Risk of iron overload in carriers of genetic mutations associated with hereditary haemochromatosis: UK Food Standards Agency workshop.

    Science.gov (United States)

    Singh, Mamta; Ashwell, Margaret; Sanderson, Peter; Cade, Janet; Moreton, Jennifer; Fairweather-Tait, Susan; Roe, Mark; Marx, Joannes J M; Worwood, Mark; Cook, James D

    2006-10-01

    The UK Food Standards Agency convened a group of expert scientists to review current research investigating diet and carriers of genetic mutations associated with hereditary haemochromatosis. The workshop concluded that individuals who are heterozygous for the C282Y mutation of the HFE gene do not appear to respond abnormally to dietary Fe and therefore do not need to change their diet to prevent accumulation of body Fe.

  14. The Future of Genetics in Psychology and Psychiatry: Microarrays, Genome-Wide Association, and Non-Coding RNA

    Science.gov (United States)

    Plomin, Robert; Davis, Oliver S. P.

    2009-01-01

    Background: Much of what we thought we knew about genetics needs to be modified in light of recent discoveries. What are the implications of these advances for identifying genes responsible for the high heritability of many behavioural disorders and dimensions in childhood? Methods: Although quantitative genetics such as twin studies will continue…

  15. An evaluation of the effectiveness of the EPA comply code to demonstrate compliance with radionuclide emission standards at three manufacturing facilities

    International Nuclear Information System (INIS)

    Smith, L.R.; Laferriere, J.R.; Nagy, J.W.

    1991-01-01

    Measurements of airborne radionuclide emissions and associated environmental concentrations were made at, and in the vicinity of, two urban and one suburban facility where radiolabeled chemicals for biomedical research and radiopharmaceuticals are manufactured. Emission, environmental and meteorological measurements were used in the EPA COMPLY code and in environmental assessment models developed specifically for these sites to compare their ability to predict off-site measurements. The models and code were then used to determine the potential dose to hypothetical maximally exposed receptors, and the ability of these methods to demonstrate whether these facilities comply with proposed radionuclide emission standards was assessed. In no case did the models and code seriously underestimate off-site impacts. However, for certain radionuclides and chemical forms, the EPA COMPLY code was found to overestimate off-site impacts by such a large factor as to render its value questionable for determining regulatory compliance. Recommendations are offered for changing the code to enable it to be more serviceable to radionuclide users and regulators.

  16. Paraguay; Report on the Observance of Standards and Codes: FATF Recommendations for Anti-Money Laundering and Combating the Financing of Terrorism

    OpenAIRE

    International Monetary Fund

    2009-01-01

    This paper discusses assessment results on the observance of standards and codes on the Financial Action Task Force (FATF) recommendations for antimoney laundering and combating the financing of terrorism (AML/CFT) for Paraguay. The assessment reveals that the substantial U.S. dollar contraband trade that occurs on the borders shared with Argentina and Brazil facilitates money laundering in Paraguay. Achievements in the implementation of Paraguay’s AML framework remain modest since the crimin...

  17. Activities of the Commission of the European Communities in the field of codes and standards for FBRs

    International Nuclear Information System (INIS)

    Terzaghi, A.

    1987-01-01

    A description is given of the organization set up by the Commission of the European Communities to study problems and to compare information within the member nations, and with other industrial nations, for the preparation of guides and codes for LMFBR components. Work performed and currently in progress on structural analysis, materials, and classification of components is summarized. (orig.)

  18. Codex general standard for irradiated foods and recommended international code of practice for the operation of radiation facilities used for the treatment of foods

    International Nuclear Information System (INIS)

    1990-06-01

    The FAO/WHO Codex Alimentarius Commission was established to implement the Joint FAO/WHO Food Standards Programme. The purpose of this programme is to protect the health of consumers and to ensure fair practices in the food trade. At its 15th session, held in July 1983, the Commission adopted a Codex General Standard for Irradiated Foods and a Recommended International Code of Practice for the Operation of Radiation Facilities used for the Treatment of Foods. This Standard takes into account the recommendations and conclusions of the Joint FAO/IAEA/WHO Expert Committees convened to evaluate all available data concerning the various aspects of food irradiation. This Standard refers only to those aspects which relate to the processing of foods by ionising energy. The Standard recognizes that the process of food irradiation has been established as safe for general application to an overall average level of absorbed dose of 10 kGy. The latter value should not be regarded as a toxicological upper limit above which irradiated foods become unsafe; it is simply the level at or below which safety has been established. The Standard provides certain mandatory provisions concerning the facilities used and for the control of the process in the irradiation plants. The present Standard requires that shipping documents accompanying irradiated foods moving in trade should indicate the fact of irradiation. The labelling of prepackaged irradiated foods intended for direct sale to the consumer is not covered in this Standard.

  19. Codex general standard for irradiated foods and recommended international code of practice for the operation of radiation facilities used for the treatment of foods

    International Nuclear Information System (INIS)

    1984-01-01

    The FAO/WHO Codex Alimentarius Commission was established to implement the Joint FAO/WHO Food Standards Programme. The purpose of this programme is to protect the health of consumers and to ensure fair practices in the food trade. At its 15th session, held in July 1983, the Commission adopted a Codex General Standard for Irradiated Foods and a Recommended International Code of Practice for the Operation of Radiation Facilities used for the Treatment of Foods. This Standard takes into account the recommendations and conclusions of the Joint FAO/IAEA/WHO Expert Committees convened to evaluate all available data concerning the various aspects of food irradiation. This Standard refers only to those aspects which relate to the processing of foods by ionising energy. The Standard recognizes that the process of food irradiation has been established as safe for general application to an overall average level of absorbed dose of 10 kGy. The latter value should not be regarded as a toxicological upper limit above which irradiated foods become unsafe; it is simply the level at or below which safety has been established. The Standard provides certain mandatory provisions concerning the facilities used and for the control of the process in the irradiation plants. The present Standard requires that shipping documents accompanying irradiated foods moving in trade should indicate the fact of irradiation. The labelling of prepackaged irradiated foods intended for direct sale to the consumer is not covered in this Standard

  20. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  1. Assessment of United States industry structural codes and standards for application to advanced nuclear power reactors: Appendices. Volume 2

    International Nuclear Information System (INIS)

    Adams, T.M.; Stevenson, J.D.

    1995-10-01

    Throughout its history, the USNRC has remained committed to the use of industry consensus standards for the design, construction, and licensing of commercial nuclear power facilities. The existing industry standards are based on the current class of light water reactors and as such may not adequately address design and construction features of the next generation of Advanced Light Water Reactors and other types of Advanced Reactors. As part of its on-going commitment to industry standards, the USNRC commissioned this study to evaluate US industry structural standards for application to Advanced Light Water Reactors and Advanced Reactors. The initial review effort included (1) the review and study of the relevant reactor design basis documentation for eight Advanced Light Water Reactor and Advanced Reactor designs, (2) the review of the USNRC's design requirements for advanced reactors, (3) the review of the latest revisions of the relevant industry consensus structural standards, and (4) the identification of the need for changes to these standards. The results of these studies were used to develop recommended changes to industry consensus structural standards which will be used in the construction of Advanced Light Water Reactors and Advanced Reactors. Over seventy sets of proposed standard changes were recommended and the need for the development of four new structural standards was identified. In addition to the recommended standard changes, several other sets of information and data were extracted for use by USNRC in other on-going programs. This information included (1) detailed observations on the response of structures and distribution system supports to the recent Northridge, California (1994) and Kobe, Japan (1995) earthquakes, (2) comparison of versions of certain standards cited in the standard review plan to the most current versions, and (3) comparison of the seismic and wind design basis for all the subject reactor designs.

  2. Absorbed dose determination in external beam radiotherapy. An international code of practice for dosimetry based on standards of absorbed dose to water

    International Nuclear Information System (INIS)

    2000-01-01

    The International Atomic Energy Agency published in 1987 an International Code of Practice entitled 'Absorbed Dose Determination in Photon and Electron Beams' (IAEA Technical Reports Series No. 277 (TRS-277)), recommending procedures to obtain the absorbed dose in water from measurements made with an ionization chamber in external beam radiotherapy. A second edition of TRS-277 was published in 1997 updating the dosimetry of photon beams, mainly kilovoltage X rays. Another International Code of Practice for radiotherapy dosimetry entitled 'The Use of Plane-Parallel Ionization Chambers in High Energy Electron and Photon Beams' (IAEA Technical Reports Series No. 381 (TRS-381)) was published in 1997 to further update TRS-277 and complement it with respect to the area of parallel-plate ionization chambers. Both codes have proven extremely valuable for users involved in the dosimetry of the radiation beams used in radiotherapy. In TRS-277 the calibration of the ionization chambers was based on primary standards of air kerma; this procedure was also used in TRS-381, but the new trend of calibrating ionization chambers directly in a water phantom in terms of absorbed dose to water was introduced. The development of primary standards of absorbed dose to water for high energy photon and electron beams, and improvements in radiation dosimetry concepts, offer the possibility of reducing the uncertainty in the dosimetry of radiotherapy beams. The dosimetry of kilovoltage X rays, as well as that of proton and heavy ion beams, interest in which has grown considerably in recent years, can also be based on these standards. Thus a coherent dosimetry system based on standards of absorbed dose to water is possible for practically all radiotherapy beams. Many Primary Standard Dosimetry Laboratories (PSDLs) already provide calibrations in terms of absorbed dose to water at the radiation quality of 60 Co gamma rays. Some laboratories have extended calibrations to high energy photon and

  3. Genic non-coding microsatellites in the rice genome: characterization, marker design and use in assessing genetic and evolutionary relationships among domesticated groups

    Directory of Open Access Journals (Sweden)

    Singh Nagendra

    2009-03-01

    Full Text Available Abstract Background Completely sequenced plant genomes provide scope for designing a large number of microsatellite markers, which are useful in various aspects of crop breeding and genetic analysis. With the objective of developing genic but non-coding microsatellite (GNMS) markers for the rice (Oryza sativa L.) genome, we characterized the frequency and relative distribution of microsatellite repeat-motifs in 18,935 predicted protein coding genes including 14,308 putative promoter sequences. Results We identified 19,555 perfect GNMS repeats with densities ranging from 306.7/Mb in chromosome 1 to 450/Mb in chromosome 12 with an average of 357.5 GNMS per Mb. The average microsatellite density was maximum in the 5' untranslated regions (UTRs followed by those in introns, promoters, 3'UTRs and minimum in the coding sequences (CDS. Primers were designed for 17,966 (92%) GNMS repeats, including 4,288 (94%) hypervariable class I types, which were bin-mapped on the rice genome. The GNMS markers were most polymorphic in the intronic region (73.3%) followed by markers in the promoter region (53.3%) and least in the CDS (26.6%). The robust polymerase chain reaction (PCR) amplification efficiency and high polymorphic potential of GNMS markers over genic coding and random genomic microsatellite markers suggest their immediate use in efficient genotyping applications in rice. A set of these markers could assess genetic diversity and establish phylogenetic relationships among domesticated rice cultivar groups. We also demonstrated the usefulness of orthologous and paralogous conserved non-coding microsatellite (CNMS) markers, identified in the putative rice promoter sequences, for comparative physical mapping and understanding of evolutionary and gene regulatory complexities among rice and other members of the grass family. The divergence between long-grained aromatics and subspecies japonica was estimated to be more recent (0.004 Mya) compared to short

  4. Comparison of European codes and standards on the welding of LMFBR components and proposals for their harmonization

    International Nuclear Information System (INIS)

    Koehler, S.

    1992-01-01

    A comparative study has been conducted within the framework of the exercise on comparison of specifications and standards for fast reactors, covering the following specialized fields: welding supervisors and welders; welders' tests; production test specimens of welds; and measures to prevent mistakes with weld material. The relevant specifications were forwarded by the national delegations: Germany, France, Italy and the United Kingdom. The comparison is presented in tabular form wherever rules for a particular sub-group of a specialized field are laid down in the standards of at least two Member States. In each case, the conclusions and requirements set out in the national standards have been compared in relation to a specific comparison criterion. The quantitative comparisons of the requirements laid down in the individual national standards are assessed from the following standpoints: (a) points of agreement between the regulations in the standards of all four Member States (Germany, France, the United Kingdom and Italy); (b) significant differences between the regulations. 13 tabs

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    International Nuclear Information System (INIS)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR.

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  8. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    International Nuclear Information System (INIS)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR

  9. Crystal structure prediction of flexible molecules using parallel genetic algorithms with a standard force field.

    Science.gov (United States)

    Kim, Seonah; Orendt, Anita M; Ferraro, Marta B; Facelli, Julio C

    2009-10-01

    This article describes the application of our distributed computing framework for crystal structure prediction (CSP), the modified genetic algorithms for crystal and cluster prediction (MGAC), to predict the crystal structure of flexible molecules using the general Amber force field (GAFF) and the CHARMM program. The MGAC distributed computing framework includes a series of tightly integrated computer programs for generating the molecule's force field, sampling crystal structures using a distributed parallel genetic algorithm, performing local energy minimization of the structures, and then classifying, sorting, and archiving the most relevant structures. Our results indicate that the method can consistently find the experimentally known crystal structures of flexible molecules, but the number of missing structures and the poor ranking observed in some crystals show the need for further improvement of the potential. Copyright 2009 Wiley Periodicals, Inc.
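
    MGAC couples a distributed parallel GA with GAFF/CHARMM energy evaluations; the sketch below shows only the general shape of such a real-coded GA loop over structural parameters, with a placeholder energy function standing in for the force-field minimization. Every name and parameter value is an assumption for illustration, not the MGAC implementation.

      import random

      def placeholder_energy(params):
          # Stand-in for a force-field lattice-energy evaluation
          # (GAFF/CHARMM in the actual MGAC framework).
          return sum((p - 1.0) ** 2 for p in params)

      def genetic_search(n_params=6, pop_size=30, generations=50, sigma=0.1, elite=2):
          # Minimal real-coded GA: truncation selection, blend crossover,
          # Gaussian mutation and elitism.
          pop = [[random.uniform(0.0, 2.0) for _ in range(n_params)]
                 for _ in range(pop_size)]
          for _ in range(generations):
              scored = sorted(pop, key=placeholder_energy)
              new_pop = scored[:elite]                  # keep the best structures
              while len(new_pop) < pop_size:
                  a, b = random.sample(scored[:pop_size // 2], 2)
                  child = [(x + y) / 2 + random.gauss(0, sigma) for x, y in zip(a, b)]
                  new_pop.append(child)
              pop = new_pop
          return min(pop, key=placeholder_energy)

      print(genetic_search())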

  10. Inventory of power plants in the United States. [By state within standard Federal Regions, using county codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    The purpose of this inventory of power plants is to provide a ready reference for planners whose focus is on the state, standard Federal region, and/or national level. Thus the inventory is compiled alphabetically by state within standard Federal regions. The units are listed alphabetically within electric utility systems which in turn are listed alphabetically within states. The locations are identified to county level according to the Federal Information Processing Standards Publication Counties and County Equivalents of the States of the United States. Data compiled include existing and projected electrical generation units, jointly owned units, and projected construction units.

  11. Laboratory diagnosis of creatine deficiency syndromes: a technical standard and guideline of the American College of Medical Genetics and Genomics.

    Science.gov (United States)

    Sharer, J Daniel; Bodamer, Olaf; Longo, Nicola; Tortorelli, Silvia; Wamelink, Mirjam M C; Young, Sarah

    2017-02-01

    Disclaimer: These ACMG Standards and Guidelines are intended as an educational resource for clinical laboratory geneticists to help them provide quality clinical laboratory genetic services. Adherence to these standards and guidelines is voluntary and does not necessarily assure a successful medical outcome. These Standards and Guidelines should not be considered inclusive of all proper procedures and tests or exclusive of others that are reasonably directed to obtaining the same results. In determining the propriety of any specific procedure or test, clinical laboratory geneticists should apply their professional judgment to the specific circumstances presented by the patient or specimen. Clinical laboratory geneticists are encouraged to document in the patient's record the rationale for the use of a particular procedure or test, whether or not it is in conformance with these Standards and Guidelines. They also are advised to take notice of the date any particular guideline was adopted, and to consider other relevant medical and scientific information that becomes available after that date. It also would be prudent to consider whether intellectual property interests may restrict the performance of certain tests and other procedures.Cerebral creatine deficiency syndromes are neurometabolic conditions characterized by intellectual disability, seizures, speech delay, and behavioral abnormalities. Several laboratory methods are available for preliminary and confirmatory diagnosis of these conditions, including measurement of creatine and related metabolites in biofluids using liquid chromatography-tandem mass spectrometry or gas chromatography-mass spectrometry, enzyme activity assays in cultured cells, and DNA sequence analysis. These guidelines are intended to standardize these procedures to help optimize the diagnosis of creatine deficiency syndromes. While biochemical methods are emphasized, considerations for confirmatory molecular testing are also discussed

  12. Pharmacogenetics of clopidogrel: comparison between a standard and a rapid genetic testing.

    Science.gov (United States)

    Saracini, Claudia; Vestrini, Anna; Galora, Silvia; Armillis, Alessandra; Abbate, Rosanna; Giusti, Betti

    2012-06-01

    CYP2C19 variant alleles are independent predictors of clopidogrel response variability and occurrence of major adverse cardiovascular events in high-risk vascular patients on clopidogrel therapy. Increasing evidence suggests a combination of platelet function testing with CYP2C19 genetic testing may be more effective in identifying high-risk individuals for alternative antiplatelet therapeutic strategies. A crucial point in evaluating the use of these polymorphisms in clinical practice, besides test accuracy, is the cost of the genetic test and rapid availability of the results. One hundred acute coronary syndrome patients were genotyped for CYP2C19*2,*3,*4,*5, and *17 polymorphisms with two platforms: Verigene(®) and the TaqMan(®) system. Genotyping results obtained by the classical TaqMan approach and the rapid Verigene approach showed a 100% concordance for all the five polymorphisms investigated. The Verigene system had shorter turnaround time with respect to TaqMan. The cost of reagents for TaqMan genotyping was lower than that for the Verigene system, but the effective manual staff involvement and the relative cost resulted in higher cost for TaqMan than for Verigene. The Verigene system demonstrated good performance in terms of turnaround time and cost for the evaluation of the clopidogrel poor metabolizer status, giving genetic information in suitable time (206 min) for a therapeutic strategy decision.

  13. Genetics

    International Nuclear Information System (INIS)

    Hubitschek, H.E.

    1975-01-01

    Progress is reported on the following research projects: genetic effects of high LET radiations; genetic regulation, alteration, and repair; chromosome replication and the division cycle of Escherichia coli; effects of radioisotope decay in the DNA of microorganisms; initiation and termination of DNA replication in Bacillus subtilis; mutagenesis in mouse myeloma cells; lethal and mutagenic effects of near-uv radiation; effect of 8-methoxypsoralen on photodynamic lethality and mutagenicity in Escherichia coli; DNA repair of the lethal effects of far-uv; and near uv irradiation of bacterial cells

  14. Assessment of genetic mutations in the XRCC2 coding region by high resolution melting curve analysis and the risk of differentiated thyroid carcinoma in Iran

    Directory of Open Access Journals (Sweden)

    Shima Fayaz

    2012-01-01

    Full Text Available Homologous recombination (HR) is the major pathway for repairing double strand breaks (DSBs) in eukaryotes, and XRCC2 is an essential component of the HR repair machinery. To evaluate the potential role of mutations in gene repair by HR in individuals susceptible to differentiated thyroid carcinoma (DTC), we used high resolution melting (HRM) analysis, a recently introduced method for detecting mutations, to examine the entire XRCC2 coding region in an Iranian population. HRM analysis was used to screen for mutations in three XRCC2 coding regions in 50 patients and 50 controls. There was no variation in the HRM curves obtained from the analysis of exons 1 and 2 in the case and control groups. In exon 3, an Arg188His polymorphism (rs3218536) was detected as a new melting curve group (OR: 1.46; 95% CI: 0.432-4.969; p = 0.38) compared with the normal melting curve. We also found a new Ser150Arg polymorphism in exon 3 of the control group. These findings suggest that genetic variations in the XRCC2 coding region have no potential effects on susceptibility to DTC. However, further studies with larger populations are required to confirm this conclusion.

  15. Genetics

    DEFF Research Database (Denmark)

    Christensen, Kaare; McGue, Matt

    2016-01-01

    The sequenced genomes of individuals aged ≥80 years, who were highly educated, self-referred volunteers and with no self-reported chronic diseases were compared to young controls. In these data, healthy ageing is a distinct phenotype from exceptional longevity and genetic factors that protect...

  16. Use of PRIM code to analyze potential radiation-induced genetic and somatic effects to man from Jackpile-Paguate mines

    International Nuclear Information System (INIS)

    Momeni, M.H.

    1983-01-01

    Potential radiation-induced effects from inhalation and ingestion of, and external exposure to, radioactive materials at the Jackpile-Paguate uranium mine complex near Paguate, New Mexico, were analyzed. The Uranium Dispersion and Dosimetry (UDAD) computer code developed at Argonne National Laboratory was used to calculate the dose rates and the time-integrated doses to tissues at risk as a function of age and time for the population within 80 km of the mines. The ANL computer code Potential Radiation-Induced Biological Effects on Man (PRIM) then was used to calculate the potential radiation-induced somatic and genetic effects among the same population on the basis of absolute and relative risk models as a function of duration of exposure and age at time of exposure. The analyses were based on the recommendations in BEIR II and WASH-1400 and the lifetable method. The death rates were calculated for radiation exposure from the mines and for naturally induced effects for 19 age cohorts, 20 time intervals, and for each sex. The results indicated that under present conditions of the radiation environment at the mines, the number of potential fatal radiation-induced neoplasms that could occur among the regional population over the next 85 years would be 95 using the absolute risk model, and 243 using the relative risk model. Over the same period, there would be less than two radiation-induced genetic effects (dominant and multifactorial). After decommissioning of the mine site, these risks would decrease to less than 1 and less than 3 potential radiation-induced deaths under the relative and absolute risk models, respectively, and 0.001 genetic disorders. Because of various sources of error, the uncertainty in these predicted risks could be a factor of five.
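
    The difference between the absolute (additive) and relative (multiplicative) risk models referred to above can be sketched as follows; the baseline rate, risk coefficients and dose are invented numbers, and the full lifetable bookkeeping of PRIM (age cohorts, time intervals, both sexes) is not reproduced.

      def rate_absolute_model(baseline_rate, dose_sv, abs_coeff):
          # Absolute risk model: excess cases per person-year are added to the
          # baseline rate in proportion to dose (abs_coeff in cases/person-year/Sv).
          return baseline_rate + abs_coeff * dose_sv

      def rate_relative_model(baseline_rate, dose_sv, err_per_sv):
          # Relative risk model: the baseline rate is multiplied by
          # (1 + excess relative risk per Sv * dose).
          return baseline_rate * (1.0 + err_per_sv * dose_sv)

      # Illustrative numbers only (not from the Jackpile-Paguate assessment)
      baseline, dose = 2.0e-3, 0.01
      print(rate_absolute_model(baseline, dose, abs_coeff=5.0e-4),
            rate_relative_model(baseline, dose, err_per_sv=0.5))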

  17. The Poitiers School of Mathematical and Theoretical Biology: Besson-Gavaudan-Schützenberger's Conjectures on Genetic Code and RNA Structures.

    Science.gov (United States)

    Demongeot, J; Hazgui, H

    2016-12-01

    The French school of theoretical biology was largely initiated in Poitiers during the 1960s by scientists such as J. Besson, G. Bouligand, P. Gavaudan, M. P. Schützenberger and R. Thom, launching many new research domains on fractal dimension, the combinatorial properties of the genetic code and the related amino acids, as well as on the genetic regulation of biological processes. It is now known that RNA molecules are often involved in the regulation of complex genetic networks as effectors, e.g., activators (small RNAs acting as transcription factors), inhibitors (micro-RNAs) or hybrids (circular RNAs). Examples of such networks are given showing that (1) there exist RNA "relics" that have played an important role during evolution and have survived in many genomes, and whose sub-sequence probability distributions can be quantified by the Shannon entropy, and (2) the robustness of the dynamics of the networks they regulate can be characterized by the Kolmogorov-Sinaï dynamic entropy and the attractor entropy.
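
    The Shannon entropy of sub-sequence distributions mentioned in point (1) can be computed in a few lines; the k-mer length and the toy sequence below are arbitrary choices for illustration and are not taken from the paper.

      from collections import Counter
      from math import log2

      def kmer_shannon_entropy(seq, k=3):
          # Shannon entropy (in bits) of the k-mer frequency distribution of
          # an RNA sequence, as a simple measure of sub-sequence diversity.
          kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
          counts = Counter(kmers)
          total = sum(counts.values())
          return -sum((c / total) * log2(c / total) for c in counts.values())

      print(kmer_shannon_entropy("AUGGCUACGGAUUACGGCUAGCUAGGCA", k=3))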

  18. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  19. Input data preparation and simulation of the second standard problem of IAEA using the Trac/PF1 code

    International Nuclear Information System (INIS)

    Madeira, A.A.; Pontedeiro, A.C.; Silva Galetti, M.R. da; Borges, R.C.

    1989-10-01

    The second Standard Problem sponsored by the IAEA consists of the simulation of a small LOCA located in the downcomer of the PMK-NVH integral test facility, which models a WWER/440-type reactor. This report presents the input data preparation and a comparison between the TRAC-PF1 results and the experimental measurements. (author) [pt

  20. Comparison between Genetic Algorithms and Particle Swarm Optimization Methods on Standard Test Functions and Machine Design

    DEFF Research Database (Denmark)

    Nica, Florin Valentin Traian; Ritchie, Ewen; Leban, Krisztina Monika

    2013-01-01

    Nowadays the requirements imposed by industry and the economy ask for better quality and performance while the price must be maintained in the same range. To achieve this goal, optimization must be introduced in the design process. Two of the best known optimization algorithms for machine design, the genetic algorithm and particle swarm optimization, are briefly presented in this paper. These two algorithms are tested to determine their performance on five different benchmark test functions. The algorithms are compared based on three requirements: precision of the result, number of iterations and calculation time. Both algorithms are also tested on an analytical design process of a Transverse Flux Permanent Magnet Generator to observe their performance in an electrical machine design application.

  1. Uruguay; Report on Observance of Standards and Codes-Data Module and the Response by the Authorities

    OpenAIRE

    International Monetary Fund

    2001-01-01

    The paper provides a summary of Uruguay's practices with respect to the coverage, periodicity, and timeliness of the Special Data Dissemination Standard (SDDS) data categories, and an assessment of the quality of national accounts, prices, fiscal, monetary and financial, and external sector statistics. Uruguay has made good progress recently in improving the dissemination of statistical information. The Internet pages of the Central Bank of Uruguay (BCU) and the National Institute of Statisti...

  2. Development of an accident consequence assessment code for evaluating site suitability of light- and heavy-water reactors based on the Korean Technical standards

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Won Tae; Jeong, Hae Sung; Jeong, Hyo Joon; Kil, A Reum; Kim, Eun Han; Han, Moon Hee [Nuclear Environment Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

    Methodologies for radiological consequence assessment differ distinctly according to the design principles of the original nuclear suppliers and the technical standards to be imposed. This is due to uncertainties in the accidental source term, radionuclide behavior in the environment, and the subsequent radiological dose. Both PWR and PHWR reactor types are operated in Korea. However, the technical standards for evaluating atmospheric dispersion have been enacted based on the U.S. NRC's positions regardless of reactor type, which may cause controversy between the licensor and licensee of a nuclear power plant. An integrated accident consequence assessment code, ACCESS (Accident Consequence Assessment Code for Evaluating Site Suitability), was therefore developed, taking into account the regulatory positions specific to each reactor type under the framework of the current Korean technical standards. The code was modelled under the framework of NRC Regulatory Guide 1.145 for light-water reactors, while reflecting the features of heavy-water reactors as specified in the Canadian National Standard and the modelling features of MACCS2, such as the atmospheric diffusion coefficient, ground deposition, surface roughness, radioactive plume depletion, and exposure from ground deposition. Field tracer experiments and hand calculations have been carried out for validation and verification of the models. The modelling approaches of ACCESS and its features are introduced, and its results for a hypothetical accident scenario are comprehensively discussed. In the application study, the total doses predicted by the light-water reactor assessment model were higher than those predicted by the other models.
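
    Atmospheric dispersion calculations of this kind are commonly built on a Gaussian plume formulation; the sketch below shows only that generic ground-level form, not the specific Regulatory Guide 1.145 or MACCS2 models implemented in ACCESS, and the dispersion parameters and release values are user-supplied assumptions.

      from math import pi, exp

      def ground_level_concentration(q_rel, u_wind, sigma_y, sigma_z, y_off=0.0, h_eff=0.0):
          # Generic Gaussian plume estimate of ground-level air concentration
          # (per unit release rate) at crosswind offset y_off, for an
          # effective release height h_eff, including ground reflection.
          return (q_rel / (pi * sigma_y * sigma_z * u_wind)
                  * exp(-y_off ** 2 / (2.0 * sigma_y ** 2))
                  * exp(-h_eff ** 2 / (2.0 * sigma_z ** 2)))

      # Illustrative values only: unit release rate, 2 m/s wind,
      # assumed sigma_y/sigma_z at some downwind distance.
      print(ground_level_concentration(q_rel=1.0, u_wind=2.0, sigma_y=30.0, sigma_z=15.0))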

  3. Optimization of traceable coaxial RF reflection standards with 7-mm-N-connector using genetic algorithms

    Directory of Open Access Journals (Sweden)

    T. Schrader

    2003-01-01

    Full Text Available A new coaxial device with 7-mm-N-connector was developed providing calculable complex reflection coefficients for traceable calibration of vector network analyzers (VNA. It was specifically designed to fill the gap between 0 Hz (DC, direct current and 250MHz, though the device was tested up to 10GHz. The frequency dependent reflection coefficient of this device can be described by a model, which is characterized by traceable measurements. It is therefore regarded as a “traceable model". The new idea of using such models for traceability has been verified, found to be valid and was used for these investigations. The DC resistance value was extracted from RF measurements up to 10 GHz by means of Genetic Algorithms (GA. The GA was used to obtain the elements of the model describing the reflection coefficient Γ of a network of SMD resistors. The DC values determined with the GA from RF measurements match the traceable value at DC within 3·10-3, which is in good agreement with measurements using reference air lines at GHz frequencies.

  4. The Genetic Intractability Of Symbiodinium microadriaticum To Standard Algal Transformation Methods

    KAUST Repository

    Chen, Jit Ern

    2017-05-23

    Modern transformation and genome editing techniques have shown great success across a broad variety of organisms. However, no study of successfully applied genome editing has been reported in a dinoflagellate despite the first genetic transformation of Symbiodinium being published about 20 years ago. Using an array of different available transformation techniques, we attempted to transform Symbiodinium microadriaticum (CCMP2467), a dinoflagellate symbiont of reef-building corals, in order to perform CRISPR-Cas9 mediated genome editing. Plasmid vectors containing the chloramphenicol resistance gene under the control of the CaMV p35S promoter as well as several putative endogenous promoters were used to test a variety of transformation techniques including biolistics, electroporation, silica whiskers and glass bead agitation. We report that we have been unable to confer chloramphenicol resistance to our specific Symbiodinium strain. These results are intended to provide other researchers with an overview of previously attempted techniques and sequences in order to support efficient planning of future experiments in this important field.

  5. Social Welfare Improvement by TCSC using Real Code Based Genetic Algorithm in Double-Sided Auction Market

    Directory of Open Access Journals (Sweden)

    MASOUM, M. A. S.

    2011-05-01

    Full Text Available This paper presents a genetic algorithm (GA) to maximize total system social welfare and alleviate congestion by the best placement and sizing of a TCSC device in a double-sided auction market. To introduce more accurate modeling, valve-point loading effects are incorporated into the conventional smooth quadratic generator cost curves. By adding the valve-point effect, the model presents nondifferentiable and nonconvex regions that challenge most gradient-based optimization algorithms. In addition, quadratic consumer benefit functions are integrated into the objective function to guarantee that the locational marginal prices charged at the demand buses are less than or equal to the benefit DisCos earn by selling that power to retail customers. The proposed approach uses the genetic algorithm to optimally schedule GenCos and DisCos and to determine the TCSC location and size, while the Newton-Raphson algorithm minimizes the mismatch of the power flow equations. Simulation results on the modified IEEE 14-bus and 30-bus test systems (with/without line flow constraints, before and after compensation) are used to examine the impact of the TCSC on total system social welfare improvement. Several cases are considered to test and validate the consistency of detecting the best solutions. Simulation results are compared to solutions obtained by sequential quadratic programming (SQP) approaches.
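    As a minimal sketch of the valve-point loading term mentioned above, the function below adds the usual rectified-sine component to a quadratic generator cost curve; the coefficient values are hypothetical and not taken from the IEEE test systems.

```python
import math

def generator_cost(p, a, b, c, e, f, p_min):
    """Quadratic fuel cost with a valve-point loading term ($/h).

    The |e*sin(f*(p_min - p))| term makes the curve non-smooth and
    non-convex, which is why gradient-based solvers struggle with it.
    """
    return a + b * p + c * p ** 2 + abs(e * math.sin(f * (p_min - p)))

# Hypothetical coefficients for a single generator producing 150 MW.
print(generator_cost(150.0, a=100.0, b=2.45, c=0.0016, e=50.0, f=0.063, p_min=50.0))
```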

  6. A method to optimize the shield compact and lightweight combining the structure with components together by genetic algorithm and MCNP code.

    Science.gov (United States)

    Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao

    2018-05-17

    To optimize a compact and lightweight shield for neutrons and gamma rays, a method combining the shield structure and its component materials was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235U, which mixes neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.
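    The sketch below shows, under stated assumptions, how such an optimization loop might be wired: a real-coded GA searches over layer thicknesses while the MCNP transport calculation is replaced by a crude exponential-attenuation placeholder. Layer materials, attenuation coefficients, the dose target and the weighting between mass and dose are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical layer materials: (name, density g/cm^3, effective attenuation 1/cm)
layers = [("polyethylene", 0.95, 0.10), ("lead", 11.3, 0.60), ("borated PE", 1.0, 0.12)]
DOSE_LIMIT = 1e-3     # arbitrary relative dose target

def transport_stub(thicknesses_cm):
    """Placeholder for an MCNP run: crude exponential attenuation estimate."""
    mu_t = sum(mu * t for (_, _, mu), t in zip(layers, thicknesses_cm))
    return np.exp(-mu_t)

def mass_per_area(thicknesses_cm):
    """Areal mass of the layered shield (g/cm^2)."""
    return sum(rho * t for (_, rho, _), t in zip(layers, thicknesses_cm))

def fitness(thicknesses_cm):
    dose = transport_stub(thicknesses_cm)
    penalty = 1e6 * max(0.0, dose - DOSE_LIMIT)        # enforce the dose target
    return -(mass_per_area(thicknesses_cm) + penalty)  # lighter is better

# Simple real-coded GA over the three layer thicknesses (0..30 cm each).
pop = rng.uniform(0.0, 30.0, size=(40, 3))
for _ in range(300):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]                               # truncation selection
    children = parents[rng.integers(0, 20, size=(40, 3)), np.arange(3)]   # uniform crossover
    children += rng.normal(scale=0.5, size=children.shape)                # mutation
    pop = np.clip(children, 0.0, 30.0)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("layer thicknesses (cm):", np.round(best, 2))
```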

  7. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    Science.gov (United States)

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

    One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as of the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection corresponding to approximately 1-10 copies of the rice haploid genome. In this quantitative PCR assay, the squared regression coefficients (R²) ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify these five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.
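    For context, the routine copy-number arithmetic behind plasmid calibrators such as pUC-RICE5 is sketched below, using the common approximation of ~650 g/mol per base pair of double-stranded DNA; the plasmid size and input mass are hypothetical, not values from the paper.

```python
AVOGADRO = 6.022e23
AVG_BP_MASS = 650.0           # g/mol per base pair (double-stranded DNA, approximate)

def plasmid_copies(mass_ng, plasmid_size_bp):
    """Number of copies of a plasmid calibrator contained in a given mass of DNA."""
    moles = (mass_ng * 1e-9) / (plasmid_size_bp * AVG_BP_MASS)
    return moles * AVOGADRO

# Hypothetical example: 0.01 ng of a 5,000 bp reference plasmid.
print(f"{plasmid_copies(0.01, 5000):,.0f} copies")
```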

  8. Scatter-Reducing Sounding Filtration Using a Genetic Algorithm and Mean Monthly Standard Deviation

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    Retrieval algorithms like that used by the Orbiting Carbon Observatory (OCO)-2 mission generate massive quantities of data of varying quality and reliability. A computationally efficient, simple method of labeling problematic datapoints or predicting soundings that will fail is required for basic operation, given that only 6% of the retrieved data may be operationally processed. This method automatically obtains a filter designed to reduce scatter based on a small number of input features. Most machine-learning filter construction algorithms attempt to predict error in the CO2 value. By using a surrogate goal of Mean Monthly STDEV, the goal is to reduce the retrieved CO2 scatter rather than solving the harder problem of reducing CO2 error. This lends itself to improved interpretability and performance. This software reduces the scatter of retrieved CO2 values globally based on a minimum number of input features. It can be used as a prefilter to reduce the number of soundings requested, or as a post-filter to label data quality. The use of the MMS (Mean Monthly Standard deviation) provides a much cleaner, clearer filter than the standard ABS(CO2-truth) metrics previously employed by competitor methods. The software's main strength lies in a clearer (i.e., fewer features required) filter that more efficiently reduces scatter in retrieved CO2 rather than focusing on the more complex (and easily removed) bias issues.
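    A minimal sketch of the surrogate metric described above: group retrieved CO2 values by calendar month (and, as an additional assumption here, by a coarse latitude bin), take the standard deviation within each group, and average those values; a candidate filter is then judged by how much it lowers this mean monthly standard deviation. The data and grouping below are illustrative, not the OCO-2 implementation.

```python
import numpy as np
import pandas as pd

def mean_monthly_std(df, value_col="xco2", month_col="month", bin_col="lat_bin"):
    """Mean of the per-(month, latitude-bin) standard deviations of retrieved CO2."""
    return df.groupby([month_col, bin_col])[value_col].std().mean()

# Illustrative synthetic soundings.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "xco2": 400 + rng.normal(0, 1.5, 5000),
    "month": rng.integers(1, 13, 5000),
    "lat_bin": rng.integers(0, 18, 5000),        # 10-degree latitude bins
    "quality_feature": rng.uniform(0, 1, 5000),  # stand-in retrieval diagnostic
})

baseline = mean_monthly_std(df)
filtered = mean_monthly_std(df[df["quality_feature"] < 0.8])  # candidate filter threshold
print(f"MMS before filter: {baseline:.3f} ppm, after filter: {filtered:.3f} ppm")
```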

  9. Incorporating personalized gene sequence variants, molecular genetics knowledge, and health knowledge into an EHR prototype based on the Continuity of Care Record standard

    Science.gov (United States)

    Jing, Xia; Kay, Stephen; Marley, Tom; Hardiker, Nicholas R.; Cimino, James J.

    2011-01-01

    Summary. Objectives: The current volume and complexity of genetic tests, and the molecular genetics knowledge and health knowledge related to interpretation of the results of those tests, are rapidly outstripping the ability of individual clinicians to recall, understand and convey to their patients information relevant to their care. The tailoring of molecular genetics knowledge and health knowledge in clinical settings is important both for the provision of personalized medicine and to reduce clinician information overload. In this paper we describe the incorporation, customization and demonstration of molecular genetic data (mainly sequence variants), molecular genetics knowledge and health knowledge into a standards-based electronic health record (EHR) prototype developed specifically for this study. Methods: We extended the CCR (Continuity of Care Record), an existing EHR standard for representing clinical data, to include molecular genetic data. An EHR prototype was built based on the extended CCR and designed to display relevant molecular genetics knowledge and health knowledge from an existing knowledge base for cystic fibrosis (OntoKBCF). We reconstructed test records from published case reports and represented them in the CCR schema. We then used the EHR to dynamically filter molecular genetics knowledge and health knowledge from OntoKBCF using molecular genetic data and clinical data from the test cases. Results: The molecular genetic data were successfully incorporated in the CCR by creating a category of laboratory results called "Molecular Genetics" and specifying a particular class of test ("Gene Mutation Test") in this category. Unlike other laboratory tests reported in the CCR, results of tests in this class required additional attributes ("Molecular Structure" and "Molecular Position") to support interpretation by clinicians. These results, along with clinical data (age, sex, ethnicity, diagnostic procedures, and therapies) were used
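    A hedged sketch of what such an extended laboratory-result entry could look like when built programmatically; the element names mirror the categories and attributes described above but are illustrative only and are not the literal ASTM CCR schema tags.

```python
import xml.etree.ElementTree as ET

# Illustrative only: element names mirror the categories described in the
# abstract ("Molecular Genetics", "Gene Mutation Test", "Molecular Structure",
# "Molecular Position"); they are not the actual CCR schema tags.
result = ET.Element("Result")
ET.SubElement(result, "Category").text = "Molecular Genetics"
test = ET.SubElement(result, "Test")
ET.SubElement(test, "Type").text = "Gene Mutation Test"
ET.SubElement(test, "Gene").text = "CFTR"                        # cystic fibrosis example
ET.SubElement(test, "MolecularStructure").text = "c.1521_1523delCTT (F508del)"
ET.SubElement(test, "MolecularPosition").text = "codon 508"

print(ET.tostring(result, encoding="unicode"))
```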

  10. National standards and code compliance for electrical equipment and instruments installed in hazardous locations for the cone penetrometer

    International Nuclear Information System (INIS)

    Bussell, J.H.

    1996-03-01

    The cone penetrometer is designed to measure the material properties of waste tank contents at the Hanford Site. The penetrometer system consists of a skid-mounted assembly, a penetrometer assembly (composed of a guide tube and a push rod), an active neutron moisture measurement probe, a decontamination unit, and a support trailer containing a diesel-engine-driven hydraulic pump and a generator. The skid-mounted assembly is about 8 feet wide by 23 feet long and 15 feet high. Its nominal weight is about 40,000 pounds, with provisions to add up to 54,500 pounds of additional ballast. This document describes the cone penetrometer electrical instruments and how they comply with national standards.

  11. The calculation of wall and non-uniformity correction factors for the BIPM air-kerma standard for 60Co using the Monte Carlo code PENELOPE

    International Nuclear Information System (INIS)

    Burns, D.T.

    2002-01-01

    Traditionally, the correction factor k_wall for attenuation and scatter in the walls of cavity ionization chamber primary standards has been evaluated experimentally using an extrapolation method. During the past decade, there have been a number of Monte Carlo calculations of k_wall indicating that for certain ionization chamber types the extrapolation method may not be valid. In particular, values for k_wall have been proposed that, if adopted by each laboratory concerned, would have a significant effect on the results of international comparisons of air-kerma primary standards. The calculations have also proposed new values for the axial component k_an of the point-source uniformity correction. Central to the results of international comparisons is the BIPM air-kerma standard. Unlike most others, the BIPM standard is of the parallel-plate design for which the extrapolation method for evaluating k_wall should be valid. The value in use at present is k_wall = 1.0026 (standard uncertainty 0.0008). Rogers and Treurniet calculated the value k_wall = 1.0014 for the BIPM standard, which is in moderate agreement with the value in use (no overall uncertainty was given). However, they also calculated k_an = 1.0024 (statistical uncertainty 0.0003), which is very different from the value k_an = 0.9964 (0.0007) in use at present for the BIPM standard. A new 60Co facility has recently been installed at the BIPM and the opportunity was taken to re-evaluate the correction factors for the BIPM standard in this new beam. Given that almost all of the Monte Carlo work to date has used the EGS Monte Carlo code, it was decided to use the code PENELOPE. The new source, container, head and collimating jaws were simulated in detail, with more than fifty components being modelled. This model was used to create a phase-space file in the plane 90 cm from the source. The normalized distribution of photon number with energy is given, where the various sources of scattered photons are

  12. An expanded genetic code for probing the role of electrostatics in enzyme catalysis by vibrational Stark spectroscopy.

    Science.gov (United States)

    Völler, Jan-Stefan; Biava, Hernan; Hildebrandt, Peter; Budisa, Nediljko

    2017-11-01

    To find experimental validation for electrostatic interactions essential for catalytic reactions represents a challenge due to practical limitations in assessing electric fields within protein structures. This review examines the applications of non-canonical amino acids (ncAAs) as genetically encoded probes for studying the role of electrostatic interactions in enzyme catalysis. ncAAs constitute sensitive spectroscopic probes to detect local electric fields by exploiting the vibrational Stark effect (VSE) and thus have the potential to map protein electrostatics. Mapping the electrostatics in proteins will improve our understanding of natural catalytic processes and, beyond that, will be helpful for biocatalyst engineering. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments", Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. A Real-Coded Genetic Algorithm with System Reduction and Restoration for Rapid and Reliable Power Flow Solution of Power Systems

    Directory of Open Access Journals (Sweden)

    Hassan Abdullah Kubba

    2015-05-01

    Full Text Available The paper presents a highly accurate power flow solution that reduces the possibility of ending at local minima by using a Real-Coded Genetic Algorithm (RCGA) with system reduction and restoration. The proposed method reduces the total computing time by shrinking the system to the generator buses, which for any realistic system are fewer in number, while the load buses are eliminated. The power flow problem is then solved for the generator buses only, with the real-coded GA calculating the voltage phase angles while the voltage magnitudes are specified, which further reduces the computation time. The system is then restored by calculating the voltages of the load buses in terms of the calculated voltages of the generator buses, using equations derived for the load busbars. The proposed method was demonstrated on the IEEE 14-bus test system and on the practical 362-busbar Iraqi National Grid (ING). The proposed method has reliable convergence, a highly accurate solution, and low computing time for on-line applications. The method can conveniently be applied to on-line analysis and planning studies of large power systems.
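    As a sketch of the objective such a real-coded GA can minimize, the snippet below computes active-power mismatches from the standard power flow injection equation on a toy 3-bus system with hypothetical line admittances; it illustrates the fitness evaluation only, not the paper's reduction and restoration scheme.

```python
import numpy as np

def active_power_injections(v_mag, v_ang, ybus):
    """P_i = sum_k |V_i||V_k| (G_ik cos(th_i - th_k) + B_ik sin(th_i - th_k))."""
    g, b = ybus.real, ybus.imag
    dth = v_ang[:, None] - v_ang[None, :]
    return (v_mag[:, None] * v_mag[None, :] *
            (g * np.cos(dth) + b * np.sin(dth))).sum(axis=1)

def fitness(v_ang, v_mag, ybus, p_scheduled):
    """Negative sum of squared active-power mismatches (a GA would maximize this)."""
    mismatch = p_scheduled - active_power_injections(v_mag, v_ang, ybus)
    return -np.sum(mismatch ** 2)

# Toy 3-bus example with hypothetical line admittances (per unit).
y12, y13, y23 = 1 - 5j, 2 - 8j, 1.5 - 6j
ybus = np.array([[y12 + y13, -y12, -y13],
                 [-y12, y12 + y23, -y23],
                 [-y13, -y23, y13 + y23]])
v_mag = np.array([1.05, 1.02, 1.00])           # specified voltage magnitudes
p_scheduled = np.array([0.0, 0.5, -0.8])       # net injections (slack value unused)
print(fitness(np.array([0.0, -0.02, -0.05]), v_mag, ybus, p_scheduled))
```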

  14. Genetic Predictions of Prion Disease Susceptibility in Carnivore Species Based on Variability of the Prion Gene Coding Region

    Science.gov (United States)

    Stewart, Paula; Campbell, Lauren; Skogtvedt, Susan; Griffin, Karen A.; Arnemo, Jon M.; Tryland, Morten; Girling, Simon; Miller, Michael W.; Tranulis, Michael A.; Goldmann, Wilfred

    2012-01-01

    Mammalian species vary widely in their apparent susceptibility to prion diseases. For example, several felid species developed prion disease (feline spongiform encephalopathy or FSE) during the bovine spongiform encephalopathy (BSE) epidemic in the United Kingdom, whereas no canine BSE cases were detected. Whether either of these or other groups of carnivore species can contract other prion diseases (e.g. chronic wasting disease or CWD) remains an open question. Variation in the host-encoded prion protein (PrPC) largely explains observed disease susceptibility patterns within ruminant species, and may explain interspecies differences in susceptibility as well. We sequenced and compared the open reading frame of the PRNP gene encoding PrPC protein from 609 animal samples comprising 29 species from 22 genera of the Order Carnivora; amongst these samples were 15 FSE cases. Our analysis revealed that FSE cases did not encode an identifiable disease-associated PrP polymorphism. However, all canid PrPs contained aspartic acid or glutamic acid at codon 163 which we propose provides a genetic basis for observed susceptibility differences between canids and felids. Among other carnivores studied, wolverine (Gulo gulo) and pine marten (Martes martes) were the only non-canid species to also express PrP-Asp163, which may affect their prion disease susceptibility. Populations of black bear (Ursus americanus) and mountain lion (Puma concolor) from Colorado showed little genetic variation in the PrP protein and no variants likely to be highly resistant to prions in general, suggesting that strain differences between BSE and CWD prions also may contribute to the limited apparent host range of the latter. PMID:23236380

  15. Genetic predictions of prion disease susceptibility in carnivore species based on variability of the prion gene coding region.

    Directory of Open Access Journals (Sweden)

    Paula Stewart

    Full Text Available Mammalian species vary widely in their apparent susceptibility to prion diseases. For example, several felid species developed prion disease (feline spongiform encephalopathy or FSE) during the bovine spongiform encephalopathy (BSE) epidemic in the United Kingdom, whereas no canine BSE cases were detected. Whether either of these or other groups of carnivore species can contract other prion diseases (e.g. chronic wasting disease or CWD) remains an open question. Variation in the host-encoded prion protein (PrPC) largely explains observed disease susceptibility patterns within ruminant species, and may explain interspecies differences in susceptibility as well. We sequenced and compared the open reading frame of the PRNP gene encoding the PrPC protein from 609 animal samples comprising 29 species from 22 genera of the Order Carnivora; amongst these samples were 15 FSE cases. Our analysis revealed that FSE cases did not encode an identifiable disease-associated PrP polymorphism. However, all canid PrPs contained aspartic acid or glutamic acid at codon 163, which we propose provides a genetic basis for observed susceptibility differences between canids and felids. Among other carnivores studied, wolverine (Gulo gulo) and pine marten (Martes martes) were the only non-canid species to also express PrP-Asp163, which may affect their prion disease susceptibility. Populations of black bear (Ursus americanus) and mountain lion (Puma concolor) from Colorado showed little genetic variation in the PrP protein and no variants likely to be highly resistant to prions in general, suggesting that strain differences between BSE and CWD prions also may contribute to the limited apparent host range of the latter.

  16. The African Lupus Genetics Network (ALUGEN) registry: standardized, prospective follow-up studies in African patients with systemic lupus erythematosus.

    Science.gov (United States)

    Hodkinson, B; Mapiye, D; Jayne, D; Kalla, A; Tiffin, N; Okpechi, I

    2016-03-01

    The prevalence and severity of systemic lupus erythematosus (SLE) differs between ethnic groups and geographical regions. Although initially reported as rare, there is growing evidence that SLE is prevalent and runs a severe course in Africa. There is a paucity of prospective studies on African SLE patients. The African Lupus Genetics Network (ALUGEN) is a multicentred framework seeking to prospectively assess outcomes in SLE patients in Africa. Outcomes measured will be death, hospital admission, disease activity flares, and SLE-related damage. We will explore predictors for these outcomes including clinical, serological, socio-demographic, therapeutic and genetic factors. Further, we will investigate comorbidities and health-related quality of life amongst these patients. Data of patients recently (≤ 5 yrs) diagnosed with SLE will be collected at baseline and annual follow-up visits, and captured electronically. The ALUGEN project will facilitate standardized data capture for SLE cases in Africa, allowing participating centres to develop their own SLE registries, and enabling collaboration to enrich our understanding of inter-ethnic and regional variations in disease expression. Comprehensive, high-quality multi-ethnic data on African SLE patients will expand knowledge of the disease and inform clinical practice, in addition to augmenting research capacity and networking links and providing a platform for future biomarker and interventional studies. © The Author(s) 2015.

  17. Construction of a Bacterial Artificial Chromosome Library of TM-1, a Standard Line for Genetics and Genomics in Upland Cotton

    Institute of Scientific and Technical Information of China (English)

    Yan Hu; Wang-Zhen Guo; Tian-Zhen Zhang

    2009-01-01

    A bacterial artificial chromosome (BAC) library was constructed for Gossypium hirsutum acc. TM-1, a genetic and genomic standard line for Upland cotton. The library consists of 147,456 clones with an average insert size of 122.8 kb, ranging from 97 to 240 kb. About 96.0% of the clones have inserts over 100 kb. The library therefore theoretically represents 7.4 haploid genome equivalents, based on an AD genome size of 2,425 Mb. Clones were stored in 384-well plates and arrayed into multiplex pools for rapid and reliable library screening. BAC screening was carried out by four-round polymerase chain reactions using 23 simple sequence repeat (SSR) markers, three sequence-related amplified polymorphism markers, and one pair of primers for a gene associated with fiber development to test the quality of the library. In total, 92 positive BAC clones were identified, with an average of four positive clones per SSR marker, ranging from one to eight hits. Additionally, since these SSR markers have been localized to chromosomes 12 (A12) and 26 (D12) according to the genetic map, these BAC clones are expected to serve as seeds for the physical mapping of these two homologous chromosomes and, subsequently, for map-based cloning of quantitative trait loci or genes associated with important agronomic traits.
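    The reported genome coverage follows from simple arithmetic on the figures above:

    \[
    \text{coverage} \approx \frac{147\,456 \times 122.8\ \text{kb}}{2\,425\ \text{Mb}} \approx \frac{1.81 \times 10^{4}\ \text{Mb}}{2.425 \times 10^{3}\ \text{Mb}} \approx 7.5,
    \]

    which is consistent with the roughly 7.4 haploid genome equivalents quoted, allowing for rounding of the average insert size.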

  18. Theoretical Characterization of the H-Bonding and Stacking Potential of Two Non-Standard Nucleobases Expanding the Genetic Alphabet

    KAUST Repository

    Chawla, Mohit

    2016-02-16

    We report a quantum chemical characterization of the non-natural (synthetic) H-bonded base pair formed by 6-amino-5-nitro-2(1H)-pyridone (Z) and 2-amino-imidazo [1,2-a]-1,3,5-triazin-4(8H)-one (P). The Z:P base pair, orthogonal to the classical G:C base pair, has been introduced in DNA molecules for expanding the genetic code. Our results indicate that the Z:P base pair closely mimics the G:C base pair both in terms of structure and stability. To clarify the role of the NO2 group on the C5 position of the Z base, we compared the stability of the Z:P base pair with that of base pairs having different functional groups on the C5 position of Z. Our results indicate that the electron donating/withdrawing properties of the group in the C5 position have a clear impact on the stability of the Z:P base pair, with the strong electron withdrawing nitro group achieving the largest stabilizing effect on the H-bonding interaction, and the strong electron donating NH2 group destabilizing the Z:P pair by almost 4 kcal/mol. Finally, our gas phase and in-water calculations confirm that the Z-nitro group reinforces the stacking interaction with its adjacent purine or pyrimidine ring.

  19. Development of a detailed BWR core thermal-hydraulic analysis method based on the Japanese post-BT standard using a best-estimate code

    International Nuclear Information System (INIS)

    Ono, H.; Mototani, A.; Kawamura, S.; Abe, N.; Takeuchi, Y.

    2004-01-01

    The post-BT standard is a new fuel integrity standard of the Atomic Energy Society of Japan that allows a temporary boiling transition condition in the evaluation of BWR anticipated operational occurrences. For application of the post-BT standard to the evaluation of BWR anticipated operational occurrences, it is important to identify which fuel assemblies, and which axial and radial positions of the fuel rods, have temporarily experienced the post-BT condition, and to evaluate how high the fuel cladding temperature rise was and how long the dryout lasted. Therefore, whole-bundle simulation, in which each fuel assembly is simulated independently by one thermal-hydraulic component, is considered to be an effective analytical method. In the present study, a best-estimate thermal-hydraulic code, TRACG02, has been modified to extend its predictive capability by implementing post-BT evaluation models, such as the post-BT heat transfer correlation and the rewetting correlation, and by enlarging the number of components used for BWR plant simulation. Based on the new evaluation method, BWR core thermal-hydraulic behavior has been analyzed for typical anticipated operational occurrence conditions. The location where boiling transition occurs and the severity of the boiling transition condition for the fuel assembly, such as the fuel cladding temperature, which are important factors in determining whether reuse of the fuel assembly can be permitted, were well predicted by the proposed evaluation method. In summary, a new evaluation method for detailed BWR core thermal-hydraulic analysis based on the post-BT standard of the Atomic Energy Society of Japan has been developed and applied to the evaluation of the post-BT standard during actual BWR plant anticipated operational occurrences. (author)

  20. [Standards for treatment in forensic committment according to § 63 and § 64 of the German criminal code : Interdisciplinary task force of the DGPPN].

    Science.gov (United States)

    Müller, J L; Saimeh, N; Briken, P; Eucker, S; Hoffmann, K; Koller, M; Wolf, T; Dudeck, M; Hartl, C; Jakovljevic, A-K; Klein, V; Knecht, G; Müller-Isberner, R; Muysers, J; Schiltz, K; Seifert, D; Simon, A; Steinböck, H; Stuckmann, W; Weissbeck, W; Wiesemann, C; Zeidler, R

    2017-08-01

    People who have been convicted of a crime due to a severe mental disorder and continue to be dangerous as a result of this disorder may be placed in a forensic psychiatric facility for improvement and safeguarding according to § 63 and § 64 of the German Criminal Code (StGB). In Germany, approximately 9000 patients are treated in clinics for forensic psychiatry and psychotherapy on the basis of § 63 of the StGB and in withdrawal centers on the basis of § 64 StGB. The laws for treatment of patients in forensic commitment are passed by the individual States, with the result that even the basic conditions differ in the individual States. While minimum requirements have already been published for the preparation of expert opinions on liability and legal prognosis, consensus standards for the treatment in forensic psychiatry have not yet been published. Against this background, in 2014 the German Society for Psychiatry and Psychotherapy, Psychosomatics and Neurology (DGPPN) commissioned an interdisciplinary task force to develop professional standards for treatment in forensic psychiatry. Legal, ethical, structural, therapeutic and prognostic standards for forensic psychiatric treatment should be described according to the current state of science. After 3 years of work the results of the interdisciplinary working group were presented in early 2017 and approved by the board of the DGPPN. The standards for the treatment in the forensic psychiatric commitment aim to initiate a discussion in order to standardize the treatment conditions and to establish evidence-based recommendations.

  1. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding

    Directory of Open Access Journals (Sweden)

    Charlotte D’Hulst

    2016-07-01

    Full Text Available Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these “MouSensors.” In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction.

  2. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding.

    Science.gov (United States)

    D'Hulst, Charlotte; Mina, Raena B; Gershon, Zachary; Jamet, Sophie; Cerullo, Antonio; Tomoiaga, Delia; Bai, Li; Belluscio, Leonardo; Rogers, Matthew E; Sirotin, Yevgeniy; Feinstein, Paul

    2016-07-26

    Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these "MouSensors." In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. The revised APTA code of ethics for the physical therapist and standards of ethical conduct for the physical therapist assistant: theory, purpose, process, and significance.

    Science.gov (United States)

    Swisher, Laura Lee; Hiller, Peggy

    2010-05-01

    In June 2009, the House of Delegates (HOD) of the American Physical Therapy Association (APTA) passed a major revision of the APTA Code of Ethics for physical therapists and the Standards of Ethical Conduct for the Physical Therapist Assistant. The revised documents will be effective July 1, 2010. The purposes of this article are: (1) to provide a historical, professional, and theoretical context for this important revision; (2) to describe the 4-year revision process; (3) to examine major features of the documents; and (4) to discuss the significance of the revisions from the perspective of the maturation of physical therapy as a doctoring profession. PROCESS OF REVISION: The process for revision is delineated within the context of history and the Bylaws of APTA. FORMAT, STRUCTURE, AND CONTENT OF REVISED CORE ETHICS DOCUMENTS: The revised documents represent a significant change in format, level of detail, and scope of application. Previous APTA Codes of Ethics and Standards of Ethical Conduct for the Physical Therapist Assistant have delineated very broad general principles, with specific obligations spelled out in the Ethics and Judicial Committee's Guide for Professional Conduct and Guide for Conduct of the Physical Therapist Assistant. In contrast to the current documents, the revised documents address all 5 roles of the physical therapist, delineate ethical obligations in organizational and business contexts, and align with the tenets of Vision 2020. The significance of this revision is discussed within historical parameters, the implications for physical therapists and physical therapist assistants, the maturation of the profession, societal accountability and moral community, potential regulatory implications, and the inclusive and deliberative process of moral dialogue by which changes were developed, revised, and approved.

  4. FY 1999 project on the development of new industry support type international standards. Standardization of evaluation method of the genetic testing system (Separate volume); 1999 nendo shinki sangyo shiengata kokusai hyojunka kaihatsu jigyo seika hokokusho. Idenshi kensa system no hyoka hoho no hyojunka (bessatsu)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    The separate volume included the proceedings of the re-consignee joint meeting on the standardization of evaluation method of the genetic testing system, the proceedings/data of the meeting of the committee of the standardization of evaluation method of the genetic testing system, etc. The data of the meeting are about a plan to execute the standardization of evaluation method of the genetic testing system, standardization of forms for reporting the results of the genetic test, a trial guideline for standardization of the genetic testing, standardization of evaluation method of the genetic testing system, etc. Moreover, the volume included 11 literature papers overseas on the above-mentioned themes, 'reports on the surveys in Europe and the U.S. on the standardization of evaluation method of the genetic testing system,' etc. (NEDO)

  6. The effect of genetic bottlenecks and inbreeding on the incidence of two major autoimmune diseases in standard poodles, sebaceous adenitis and Addison's disease.

    Science.gov (United States)

    Pedersen, Niels C; Brucker, Lynn; Tessier, Natalie Green; Liu, Hongwei; Penedo, Maria Cecilia T; Hughes, Shayne; Oberbauer, Anita; Sacks, Ben

    2015-01-01

    Sebaceous adenitis (SA) and Addison's disease (AD) increased rapidly in incidence among Standard Poodles after the mid-twentieth century. Previous attempts to identify specific genetic causes using genome wide association studies and interrogation of the dog leukocyte antigen (DLA) region have been non-productive. However, such studies led us to hypothesize that positive selection for desired phenotypic traits that arose in the mid-twentieth century led to intense inbreeding and the inadvertent amplification of AD and SA associated traits. This hypothesis was tested with genetic studies of 761 Standard, Miniature, and Miniature/Standard Poodle crosses from the USA, Canada and Europe, coupled with extensive pedigree analysis of thousands more dogs. Genome-wide diversity across the world-wide population was measured using a panel of 33 short tandem repeat (STR) loci. Allele frequency data were also used to determine the internal relatedness of individual dogs within the population as a whole. Assays based on linkage between STR genomic loci and DLA genes were used to identify class I and II haplotypes and disease associations. Genetic diversity statistics based on genomic STR markers indicated that Standard Poodles from North America and Europe were closely related and reasonably diverse across the breed. However, genetic diversity statistics, internal relatedness, principal coordinate analysis, and DLA haplotype frequencies showed a marked imbalance with 30 % of the diversity in 70 % of the dogs. Standard Poodles with SA and AD were strongly linked to this inbred population, with dogs suffering with SA being the most inbred. No single strong association was found between STR defined DLA class I or II haplotypes and SA or AD in the breed as a whole, although certain haplotypes present in a minority of the population appeared to confer moderate degrees of risk or protection against either or both diseases. Dogs possessing minor DLA class I haplotypes were half as

  7. Work relating to defect assessment undertaken by activity group 2 of the European Commission's working group on codes and standards. WGCS overview

    International Nuclear Information System (INIS)

    Townley, C.H.A.; Guinovart, J.

    1995-01-01

    For about twenty years, the Working Group on Codes and Standards has been an Advisory Group of the European Commission, and three sub-groups, AG1, AG2 and AG3, were formed to consider manufacture and inspection, structural mechanics, and materials topics respectively. Representation on the Working Group and its sub-groups comes from designers, utilities and atomic energy agencies in those member States with active nuclear power programmes. There has also been a very valuable input from universities and research organisations in the countries concerned. The method of working is to identify topics on which there is a difference of opinion; projects are then set up to review the up-to-date scientific and technological knowledge. The investigations are undertaken collaboratively by specialists from as many countries as can contribute, and there is an obligation to reach conclusions that can be put to practical use by engineers. While the Working Group and its sub-groups are not directly involved in the production of standards, they provide a very important input to the pre-standardization process. The work produced by AG2 has covered a wide range of subjects associated with structural integrity, mainly concerning Fast Breeder Reactors. Since 1991 the Group has progressively set up Light Water Reactor programmes; currently, most of the effort is devoted to Thermal Reactors and, to a minor extent, to Fast Breeder Reactors. The present paper is mainly concerned with those aspects of the AG2 activities which have a bearing on defect assessment. Although the work was initiated as part of the FBR programme, it must be remembered that the greater part of it can be extended to a wide range of high-temperature plants. Concerning the LWR programmes, an overview of selected current studies is provided in this paper. (authors). 23 refs

  8. The IPEM code of practice for determination of the reference air kerma rate for HDR 192Ir brachytherapy sources based on the NPL air kerma standard

    International Nuclear Information System (INIS)

    Bidmead, A M; Sander, T; Nutbrown, R F; Locks, S M; Lee, C D; Aird, E G A; Flynn, A

    2010-01-01

    This paper contains the recommendations of the high dose rate (HDR) brachytherapy working party of the UK Institute of Physics and Engineering in Medicine (IPEM). The recommendations consist of a Code of Practice (COP) for the UK for measuring the reference air kerma rate (RAKR) of HDR 192Ir brachytherapy sources. In 2004, the National Physical Laboratory (NPL) commissioned a primary standard for the realization of RAKR of HDR 192Ir brachytherapy sources. This has meant that it is now possible to calibrate ionization chambers, directly traceable to an air kerma standard, using a 192Ir source (Sander and Nutbrown 2006 NPL Report DQL-RD 004 (Teddington: NPL) http://publications.npl.co.uk). In order to use the source specification in terms of either the RAKR, K̇_R (ICRU 1985 ICRU Report No 38 (Washington, DC: ICRU); ICRU 1997 ICRU Report No 58 (Bethesda, MD: ICRU)), or the air kerma strength, S_K (Nath et al 1995 Med. Phys. 22 209-34), it has been necessary to develop algorithms that can calculate the dose at any point around brachytherapy sources within the patient tissues. The AAPM TG-43 protocol (Nath et al 1995 Med. Phys. 22 209-34) and the 2004 update TG-43U1 (Rivard et al 2004 Med. Phys. 31 633-74) have been developed more fully than any other protocol and are widely used in commercial treatment planning systems. Since the TG-43 formalism uses the quantity air kerma strength, whereas this COP uses RAKR, a unit conversion from RAKR to air kerma strength was included in the appendix to this COP. It is recommended that the measured RAKR determined with a calibrated well chamber traceable to the NPL 192Ir primary standard be used in the treatment planning system. The measurement uncertainty in the source calibration based on the system described in this COP has been reduced considerably compared with other methods based on interpolation techniques.
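    For orientation, the conversion between the two source-strength quantities in the standard TG-43 formalism is

    \[
    S_K = \dot{K}_R \, d_{\mathrm{ref}}^{2}, \qquad d_{\mathrm{ref}} = 1\ \mathrm{m},
    \]

    so a reference air kerma rate of 1 μGy·h⁻¹ at 1 m corresponds numerically to an air kerma strength of 1 U = 1 μGy·m²·h⁻¹; the exact form given in the appendix of the COP should be followed in practice.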

  9. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydraulic behavior of water-cooled and water-moderated reactors at high or low pressure, with boiling permitted and with fuel elements assumed to be flat plates. The code computes the flow distribution among parallel channels, coupled or not by conduction across the plates, for imposed pressure-drop or flowrate conditions that may or may not vary with time; the power can be coupled to a reactor kinetics calculation or supplied by the code user, and a schematic representation of safety-rod behavior is included. Cactus is a one-dimensional, multi-channel code; its complement, FLID, is a single-channel, two-dimensional code. (authors)

  10. All about Genetics (For Parents)

    Science.gov (United States)


  11. Evaluation of liquefaction potential of soil based on standard penetration test using multi-gene genetic programming model

    Science.gov (United States)

    Muduli, Pradyut; Das, Sarat

    2014-06-01

    This paper discusses the evaluation of the liquefaction potential of soil based on standard penetration test (SPT) data using an evolutionary artificial intelligence technique, multi-gene genetic programming (MGGP). The liquefaction classification accuracy (94.19%) of the developed liquefaction index (LI) model is found to be better than that of an available artificial neural network (ANN) model (88.37%) and on par with an available support vector machine (SVM) model (94.19%) on the basis of the testing data. Further, an empirical equation obtained using MGGP is presented to approximate the unknown limit state function representing the cyclic resistance ratio (CRR) of soil, based on the developed LI model. Using an independent database of 227 cases, the overall rates of successful prediction of the occurrence of liquefaction and non-liquefaction are found to be 87%, 86%, and 84% for the developed MGGP-based model, the available ANN model, and the statistical model, respectively, on the basis of the calculated factor of safety (F_s) against liquefaction occurrence.
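    A minimal sketch of the final classification step described above, assuming the CRR comes from a fitted model (such as the MGGP equation) and the cyclic stress ratio (CSR) from the standard Seed-Idriss demand expression; all input values and the CRR figure below are illustrative.

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
    """Seed-Idriss seismic demand: CSR = 0.65 * (a_max/g) * (sigma_v/sigma_v') * r_d."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr, csr):
    """F_s = CRR / CSR; F_s <= 1 is classified as liquefiable."""
    return crr / csr

# Illustrative values: CRR from some fitted model, demand from a 0.2 g event.
csr = cyclic_stress_ratio(a_max_g=0.2, sigma_v=100.0, sigma_v_eff=60.0, r_d=0.95)
crr = 0.18   # hypothetical model output for the soil layer
fs = factor_of_safety(crr, csr)
print(f"CSR = {csr:.3f}, F_s = {fs:.2f} ->",
      "liquefaction" if fs <= 1.0 else "no liquefaction")
```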

  12. A bacterial genetic screen identifies functional coding sequences of the insect mariner transposable element Famar1 amplified from the genome of the earwig, Forficula auricularia.

    Science.gov (United States)

    Barry, Elizabeth G; Witherspoon, David J; Lampe, David J

    2004-02-01

    Transposons of the mariner family are widespread in animal genomes and have apparently infected them by horizontal transfer. Most species carry only old defective copies of particular mariner transposons that have diverged greatly from their active horizontally transferred ancestor, while a few contain young, very similar, and active copies. We report here the use of a whole-genome screen in bacteria to isolate somewhat diverged Famar1 copies from the European earwig, Forficula auricularia, that encode functional transposases. Functional and nonfunctional coding sequences of Famar1 and nonfunctional copies of Ammar1 from the European honey bee, Apis mellifera, were sequenced to examine their molecular evolution. No selection for sequence conservation was detected in any clade of a tree derived from these sequences, not even on branches leading to functional copies. This agrees with the current model for mariner transposon evolution that expects neutral evolution within particular hosts, with selection for function occurring only upon horizontal transfer to a new host. Our results further suggest that mariners are not finely tuned genetic entities and that a greater amount of sequence diversification than had previously been appreciated can occur in functional copies in a single host lineage. Finally, this method of isolating active copies can be used to isolate other novel active transposons without resorting to reconstruction of ancestral sequences.

  13. Improvement of core monitoring code cecor by the virtual segmentation of the self powered neutron detector loaded at Korean Standard Nuclear Plant

    International Nuclear Information System (INIS)

    Choi, T.; Jung, Y.S.

    2006-01-01

    Full text: Korean Standard Nuclear Plants use Self-Powered Neutron Detectors (SPNDs) to measure the neutron flux in the reactor core. Each SPND is 40 cm high, and detectors are located at five axial positions and 45 radial locations. The design code simulates the reactor core by segmenting it into 'nodes', normally 20 cm in height. The axial height of a detector is thus larger than that of a node, and this mismatch may produce errors for axially complex flux shapes. Analysis of the detector signals indeed showed errors for non-cosine axial flux shapes. To reduce these errors, we divided each detector by introducing a virtual boundary, so that each of the five axial detectors had two virtual segments and its signal was apportioned between them; the more virtual detector signals were obtained, the more accurate the reconstructed axial shape became. The model with virtual segmentation in each detector gave smaller deviations than the case without virtual segmentation (the current model). In particular, after the middle of the cycle in the initial core, the axial neutron flux shape changes to a saddle type; the current model then showed some error in the root mean square (RMS) difference between the measured and calculated values, while the virtual segmentation model gave better agreement in that case.

  14. The lack of foundation in the mechanism on which are based the physico-chemical theories for the origin of the genetic code is counterposed to the credible and natural mechanism suggested by the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2016-06-21

    I analyze the mechanism on which the majority of theories that place the physico-chemical properties of amino acids at the center of the origin of the genetic code are based. As this mechanism relies on excessive mutational steps, I conclude that it could not have been operative or, if operative, that it would not have allowed a full realization of the predictions of these theories, because the mechanism evidently contained a high degree of indeterminacy. I do this by arguing against the four-column theory of the origin of the genetic code (Higgs, 2009) and by replying to the criticism that was directed towards the coevolution theory of the origin of the genetic code. In this context, I suggest a new hypothesis that clarifies the mechanism by which the codon domains of the precursor amino acids would have evolved, as predicted by the coevolution theory. This mechanism would have used particular elongation factors that constrained the evolution of all amino acids belonging to a given biosynthetic family to the progenitor pre-tRNA that first recognized the first codons that evolved in the codon domain of a given precursor amino acid. This happened because the elongation factors recognized two characteristics of the progenitor pre-tRNAs of precursor amino acids, which prevented the elongation factors from recognizing the pre-tRNAs belonging to biosynthetic families of different precursor amino acids. Finally, I analyze, by means of Fisher's exact test, the distribution within the genetic code of the biosynthetic classes of amino acids and of the polarity values of amino acids. This analysis would seem to support the biosynthetic classes of amino acids over the polarity values as the main factor that led to the structuring of the genetic code, with the physico-chemical properties of amino acids playing only a subsidiary role in this evolution. As a whole, the full analysis brings to the conclusion that the coevolution theory of the origin of the

  15. Genetic Mimetics of Mycobacterium tuberculosis and Methicillin-Resistant Staphylococcus aureus as Verification Standards for Molecular Diagnostics.

    Science.gov (United States)

    Machowski, Edith Erika; Kana, Bavesh Davandra

    2017-12-01

    Molecular diagnostics have revolutionized the management of health care through enhanced detection of disease or infection and effective enrollment into treatment. In recognition of this, the World Health Organization approved the rollout of nucleic acid amplification technologies for identification of Mycobacterium tuberculosis using platforms such as GeneXpert MTB/RIF, the GenoType MTBDRplus line probe assay, and, more recently, GeneXpert MTB/RIF Ultra. These assays can simultaneously detect tuberculosis infection and assess rifampin resistance. However, their widespread use in health systems requires verification and quality assurance programs. To enable development of these, we report the construction of genetically modified strains of Mycobacterium smegmatis that mimic the profile of Mycobacterium tuberculosis on both the GeneXpert MTB/RIF and the MTBDRplus line probe diagnostic tests. Using site-specific gene editing, we also created derivatives that faithfully mimic the diagnostic result of rifampin-resistant M. tuberculosis, with mutations at positions 513, 516, 526, 531, and 533 in the rifampin resistance-determining region of the rpoB gene. Next, we extended this approach to other diseases and demonstrated that a Staphylococcus aureus gene sequence can be introduced into M. smegmatis to generate a positive response for the SCCmec probe in the GeneXpert SA Nasal Complete molecular diagnostic cartridge, designed for identification of methicillin-resistant S. aureus. These biomimetic strains are cost-effective, have low biohazard content, accurately mimic drug resistance, and can be produced with relative ease, thus illustrating their potential for widespread use as verification standards for diagnosis of a variety of diseases. Copyright © 2017 American Society for Microbiology.

  16. A novel neutron energy spectrum unfolding code using particle swarm optimization

    International Nuclear Information System (INIS)

    Shahabinejad, H.; Sohrabpour, M.

    2017-01-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of standard spectra and of the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code had previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO code has been shown to be nearly two times faster than the TGASU code. - Highlights: • Introducing a novel method for neutron spectrum unfolding. • Implementation of a particle swarm optimization code for neutron unfolding. • Comparison of the results of the PSO code with those of the recently published TGASU code. • The results of the PSO code match those of the TGASU code. • Greater convergence rate of the implemented PSO code compared with the TGASU code.
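    A minimal sketch of the unfolding idea, assuming a known response matrix R and a measured pulse-height distribution M: a basic particle swarm searches for a non-negative spectrum phi that minimizes ||R·phi − M||². The matrix, data, swarm size and coefficients below are illustrative and are not those of the SDPSO code.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative 12-channel response matrix and "measured" pulse-height data.
n_bins, n_channels = 8, 12
R = np.abs(rng.normal(1.0, 0.3, size=(n_channels, n_bins)))   # response matrix
phi_true = np.abs(rng.normal(1.0, 0.5, size=n_bins))          # hidden spectrum
M = R @ phi_true + rng.normal(0, 0.01, size=n_channels)       # measurement

def cost(phi):
    return np.sum((R @ np.abs(phi) - M) ** 2)   # |phi| enforces non-negativity

# Basic particle swarm optimization.
n_particles, w, c1, c2 = 40, 0.7, 1.5, 1.5
x = rng.uniform(0, 2, size=(n_particles, n_bins))      # positions
v = np.zeros_like(x)                                   # velocities
pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(500):
    r1, r2 = rng.uniform(size=x.shape), rng.uniform(size=x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    costs = np.array([cost(p) for p in x])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("relative error:",
      np.linalg.norm(np.abs(gbest) - phi_true) / np.linalg.norm(phi_true))
```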

  17. A genetic polymorphism in the coding region of the gastric intrinsic factor gene (GIF) is associated with congenital intrinsic factor deficiency.

    Science.gov (United States)

    Gordon, Marilyn M; Brada, Nancy; Remacha, Angel; Badell, Isabel; del Río, Elisabeth; Baiget, Montserrat; Santer, René; Quadros, Edward V; Rothenberg, Sheldon P; Alpers, David H

    2004-01-01

    Congenital intrinsic factor (IF) deficiency is a disorder characterized by megaloblastic anemia due to the absence of gastric IF (GIF, GenBank NM_005142) and of GIF antibodies, with probable autosomal recessive inheritance. Most of the reported patients are isolated cases without genetic studies of the parents or siblings. Complete exonic sequences were determined from the PCR products generated from genomic DNA of five affected individuals. All probands had the identical variant (g.68A>G) at the second position of the fifth codon of the coding sequence of the gene, which introduces a restriction enzyme site for MspI and predicts a change in the mature protein from glutamine-5 (CAG) to arginine-5 (CGG). Three subjects were homozygous for this base exchange and two subjects were heterozygous, one of whom was apparently a compound heterozygote at positions 1 and 2 of the fifth codon ([g.67C>G] + [g.68A>G]). The other patient, heterozygous for position 2, had one heterozygous unaffected parent. Most parents were heterozygous for this base exchange, confirming the pattern of autosomal recessive inheritance for congenital IF deficiency. cDNA encoding GIF was mutated at base pair g.68 (A>G) and expressed in COS-7 cells. The apparent size, secretion rate, and sensitivity to pepsin hydrolysis of the expressed IF were similar to those of native IF. The allelic frequency of g.68A>G was 0.067 and 0.038 in two control populations. This sequence aberration is not the cause of the phenotype, but is associated with the genotype of congenital IF deficiency and could serve as a marker for inheritance of this disorder. Copyright 2003 Wiley-Liss, Inc.
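    The predicted amino acid change follows directly from the standard genetic code; the sketch below applies the g.68A>G substitution to the fifth codon and looks up both codons (only the two relevant codon assignments are included):

```python
# Minimal lookup restricted to the two codons discussed in the abstract;
# both assignments follow the standard genetic code.
CODON_TABLE = {"CAG": "Gln", "CGG": "Arg"}

wild_type_codon = "CAG"                      # fifth codon of the GIF coding sequence
# g.68A>G changes the second position of this codon (A -> G).
variant_codon = wild_type_codon[0] + "G" + wild_type_codon[2]

print(f"{wild_type_codon} ({CODON_TABLE[wild_type_codon]}) -> "
      f"{variant_codon} ({CODON_TABLE[variant_codon]})")
```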

  18. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  19. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  20. Current and proposed revisions, changes, and modifications to American codes and standards to address packaging, handling, and transportation of radioactive materials and how they relate to comparable international regulations

    International Nuclear Information System (INIS)

    Borter, W.H.; Froehlich, C.H.

    2004-01-01

    This paper addresses current and proposed revisions, additions, and modifications to the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPVC) (i.e., ''the Code'') Section III, Division 3 and American National Standards Institute (ANSI)/ASME N14.6. It provides insight into the ongoing processes of the associated committees and highlights important revisions, changes, and modifications to this Code and Standard. The ASME Code has developed and issued Division 3 to address items associated with the transportation and storage of radioactive materials. It currently only addresses ''General Requirements'' in Subsection WA and ''Class TP (Type B) Containments'' (Transportation Packages) in Subsection WB, but is in the process of adding a new Subsection WC to address ''Class SC'' (Storage Containments). ANSI/ASME Standard N14.6 interacts with components constructed to Division 3 by addressing special lifting devices for radioactive material shipping containers. This Standard is in the process of a complete re-write. This Code and Standard can be classified as ''dynamic'' in that their committees meet at least four times a year to evaluate proposed modifications and additions that reflect current safety practices in the nuclear industry. These evaluations include the possible addition of new materials, fabrication processes, examination methods, and testing requirements. An overview of this ongoing process is presented in this paper along with highlights of the more important proposed revisions, changes, and modifications and how they relate to United States (US) and international regulations and guidance such as International Atomic Energy Agency (IAEA) Requirement No. TS-R-1.

  1. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  2. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)
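
    For orientation, the homogenisation procedure mentioned above typically amounts to flux-volume weighting of region-wise cross sections so that reaction rates are preserved; the short sketch below shows that generic textbook recipe for one energy group, not WIMSD's specific implementation.

        # Generic flux-volume weighting used in lattice homogenisation (textbook form,
        # not WIMSD's internal routine): collapse region-wise cross sections into one
        # homogenised value that preserves the total reaction rate.
        def homogenise(sigmas, fluxes, volumes):
            """sigmas, fluxes, volumes: per-region macroscopic cross section,
            average scalar flux and volume for a single energy group."""
            reaction_rate = sum(s * f * v for s, f, v in zip(sigmas, fluxes, volumes))
            flux_volume = sum(f * v for f, v in zip(fluxes, volumes))
            return reaction_rate / flux_volume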

  3. CT dosimetry computer codes: Their influence on radiation dose estimates and the necessity for their revision under new ICRP radiation protection standards

    International Nuclear Information System (INIS)

    Kim, K. P.; Lee, J.; Bolch, W. E.

    2011-01-01

    Computed tomography (CT) dosimetry computer codes have been most commonly used due to their user friendliness, but with little consideration for potential uncertainty in estimated organ dose and their underlying limitations. Generally, radiation doses calculated with different CT dosimetry computer codes were comparable, although relatively large differences were observed for some specific organs or tissues. The largest difference in radiation doses calculated using different computer codes was observed for Siemens Sensation CT scanners. Radiation doses varied with patient age and sex. Younger patients and adult females receive a higher radiation dose in general than adult males for the same CT technique factors. There are a number of limitations of current CT dosimetry computer codes. These include unrealistic modelling of the human anatomy, a limited number of organs and tissues for dose calculation, inability to alter patient height and weight, and non-applicability to new CT technologies. Therefore, further studies are needed to overcome these limitations and to improve CT dosimetry. (authors)

  4. A novel pseudoderivative-based mutation operator for real-coded adaptive genetic algorithms [v2; ref status: indexed, http://f1000r.es/1td

    OpenAIRE

    Maxinder S Kanwal; Avinash S Ramesh; Lauren A Huang

    2013-01-01

    Recent development of large databases, especially those in genetics and proteomics, is pushing the development of novel computational algorithms that implement rapid and accurate search strategies. One successful approach has been to use artificial intelligence methods, including pattern recognition (e.g. neural networks) and optimization techniques (e.g. genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate that ...

  5. Uganda; Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on the following topics: Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, and Payment Systems

    OpenAIRE

    International Monetary Fund

    2003-01-01

    This paper presents findings of Uganda’s Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, Insurance Regulation, Corporate Governance, and Payment Systems. The banking system in Uganda, which dominates the financial system, is fundamentally sound, more resilient than in the past, and currently poses no threat to macroeconomic stability. A major disruption ...

  6. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to introducing the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we consider also some relationships between coding partitions and varieties of codes.

  7. A novel pseudoderivative-based mutation operator for real-coded adaptive genetic algorithms [v2; ref status: indexed, http://f1000r.es/1td

    Directory of Open Access Journals (Sweden)

    Maxinder S Kanwal

    2013-11-01

    Full Text Available Recent development of large databases, especially those in genetics and proteomics, is pushing the development of novel computational algorithms that implement rapid and accurate search strategies. One successful approach has been to use artificial intelligence methods, including pattern recognition (e.g. neural networks) and optimization techniques (e.g. genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate that is derived from comparing the fitness values of successive generations. We propose a novel pseudoderivative-based mutation rate operator designed to allow a genetic algorithm to escape local optima and successfully continue to the global optimum. Once proven successful, this algorithm can be implemented to solve real problems in neurology and bioinformatics. As a first step towards this goal, we tested our algorithm on two 3-dimensional surfaces with multiple local optima, but only one global optimum, as well as on the N-queens problem, an applied problem in which the function that maps the curve is implicit. For all tests, the adaptive mutation rate allowed the genetic algorithm to find the global optimal solution, performing significantly better than other search methods, including genetic algorithms that implement fixed mutation rates.
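
    The abstract above describes the operator only in outline, so the sketch below illustrates the general idea of a mutation rate adapted from the change in fitness between successive generations; the update rule, thresholds and step sizes here are assumptions for illustration and not the authors' exact pseudoderivative operator.

        # Illustrative sketch: raise the mutation rate when the best fitness stagnates
        # between generations, lower it while the population is still improving -- a
        # crude "pseudoderivative" of fitness progress.
        import random

        def adaptive_mutation_rate(prev_best, curr_best, rate, lo=0.001, hi=0.5):
            improvement = curr_best - prev_best      # discrete "derivative" of fitness
            if improvement <= 1e-12:                 # stagnation: explore more
                return min(hi, rate * 1.5)
            return max(lo, rate * 0.8)               # progress: exploit more

        def mutate(individual, rate, step=0.1):
            return [g + random.gauss(0.0, step) if random.random() < rate else g
                    for g in individual]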

  8. Building Codes and Regulations.

    Science.gov (United States)

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  9. An approach based on genetic algorithms with coding in real for the solution of a DC OPF to hydrothermal systems; Uma abordagem baseada em algoritmos geneticos com codificacao em real para a solucao de um FPO DC para sistemas hidrotermicos

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, Diego R.; Silva, Alessandro L. da; Luciano, Edson Jose Rezende; Nepomuceno, Leonardo [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. de Engenharia Eletrica], Emails: diego_eng.eletricista@hotmail.com, alessandrolopessilva@uol.com.br, edson.joserl@uol.com.br, leo@feb.unesp.br

    2009-07-01

    Problems of DC Optimal Power Flow (OPF) have been solved by various conventional optimization methods. When the modeling of the DC OPF involves discontinuous or non-differentiable functions, the use of solution methods based on conventional optimization is often not possible because of the difficulty in calculating the gradient vectors at points of discontinuity/non-differentiability of these functions. This paper proposes a method for solving the DC OPF based on Genetic Algorithms (GA) with real coding. The proposed GA has specific genetic operators to improve the quality and viability of the solution. The results are analyzed for an IEEE test system, and its solutions are compared, when possible, with those obtained by a primal-dual logarithmic barrier interior point method. The results highlight the robustness of the method and the feasibility of obtaining solutions for real systems.
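
    As a hedged sketch of how constraints are commonly handled inside a real-coded GA for problems of this kind (the paper's own operators and penalty scheme may differ), the fitness below adds penalised power-balance and generator-limit violations to a quadratic generation cost; line-flow constraints are omitted for brevity and all names are illustrative.

        # Penalised fitness for a simplified DC OPF chromosome (vector of generator
        # outputs). Smaller is better; infeasible solutions are penalised rather than
        # discarded so the GA can still explore near the constraint boundary.
        import numpy as np

        def penalised_fitness(p_gen, a, b, c, demand, p_min, p_max, penalty=1e4):
            cost = float(np.sum(a * p_gen**2 + b * p_gen + c))        # generation cost
            balance = abs(float(np.sum(p_gen)) - demand)              # power balance mismatch
            limits = float(np.sum(np.maximum(p_min - p_gen, 0.0)
                                  + np.maximum(p_gen - p_max, 0.0)))  # generator limits
            return cost + penalty * (balance + limits)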

  10. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  11. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  12. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  13. Energy Building Regulations: The Effect of the Federal Performance Standards on Building Code Administration and the Conservation of Energy in New Buildings.

    Science.gov (United States)

    Kopper, William D.

    1980-01-01

    Explores the changes in the administration and enforcement of building regulations that will be engendered by the proposed federal energy building standards. Also evaluates the effectiveness of those standards in meeting congressional intent. Available from U.C. Davis Law Review, School of Law, Martin Luther King Jr. Hall, University of…

  14. Genetics researchers’ and IRB professionals’ attitudes toward genetic research review: a comparative analysis

    Science.gov (United States)

    Edwards, Karen L.; Lemke, Amy A.; Trinidad, Susan B.; Lewis, Susan M.; Starks, Helene; Snapinn, Katherine W.; Griffin, Mary Quinn; Wiesner, Georgia L.; Burke, Wylie

    2012-01-01

    Purpose Genetic research involving human participants can pose challenging questions related to ethical and regulatory standards for research oversight. However, few empirical studies describe how genetic researchers and institutional review board (IRB) professionals conceptualize ethical issues in genetic research or where common ground might exist. Methods Parallel online surveys collected information from human genetic researchers (n = 351) and IRB professionals (n = 208) regarding their views about human participant oversight for genetic protocols. Results A range of opinions were observed within groups on most issues. In both groups, a minority thought it likely that people would be harmed by participation in genetic research or identified from coded genetic data. A majority of both groups agreed that reconsent should be required for four of the six scenarios presented. Statistically significant differences were observed between groups on some issues, with more genetic researcher respondents trusting the confidentiality of coded data, fewer expecting harms from reidentification, and fewer considering reconsent necessary in certain scenarios. Conclusions The range of views observed within and between IRB and genetic researcher groups highlights the complexity and unsettled nature of many ethical issues in genome research. Our findings also identify areas where researcher and IRB views diverge and areas of common ground. PMID:22241102

  15. Genetic diversity of the HLA-G coding region in Amerindian populations from the Brazilian Amazon: a possible role of natural selection.

    Science.gov (United States)

    Mendes-Junior, C T; Castelli, E C; Meyer, D; Simões, A L; Donadi, E A

    2013-12-01

    HLA-G has an important role in the modulation of the maternal immune system during pregnancy, and evidence that balancing selection acts in the promoter and 3'UTR regions has been previously reported. To determine whether selection acts on the HLA-G coding region in the Amazon Rainforest, exons 2, 3 and 4 were analyzed in a sample of 142 Amerindians from nine villages of five isolated tribes that inhabit the Central Amazon. Six previously described single-nucleotide polymorphisms (SNPs) were identified and the Expectation-Maximization (EM) and PHASE algorithms were used to computationally reconstruct SNP haplotypes (HLA-G alleles). A new HLA-G allele, which originated in Amerindian populations by a crossing-over event between two widespread HLA-G alleles, was identified in 18 individuals. Neutrality tests showed that natural selection plays a complex part in the HLA-G coding region. Although balancing selection is the type of selection that shapes variability at a local level (Native American populations), we have also shown that purifying selection may occur on a worldwide scale. Moreover, the balancing selection does not seem to act on the coding region as strongly as it acts on the flanking regulatory regions, and such a coding signature may actually reflect a hitchhiking effect.

  16. Building codes : obstacle or opportunity?

    Science.gov (United States)

    Alberto Goetzl; David B. McKeever

    1999-01-01

    Building codes are critically important in the use of wood products for construction. The codes contain regulations that are prescriptive or performance related for various kinds of buildings and construction types. A prescriptive standard might dictate that a particular type of material be used in a given application. A performance standard requires that a particular...

  17. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use with very little effort.

  18. Error-Detecting Identification Codes for Algebra Students.

    Science.gov (United States)

    Sutherland, David C.

    1990-01-01

    Discusses common error-detecting identification codes using linear algebra terminology to provide an interesting application of algebra. Presents examples from the International Standard Book Number, the Universal Product Code, bank identification numbers, and the ZIP code bar code. (YP)
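
    Two of the schemes cited above are easy to reproduce and make the linear-algebra flavour concrete: the ISBN-10 weighted sum must vanish modulo 11 (with 'X' standing for 10), and the UPC check digit completes a weighted sum to a multiple of 10. The sketch below shows both standard calculations.

        # ISBN-10: weights 10..1, valid when the weighted sum is divisible by 11.
        # UPC-A: 3x the digits in odd positions plus the even-position digits, plus
        # the check digit, must be divisible by 10.
        def isbn10_is_valid(isbn: str) -> bool:
            digits = [10 if c.upper() == "X" else int(c) for c in isbn if c.isalnum()]
            if len(digits) != 10:
                return False
            return sum(w * d for w, d in zip(range(10, 0, -1), digits)) % 11 == 0

        def upc_check_digit(first_11: str) -> int:
            d = [int(c) for c in first_11]
            total = 3 * sum(d[0::2]) + sum(d[1::2])
            return (10 - total % 10) % 10

        print(isbn10_is_valid("0-306-40615-2"))  # True
        print(upc_check_digit("03600029145"))    # 2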

  19. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  20. Genetic variants in promoters and coding regions of the muscle glycogen synthase and the insulin-responsive GLUT4 genes in NIDDM

    DEFF Research Database (Denmark)

    Bjørbaek, C; Echwald, Søren Morgenthaler; Hubricht, P

    1994-01-01

    To examine the hypothesis that variants in the regulatory or coding regions of the glycogen synthase (GS) and insulin-responsive glucose transporter (GLUT4) genes contribute to insulin-resistant glucose processing of muscle from non-insulin-dependent diabetes mellitus (NIDDM) patients, promoter...... volunteers. By applying inverse polymerase chain reaction and direct DNA sequencing, 532 base pairs (bp) of the GS promoter were identified and the transcriptional start site determined by primer extension. SSCP scanning of the promoter region detected five single nucleotide substitutions, positioned at 42......'-untranslated region, and the coding region of the GLUT4 gene showed four polymorphisms, all single nucleotide substitutions, positioned at -581, 1, 30, and 582. None of the three changes in the regulatory region of the gene had any major influence on expression of the GLUT4 gene in muscle. The variant at 582...

  1. Revision of the recommended international general standard for irradiated foods and of the recommended international code of practice for the operation of radiation facilities used for the treatment of foods

    International Nuclear Information System (INIS)

    1981-11-01

    In view of the findings and statements of the Joint FAO/IAEA/WHO Expert Committee on the Wholesomeness of Irradiated Food, convened in Geneva from 27 October to 3 November 1980, a Consultation Group, convened in Geneva from 1 to 3 July 1981, suggested the revision of the Recommended International General Standard for Irradiated Foods and of the Recommended International Code of Practice for the Operation of Radiation Facilities. The proposed changes are given and justified, and the revised wording of the documents is presented.

  2. Image Steganography In Securing Sound File Using Arithmetic Coding Algorithm, Triple Data Encryption Standard (3DES) and Modified Least Significant Bit (MLSB)

    Science.gov (United States)

    Nasution, A. B.; Efendi, S.; Suwilo, S.

    2018-04-01

    The amount of data inserted as audio samples using 8 bits with the LSB algorithm affects the PSNR value, which results in changes to the quality of the cover image after insertion (fidelity). In this research, audio samples are therefore inserted using 5 bits with the MLSB algorithm to reduce the amount of embedded data, and the audio samples are first compressed with the Arithmetic Coding algorithm to reduce file size. Encryption with the Triple DES algorithm is also applied to better secure the audio samples. The resulting PSNR values are above 50 dB, so it can be concluded that the image quality is still good, since the PSNR value exceeds 40 dB.
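
    To make the embedding idea concrete, the sketch below shows plain single-bit LSB replacement into 8-bit cover samples; it is a simplified stand-in, not the paper's MLSB variant (which embeds 5 bits per unit into image pixels and carries an Arithmetic-Coding-compressed, 3DES-encrypted audio payload).

        # Plain LSB embedding/extraction over 8-bit cover samples (simplified sketch).
        def embed_lsb(cover: bytes, payload: bytes) -> bytes:
            bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
            if len(bits) > len(cover):
                raise ValueError("payload too large for cover")
            stego = bytearray(cover)
            for i, bit in enumerate(bits):
                stego[i] = (stego[i] & 0xFE) | bit   # replace the least significant bit
            return bytes(stego)

        def extract_lsb(stego: bytes, n_bytes: int) -> bytes:
            bits = [b & 1 for b in stego[: n_bytes * 8]]
            return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k * 8:(k + 1) * 8]))
                         for k in range(n_bytes))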

  3. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  4. Validation of a single nucleotide polymorphism (SNP) typing assay with 49 SNPs for forensic genetic testing in a laboratory accredited according to the ISO 17025 standard

    DEFF Research Database (Denmark)

    Børsting, Claus; Rockenbauer, Eszter; Morling, Niels

    2009-01-01

    cases and 33 twin cases were typed at least twice for the 49 SNPs. All electropherograms were analysed independently by two expert analysts prior to approval. Based on these results, detailed guidelines for analysis of the SBE products were developed. With these guidelines, the peak height ratio...... of a heterozygous allele call or the signal to noise ratio of a homozygous allele call is compared with previously obtained ratios. A laboratory protocol for analysis of SBE products was developed where allele calls with unusual ratios were highlighted to facilitate the analysis of difficult allele calls......A multiplex assay with 49 autosomal single nucleotide polymorphisms (SNPs) developed for human identification was validated for forensic genetic casework and accredited according to the ISO 17025 standard. The multiplex assay was based on the SNPforID 52plex SNP assay [J.J. Sanchez, C. Phillips, C...
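
    In the spirit of the analysis guidelines described above, the sketch below flags allele calls whose peak-height or signal-to-noise ratios fall outside the range previously observed for the same SNP; the data layout and thresholds are placeholders for illustration, not the laboratory's actual acceptance criteria.

        # Flag allele calls with unusual ratios for expert review (illustrative only).
        def flag_unusual_calls(calls, reference_ranges):
            """calls: iterable of (snp_id, genotype, ratio), where ratio is the peak
            height ratio for heterozygotes or the signal-to-noise ratio for homozygotes.
            reference_ranges: dict snp_id -> (low, high) from previously approved runs."""
            flagged = []
            for snp_id, genotype, ratio in calls:
                low, high = reference_ranges.get(snp_id, (0.0, float("inf")))
                if not (low <= ratio <= high):
                    flagged.append((snp_id, genotype, ratio))  # needs manual review
            return flagged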

  5. Optimization of Saanen sperm genes amplification: evaluation of standardized protocols in genetically uncharacterized rural goats reared under a subtropical environment.

    Science.gov (United States)

    Barbour, Elie K; Saade, Maya F; Sleiman, Fawwak T; Hamadeh, Shady K; Mouneimne, Youssef; Kassaifi, Zeina; Kayali, Ghazi; Harakeh, Steve; Jaber, Lina S; Shaib, Houssam A

    2012-10-01

    The purpose of this research is to quantitatively optimize the amplification of specific sperm genes in a genomically characterized reference Saanen goat and to evaluate the applicability of the standardized protocols to sperm of genomically uncharacterized rural goats reared under a subtropical environment, for inclusion in future selection programs. The optimization of the protocols in Saanen sperm included three production genes (growth hormone (GH) exons 2, 3, and 4, αS1-casein (CSN1S1), and α-lactalbumin) and two health genes (MHC class II DRB and prion (PrP)). The optimization was based on varying the primer concentrations and the inclusion of a PCR cosolvent (Triton X). The impact of the studied variables on a statistically significant increase in the yield of amplicons was noticed in four out of five (80%) optimized protocols, namely in those related to the GH, CSN1S1, α-lactalbumin, and PrP genes (P < 0.05). The applicability of the optimized protocols for Saanen sperm genes to amplification of uncharacterized rural goat sperm revealed a 100% success rate in tested individuals for amplification of the GH, CSN1S1, α-lactalbumin, and MHC class II DRB genes and a 75% success rate for the PrP gene. The significant success in applicability of the quantitatively optimized Saanen protocols to the otherwise uncharacterized genome of rural goats allows for their inclusion in future selection, targeting the sustainability of this farming system in a subtropical environment and the improvement of the farmers' livelihood.

  6. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  7. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  8. 75 FR 20833 - Building Energy Codes

    Science.gov (United States)

    2010-04-21

    ...-0012] Building Energy Codes AGENCY: Office of Energy Efficiency and Renewable Energy, Department of... the current model building energy codes or their equivalent. DOE is interested in better understanding... codes, Standard 90.1-2007, Energy Standard for Buildings Except Low-Rise Residential Buildings (or...

  9. Use of a standardized JaCVAM in vivo rat comet assay protocol to assess the genotoxicity of three coded test compounds; ampicillin trihydrate, 1,2-dimethylhydrazine dihydrochloride, and N-nitrosodimethylamine.

    Science.gov (United States)

    McNamee, J P; Bellier, P V

    2015-07-01

    As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiated international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments of the coded test compounds were conducted for dose selection. Animal dosing schedules, the comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP was not found to exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH were observed to cause a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high dose group, in the investigators' opinion there was a sufficient number of consistently damaged/measurable cells in the medium and low dose groups to judge these compounds as genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic non-carcinogen). These data support the use of a standardized protocol for the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  10. Decoding the codes: A content analysis of the news coverage of genetic cloning by three online news sites and three national daily newspapers, 1996 through 1998

    Science.gov (United States)

    Hyde, Jon E.

    This study compared news coverage of genetic cloning research in three online news sites (CNN.com, ABC.com, and MSNBC.com) and three national daily newspapers (The New York Times, The Washington Post, and USA Today). The study involved the analysis of 230 online and print news articles concerning genetic cloning published from 1996 through 1998. Articles were examined with respect to formats, sources, focus, tone, and assessments about the impact of cloning research. Findings indicated that while print news formats remained relatively constant for the duration of this study, online news formats changed significantly with respect to the kinds of media used to represent the news, the layouts used to represent cloning news, and the emphasis placed on audio-visual content. Online stories were as much as 20 to 70% shorter than print stories. More than 50% of the articles appearing online were composed by outside sources (wire services, guest columnists, etc.). By comparison, nearly 90% of the articles published by print newspapers were written "in-house" by science reporters. Online news sites cited fewer sources and cited a smaller variety of sources than the newspapers examined here. In both news outlets, however, the sources most frequently cited were those with vested interests in furthering cloning research. Both online and print news coverage of cloning tends to focus principally on the technical procedures and on the future benefits of cloning. More than 60% of the articles focused on the techniques and technologies of cloning. Less than 25% of the articles focused on social, ethical, or legal issues associated with cloning. Similarly, articles from all six sources (75%) tended to be both positive and future-oriented. Less than 5% of the total articles examined here had a strongly negative or critical tone. Moreover, both online and print news sources increasingly conveyed a strong sense of acceptance about the possibility of human cloning. Data from this study

  11. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions which use repeaters to compensate for the loss in signal strength on transmission links also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding is often used to refer to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the

  12. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  13. Coding-Spreading Tradeoff in CDMA Systems

    National Research Council Canada - National Science Library

    Bolas, Eduardo

    2002-01-01

    .... Comparing different combinations of coding and spreading with a traditional DS-CDMA, as defined in the IS-95 standard, allows the criteria to be defined for the best coding-spreading tradeoff in CDMA systems...

  14. Genetic classes and genetic categories : Protecting genetic groups through data protection law

    NARCIS (Netherlands)

    Hallinan, Dara; de Hert, Paul; Taylor, L.; Floridi, L.; van der Sloot, B.

    2017-01-01

    Each person shares genetic code with others. Thus, one individual’s genome can reveal information about other individuals. When multiple individuals share aspects of genetic architecture, they form a ‘genetic group’. From a social and legal perspective, two types of genetic group exist: Those which

  15. A two warehouse deterministic inventory model for deteriorating items with a linear trend in time dependent demand over finite time horizon by Elitist Real-Coded Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    A.K. Bhunia

    2013-04-01

    Full Text Available This paper deals with a deterministic inventory model developed for deteriorating items having two separate storage facilities (owned and rented warehouses) due to the limited capacity of the existing storage (owned warehouse), with linear time-dependent demand (increasing) over a fixed finite time horizon. The model is formulated with infinite replenishment and the successive replenishment cycle lengths are in arithmetic progression. Partially backlogged shortages are allowed. The stocks of the rented warehouse (RW) are transported to the owned warehouse (OW) in a continuous release pattern. For this purpose, the model is formulated as a constrained non-linear mixed integer programming problem. For solving the problem, an advanced genetic algorithm (GA) has been developed. This advanced GA is based on ranking selection, elitism, whole arithmetic crossover and non-uniform mutation dependent on the age of the population. Our objective is to determine the optimal replenishment number and lot-sizes of the two warehouses (OW and RW) by maximizing the profit function. The model is illustrated with four numerical examples and sensitivity analyses of the optimal solution are performed with respect to different parameters.
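
    The two real-coded operators named above have standard textbook forms, sketched below; the paper's exact parameter settings (crossover weight, mutation shape parameter, bounds) are not reproduced and the values shown are placeholders.

        # Whole arithmetic crossover and Michalewicz-style non-uniform mutation.
        import random

        def whole_arithmetic_crossover(p1, p2, alpha=0.5):
            c1 = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
            c2 = [(1 - alpha) * a + alpha * b for a, b in zip(p1, p2)]
            return c1, c2

        def non_uniform_mutation(x, bounds, t, t_max, b=5.0):
            """Perturbations shrink as the generation counter t approaches t_max,
            so the search moves from exploration to fine-tuning with population age."""
            def delta(y):
                r = random.random()
                return y * (1 - r ** ((1 - t / t_max) ** b))
            child = []
            for xi, (lo, hi) in zip(x, bounds):
                if random.random() < 0.5:
                    child.append(xi + delta(hi - xi))
                else:
                    child.append(xi - delta(xi - lo))
            return child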

  16. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  17. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense queer code can...... be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  18. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases.

  19. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  20. Overview of North American Hydrogen Sensor Standards

    Energy Technology Data Exchange (ETDEWEB)

    O' Malley, Kathleen [SRA International, Inc., Colorado Springs, CO (United States); Lopez, Hugo [UL LLC, Chicago, IL (United States); Cairns, Julie [CSA Group, Cleveland, OH (United States); Wichert, Richard [Professional Engineering, Inc.. Citrus Heights, CA (United States); Rivkin, Carl [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Burgess, Robert [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Buttner, William [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-08-11

    An overview of the main North American codes and standards associated with hydrogen safety sensors is provided. The distinction between a code and a standard is defined, and the relationship between standards and codes is clarified, especially for those circumstances where a standard or a certification requirement is explicitly referenced within a code. The report identifies three main types of standards commonly applied to hydrogen sensors (interface and controls standards, shock and hazard standards, and performance-based standards). The certification process and a list and description of the main standards and model codes associated with the use of hydrogen safety sensors in hydrogen infrastructure are presented.

  1. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, Københavns Kommune, Vejle Kommune, Styrelsen for IT- og Læring (STIL) and the volunteer association...... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy, Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University, Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period November 2016 to May 2017...

  2. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm. The functions of the algorithm's FORTRAN subroutines and variables are outlined.

  3. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah, P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  4. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each state of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  5. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes – The Sipser–Spielman Construction. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1.

  6. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as for HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  7. Dealing with an Unconventional Genetic Code in Mitochondria: The Biogenesis and Pathogenic Defects of the 5‐Formylcytosine Modification in Mitochondrial tRNAMet

    Directory of Open Access Journals (Sweden)

    Lindsey Van Haute

    2017-03-01

    Full Text Available Human mitochondria contain their own genome, which uses an unconventional genetic code. In addition to the standard AUG methionine codon, the single mitochondrial tRNA Methionine (mt‐tRNAMet) also recognises AUA during translation initiation and elongation. Post‐transcriptional modifications of tRNAs are important for structure, stability, correct folding and aminoacylation as well as decoding. The unique 5‐formylcytosine (f5C) modification of position 34 in mt‐tRNAMet has long been postulated to be crucial for decoding of unconventional methionine codons and efficient mitochondrial translation. However, the enzymes responsible for the formation of mitochondrial f5C have been identified only recently. The first step of the f5C pathway consists of methylation of cytosine by NSUN3. This is followed by further oxidation by ABH1. Here, we review the role of f5C, the latest breakthroughs in our understanding of the biogenesis of this unique mitochondrial tRNA modification and its involvement in human disease.
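
    The codon reassignment discussed above can be summarised in a few lines: the human mitochondrial code reads AUA as methionine (isoleucine in the standard code), and also differs at UGA, AGA and AGG. The sketch below encodes only those differing entries for illustration; it is a partial lookup, not a full translation table.

        # Partial decoding table illustrating mitochondrial codon reassignments.
        MITO_DIFFERENCES = {"AUA": "Met", "UGA": "Trp", "AGA": "Stop", "AGG": "Stop"}
        STANDARD_PARTIAL = {"AUA": "Ile", "UGA": "Stop", "AGA": "Arg", "AGG": "Arg",
                            "AUG": "Met"}

        def decode(codon, mitochondrial=False):
            if mitochondrial and codon in MITO_DIFFERENCES:
                return MITO_DIFFERENCES[codon]
            return STANDARD_PARTIAL.get(codon, "?")  # "?" = codon not in this partial table

        print(decode("AUA"), decode("AUA", mitochondrial=True))  # Ile Met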

  8. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
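
    Self-complementarity as used above means the code is closed under reverse complementation: the reverse complement of every trinucleotide in X is also in X. The sketch below checks that property for an arbitrary trinucleotide set (circularity itself is a separate property and is not tested); the tiny example set is illustrative and is not the maximal code X identified in genes.

        # Check whether a trinucleotide code is self-complementary (DNA alphabet).
        COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

        def reverse_complement(trinucleotide: str) -> str:
            return "".join(COMPLEMENT[base] for base in reversed(trinucleotide))

        def is_self_complementary(code) -> bool:
            return all(reverse_complement(word) in code for word in code)

        print(is_self_complementary({"AAC", "GTT", "AAT", "ATT"}))  # True
        print(is_self_complementary({"AAC", "AAT"}))                # False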

  9. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism; it can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  10. The International Standards Organisation offshore structures standard

    International Nuclear Information System (INIS)

    Snell, R.O.

    1994-01-01

    The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site-specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and its format, content and work program.

  11. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
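
    To give a flavour of what a diffusion-theory criticality calculation computes, the sketch below (not PANDA itself) solves a one-group, one-dimensional slab diffusion eigenvalue problem by finite differences and power iteration; the cross sections and slab width are arbitrary illustrative values, and PANDA's two-group model, feedback effects and search facility are not reproduced.

      import numpy as np

      # Slab geometry and one-group constants (illustrative values only)
      width, n = 100.0, 200                      # slab width [cm], number of mesh cells
      h = width / n
      D, sig_a, nu_sig_f = 1.2, 0.012, 0.015     # diffusion coeff., absorption, nu * fission

      # Tridiagonal finite-difference loss operator with zero-flux boundaries
      A = np.zeros((n, n))
      for i in range(n):
          A[i, i] = 2.0 * D / h**2 + sig_a
          if i > 0:
              A[i, i - 1] = -D / h**2
          if i < n - 1:
              A[i, i + 1] = -D / h**2

      # Power (source) iteration for the fundamental flux mode and k-effective
      phi, k = np.ones(n), 1.0
      for _ in range(500):
          fission = nu_sig_f * phi
          phi = np.linalg.solve(A, fission / k)
          k *= np.sum(nu_sig_f * phi) / np.sum(fission)
          phi /= phi.max()                       # renormalise the flux each sweep
      print("k_eff ~", round(k, 5))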

  12. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron-free extraction channel that must provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts, and move within a restricted transversal area; terminal connectors may be added, and images of the bars in the pole pieces may be included. A special option optimizes a real set of circular coils. [fr]
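
    The basic computation behind such a channel optimization is evaluating the field produced by current bars in the median plane. The sketch below is not CANAL; it is a minimal Biot-Savart evaluation for one straight bar discretized into short segments, with hypothetical geometry and current values.

      import numpy as np

      MU0 = 4e-7 * np.pi                         # vacuum permeability [T m / A]

      def bar_field(start, end, current, points, n_seg=200):
          """Field [T] of a straight bar carrying `current` [A], summed over short segments."""
          start, end, points = map(np.asarray, (start, end, points))
          ts = np.linspace(0.0, 1.0, n_seg + 1)
          mids = start + np.outer((ts[:-1] + ts[1:]) / 2.0, end - start)  # segment midpoints
          dl = (end - start) / n_seg                                      # segment vector
          B = np.zeros_like(points, dtype=float)
          for idx, p in enumerate(points):
              r = p - mids
              r_norm = np.linalg.norm(r, axis=1, keepdims=True)
              B[idx] = MU0 * current / (4.0 * np.pi) * np.sum(np.cross(dl, r) / r_norm**3, axis=0)
          return B

      # Field sampled along the median plane near a 1 m bar carrying 1 kA (hypothetical values)
      pts = np.array([[0.0, 0.05, z] for z in np.linspace(-0.5, 0.5, 5)])
      print(bar_field([-0.5, 0.0, 0.0], [0.5, 0.0, 0.0], 1.0e3, pts))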

  13. Genetic algorithms used for PWRs refuel management automatic optimization: a new modelling

    International Nuclear Information System (INIS)

    Chapot, Jorge Luiz C.; Schirru, Roberto; Silva, Fernando Carvalho da

    1996-01-01

    A Genetic Algorithms-based system, linking the computer codes GENESIS 5.0 and ANC through the interface ALGER, has been developed for PWR fuel management optimization. An innovative codification, the Lists Model, has been incorporated into the genetic system; it avoids the use of variants of the standard crossover operator and generates only valid loading patterns in the core. The GENESIS/ALGER/ANC system has been successfully tested in an optimization study for the second cycle of Angra-1. (author)
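
    The paper's Lists Model itself is not reproduced here, but the general idea of an encoding whose recombination can only yield valid loading patterns can be illustrated with a permutation chromosome and order crossover (OX), a standard operator that keeps every fuel assembly appearing exactly once; the assembly indices and problem size below are hypothetical.

      import random

      def order_crossover(parent_a, parent_b):
          """Recombine two permutations so the child is again a valid permutation."""
          n = len(parent_a)
          i, j = sorted(random.sample(range(n), 2))
          child = [None] * n
          child[i:j] = parent_a[i:j]                       # inherit a slice from parent A
          fill = [g for g in parent_b if g not in child]   # remaining genes, in parent B's order
          for k in range(n):
              if child[k] is None:
                  child[k] = fill.pop(0)
          return child

      # Eight fuel assemblies identified by index; any child is a valid loading pattern.
      assemblies = list(range(8))
      p1 = random.sample(assemblies, len(assemblies))
      p2 = random.sample(assemblies, len(assemblies))
      print(order_crossover(p1, p2))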

  14. Influence of genetic polymorphisms on the effect of high- and standard-dose clopidogrel after percutaneous coronary intervention: the GIFT (Genotype Information and Functional Testing) study.

    Science.gov (United States)

    Price, Matthew J; Murray, Sarah S; Angiolillo, Dominick J; Lillie, Elizabeth; Smith, Erin N; Tisch, Rebecca L; Schork, Nicholas J; Teirstein, Paul S; Topol, Eric J

    2012-05-29

    This study sought to evaluate the influence of single nucleotide polymorphisms (SNPs) on the pharmacodynamic effect of high- or standard-dose clopidogrel after percutaneous coronary intervention (PCI). There is a lack of prospective, multicenter data regarding the effect of different genetic variants on clopidogrel pharmacodynamics over time in patients undergoing PCI. The GRAVITAS (Gauging Responsiveness with A VerifyNow assay-Impact on Thrombosis And Safety) trial screened patients with platelet function testing after PCI and randomly assigned those with high on-treatment reactivity (OTR) to either high- or standard-dose clopidogrel; a cohort of patients without high OTR was also followed. DNA samples obtained from 1,028 patients were genotyped for 41 SNPs in 17 genes related to platelet reactivity. After adjusting for clinical characteristics, the associations between the SNPs and OTR were evaluated using linear regression. CYP2C19*2 was significantly associated with OTR at 12 to 24 h (R² = 0.07, p = 2.2 × 10⁻¹⁵), 30 days (R² = 0.10, p = 1.3 × 10⁻⁷), and 6 months after PCI (R² = 0.07, p = 1.9 × 10⁻¹¹), whereas PON1, ABCB1 3435 C→T, and other candidate SNPs were not. Carriers of 1 and 2 reduced-function CYP2C19 alleles were significantly more likely to display persistently high OTR at 30 days and 6 months, irrespective of treatment assignment. The portion of the risk of persistently high OTR at 30 days attributable to reduced-function CYP2C19 allele carriage was 5.2% in the patients randomly assigned to high-dose clopidogrel. CYP2C19, but not PON1 or ABCB1, is a significant determinant of the pharmacodynamic effects of clopidogrel, both early and late after PCI. In patients with high OTR identified by platelet function testing, the CYP2C19 genotype provides limited incremental information regarding the risk of persistently high reactivity with clopidogrel 150-mg maintenance dosing. (Genotype Information and Functional Testing Study [GIFT]; NCT
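
    As a rough illustration of the kind of covariate-adjusted linear regression used to test SNP-reactivity associations (this is not the GIFT analysis code; all variable names and data below are simulated placeholders):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n = 500
      df = pd.DataFrame({
          "otr": rng.normal(230.0, 60.0, n),            # simulated on-treatment reactivity
          "cyp2c19_alleles": rng.integers(0, 3, n),     # 0, 1 or 2 reduced-function alleles
          "age": rng.normal(65.0, 10.0, n),
          "diabetes": rng.integers(0, 2, n),
      })
      # Association of the SNP with OTR, adjusted for clinical covariates
      fit = smf.ols("otr ~ cyp2c19_alleles + age + diabetes", data=df).fit()
      print(fit.rsquared, fit.pvalues["cyp2c19_alleles"])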

  15. Video Coding Technique using MPEG Compression Standards

    African Journals Online (AJOL)

    Akorede

    The two-dimensional discrete cosine transform (2-D DCT) is an integral part of video and image compression ... solution for the optimum trade-off by applying rate-distortion theory has been ...
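
    Since the snippet above is fragmentary, a minimal sketch of the 8×8 block 2-D DCT with uniform quantization, the transform step at the heart of MPEG-style compression, may help; it is not the cited paper's code, and the quantizer step size is an arbitrary illustrative value.

      import numpy as np
      from scipy.fftpack import dct, idct

      def dct2(block):
          return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

      def idct2(coeffs):
          return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

      rng = np.random.default_rng(0)
      block = rng.integers(0, 256, size=(8, 8)).astype(float)   # one 8x8 pixel block
      q = 16.0                                                  # uniform quantizer step (illustrative)
      coeffs = np.round(dct2(block) / q)                        # transform + quantize (the lossy step)
      restored = idct2(coeffs * q)                              # dequantize + inverse transform
      print(np.abs(block - restored).max())                     # distortion introduced by quantization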

  16. NRL Radar Division C++ Coding Standard

    Science.gov (United States)

    2016-12-05

    ... files, sockets) the generated member functions probably have the wrong behavior and must be implemented. You have to decide if the resources pointed to ... much more difficult to maintain, you should avoid it. Source: CERN. CA10: Do not use asm (the assembler macro facility of C++). Source: CERN.

  17. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy General Design Criteria (DOE 6430.1A) and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well-defined approach to determining the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of load combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and the overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify a unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables, and suggests a reference path consistent with the US Department of Energy guidelines.

  18. Consent Codes: Upholding Standard Data Use Conditions.

    Directory of Open Access Journals (Sweden)

    Stephanie O M Dyke

    2016-01-01

    Full Text Available A systematic way of recording data use conditions that are based on consent permissions as found in the datasets of the main public genome archives (NCBI dbGaP and EMBL-EBI/CRG EGA).

  19. Use of Contemporary Genetics in Cardiovascular Diagnosis

    Science.gov (United States)

    George, Alfred L.

    2015-01-01

    An explosion of knowledge regarding the genetic and genomic basis for rare and common diseases has provided a framework for revolutionizing the practice of medicine. Achieving the reality of a genomic medicine era requires that basic discoveries are effectively translated into clinical practice through implementation of genetic and genomic testing. Clinical genetic tests have become routine for many inherited disorders and can be regarded as the standard-of-care in many circumstances including disorders affecting the cardiovascular system. New, high-throughput methods for determining the DNA sequence of all coding exons or complete genomes are being adopted for clinical use to expand the speed and breadth of genetic testing. Along with these extraordinary advances have emerged new challenges to practicing physicians for understanding when and how to use genetic testing along with how to appropriately interpret test results. This review will acquaint readers with general principles of genetic testing including newer technologies, test interpretation and pitfalls. The focus will be on testing genes responsible for monogenic disorders and on other emerging applications such as pharmacogenomic profiling. The discussion will be extended to the new paradigm of direct-to-consumer genetic testing and the value of assessing genomic risk for common diseases. PMID:25421045

  20. The Coding Causes of Death in HIV (CoDe) Project: initial results and evaluation of methodology

    DEFF Research Database (Denmark)

    Kowalska, Justyna D; Friis-Møller, Nina; Kirk, Ole

    2011-01-01

    The Coding Causes of Death in HIV (CoDe) Project aims to deliver a standardized method for coding the underlying cause of death in HIV-positive persons, suitable for clinical trials and epidemiologic studies.