WorldWideScience

Sample records for specificity determines coding

  1. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, needs standards. These standards are widely used and the methods of applying them are well established, so radiography testing is practised only on the basis of documented regulations and guidelines, which are set out in codes, standards and specifications. In Malaysia, level-one and basic radiographers carry out radiography work based on instructions given by level-two or level-three radiographers. These instructions are produced from the guidelines given in the documents, and a level-two radiographer must follow the specifications in the relevant standard when writing them. This scenario makes clear that radiography is a type of work in which everything must follow the rules. For the code, radiography follows the code of the American Society of Mechanical Engineers (ASME); the only code currently in force in Malaysia is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography work must automatically follow the regulated rules and standards.

  2. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.

  3. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to automatically generate C code from a Simulink-based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. This generated code can be used to carry out hardware-in-the-loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  4. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data is required to satisfy 22 common performance assessment codes. Each of the codes is summarized and a matrix table is provided to allow comparison of the various input required by the codes. This report does not determine or recommend which codes are preferable

  5. Identification of the two rotavirus genes determining neutralization specificities

    International Nuclear Information System (INIS)

    Offit, P.A.; Blavat, G.

    1986-01-01

    Bovine rotavirus NCDV and simian rotavirus SA-11 represent two distinct rotavirus serotypes. A genetic approach was used to determine which viral gene segments segregated with serotype-specific viral neutralization. Sixteen reassortant rotaviruses were derived by coinfection of MA-104 cells in vitro with the SA-11 and NCDV strains. The parental origin of reassortant rotavirus double-stranded RNA segments was determined by gene segment mobility in polyacrylamide gels and by hybridization with radioactively labeled parental viral transcripts. The authors found that two rotavirus gene segments previously found to code for the outer capsid proteins vp3 and vp7 cosegregated with virus neutralization specificities

  6. Identification of the two rotavirus genes determining neutralization specificities

    Energy Technology Data Exchange (ETDEWEB)

    Offit, P.A.; Blavat, G.

    1986-01-01

    Bovine rotavirus NCDV and simian rotavirus SA-11 represent two distinct rotavirus serotypes. A genetic approach was used to determine which viral gene segments segregated with serotype-specific viral neutralization. Sixteen reassortant rotaviruses were derived by coinfection of MA-104 cells in vitro with the SA-11 and NCDV strains. The parental origin of reassortant rotavirus double-stranded RNA segments was determined by gene segment mobility in polyacrylamide gels and by hybridization with radioactively labeled parental viral transcripts. The authors found that two rotavirus gene segments previously found to code for the outer capsid proteins vp3 and vp7 cosegregated with virus neutralization specificities.

  7. Mother code specifications (Appendix to CEA report 2472)

    International Nuclear Information System (INIS)

    Pillard, Denise; Soule, Jean-Louis

    1964-12-01

    The Mother code (written in Fortran for IBM 7094) computes the integral cross section and the first two moments of energy transfer of a thermalizer. The organisation of the computation and the methods used are presented in another document. This document presents the code specifications, i.e. the input data (spectrum description, printing options, input record formats, conditions to be met by the values) and the results (printing formats and options, writing and punching options and formats)

  8. An ancient neurotrophin receptor code; a single Runx/Cbfβ complex determines somatosensory neuron fate specification in zebrafish.

    Science.gov (United States)

    Gau, Philia; Curtright, Andrew; Condon, Logan; Raible, David W; Dhaka, Ajay

    2017-07-01

    In terrestrial vertebrates such as birds and mammals, neurotrophin receptor expression is considered fundamental for the specification of distinct somatosensory neuron types, where TrkA, TrkB and TrkC specify nociceptors, mechanoceptors and proprioceptors/mechanoceptors, respectively. In turn, Runx transcription factors promote neuronal fate specification by regulating neurotrophin receptor and sensory receptor expression, where Runx1 mediates TrkA+ nociceptor diversification while Runx3 promotes a TrkC+ proprioceptive/mechanoceptive fate. Here, we report that in zebrafish larvae, in contrast to terrestrial vertebrates, orthologs of the neurotrophin receptors mark overlapping and distinct subsets of nociceptors, suggesting that TrkA, TrkB and TrkC do not intrinsically promote nociceptor, mechanoceptor and proprioceptor/mechanoceptor neuronal fates, respectively. Although zebrafish Runx3 regulates nociceptors, in contrast to terrestrial vertebrates, it shares a conserved regulatory mechanism found in terrestrial vertebrate proprioceptors/mechanoceptors, in which it promotes TrkC expression and suppresses TrkB expression. We find that Cbfβ, which enhances Runx protein stability and affinity for DNA, serves as an obligate cofactor for Runx in neuronal fate determination. High levels of Runx can compensate for the loss of Cbfβ, indicating that in this context Cbfβ serves solely as a signal amplifier of Runx activity. Our data suggest an alteration/expansion of the neurotrophin receptor code of sensory neurons between larval teleost fish and terrestrial vertebrates, whereas the essential roles of Runx/Cbfβ in sensory neuron cell fate determination, although also expanded, are conserved.

  9. Comparison of three gamma ray isotopic determination codes: FRAM, MGA, and TRIFID

    International Nuclear Information System (INIS)

    Cremers, T.L.; Malcom, J.E.; Bonner, C.A.

    1994-01-01

    The determination of the isotopic distribution of plutonium and the americium concentration is required for the assay of nuclear material by calorimetry or neutron coincidence counting. The isotopic information is used in calorimetric assay to compute the effective specific power from the measured isotopic fractions and the known specific power of each isotope. The effective specific power is combined with the heat measurement to obtain the mass of plutonium in the assayed nuclear material. The response of neutron coincidence counters is determined by the 240Pu isotopic fraction with contributions from the other even plutonium isotopes. The effects of the 240Pu isotopic fraction and the other neutron-contributing isotopes are combined as 240Pu-effective, which is used to calculate the mass of nuclear material from the neutron counting data in a manner analogous to the effective specific power in calorimetry. Comparisons of the precision and accuracy of calorimetric assay and neutron coincidence counting often focus only on the precision and accuracy of the heat measurement (calorimetry) compared to the precision and accuracy of the neutron coincidence counting statistics. The major source of uncertainty for both calorimetric assay and neutron coincidence counting often lies in the determination of the plutonium isotopic distribution as determined by gamma ray spectroscopy. Thus, the selection of the appropriate isotopic distribution code is of paramount importance to good calorimetric assay and neutron coincidence counting. Three gamma ray isotopic distribution codes, FRAM, MGA, and TRIFID, have been compared at the Los Alamos Plutonium Facility under carefully controlled conditions of similar count rates, count times, and 240Pu isotopic fraction
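
    As an illustration of the arithmetic summarized in this record, the sketch below computes an effective specific power from isotopic fractions and divides a measured heat by it to obtain a plutonium mass. It is a minimal sketch only: the isotopic fractions are hypothetical and the specific-power constants are nominal literature values, not data from the record.

```python
# Minimal sketch of the calorimetric assay arithmetic described above.
# Isotopic fractions are hypothetical; specific powers (W/g) are nominal
# values and should be taken from the applicable standard in practice.
SPECIFIC_POWER = {          # watts per gram of isotope
    "Pu238": 0.56757,
    "Pu239": 0.0019288,
    "Pu240": 0.0070824,
    "Pu241": 0.0032351,
    "Pu242": 0.0001159,
    "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """P_eff = sum_i f_i * P_i, with f_i the measured isotopic mass fractions."""
    return sum(f * SPECIFIC_POWER[iso] for iso, f in mass_fractions.items())

fractions = {"Pu238": 0.0001, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.0035, "Pu242": 0.0004, "Am241": 0.0005}  # hypothetical
p_eff = effective_specific_power(fractions)   # W/g
measured_heat = 2.5                           # W, from the calorimeter
print(f"P_eff = {p_eff:.5f} W/g, plutonium mass = {measured_heat / p_eff:.1f} g")
```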

  10. High efficiency video coding: coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it also serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  11. FDA Developments: Food Code 2013 and Proposed Trans Fat Determination

    NARCIS (Netherlands)

    Grossman, M.R.

    2014-01-01

    I. Food Code 2013 and Food Code Reference System. Since 1993, the US Food and Drug Administration has published a Food Code, now updated every four years. In November 2013, the

  12. Determination of national midwifery ethical values and ethical codes: in Turkey.

    Science.gov (United States)

    Ergin, Ayla; Özcan, Müesser; Acar, Zeynep; Ersoy, Nermin; Karahan, Nazan

    2013-11-01

    It is important to define and practice ethical rules and codes for professionalisation. Several national and international associations have determined midwifery ethical codes. In Turkey, ethical rules and codes that would facilitate midwifery becoming professionalised have not yet been determined. This study was planned to contribute to the professionalisation of midwifery by determining national ethical values and codes. A total of 1067 Turkish midwives completed the survey. The most prevalent values of Turkish midwives were care for mother-child health, responsibility and professional adequacy. The preferred professional codes chosen by Turkish midwives were absence of conflicts of interest, respect for privacy, avoidance of deception, reporting of faulty practices, consideration of mothers and newborns as separate beings and prevention of harm. In conclusion, cultural values, beliefs and expectations of society cannot be underestimated, although the international professional values and codes of ethics contribute significantly to professionalisation of the midwifery profession.

  13. Statistical methods for accurately determining criticality code bias

    International Nuclear Information System (INIS)

    Trumble, E.F.; Kimball, K.D.

    1997-01-01

    A system of statistically treating validation calculations for the purpose of determining computer code bias is provided in this paper. The following statistical treatments are described: weighted regression analysis, lower tolerance limit, lower tolerance band, and lower confidence band. These methods meet the criticality code validation requirements of ANS 8.1. 8 refs., 5 figs., 4 tabs
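
    The lower tolerance limit named in this record can be illustrated with a short calculation on a set of benchmark k-eff results. The sketch below is an assumption-laden example: the k-eff values are invented and the tolerance factor K would normally be read from statistical tables for the chosen sample size and confidence level.

```python
import statistics

# Hedged sketch of a single-sided lower tolerance limit on benchmark k-eff
# results, one of the statistical treatments listed above.  The k-eff values
# are invented; K is the approximate 95%/95% one-sided normal tolerance
# factor for n = 10 and should be taken from tables in a real validation.
keff = [0.9962, 0.9987, 1.0004, 0.9978, 0.9991, 0.9969, 0.9983,
        0.9995, 0.9972, 0.9988]
mean = statistics.mean(keff)
s = statistics.stdev(keff)
K = 2.911
bias = mean - 1.0
lower_tolerance_limit = mean - K * s
print(f"bias = {bias:+.4f}, lower tolerance limit = {lower_tolerance_limit:.4f}")
```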

  14. Interoperable domain-specific languages families for code generation

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Plášil, F.; Bureš, Tomáš; Hnětynka, P.

    2013-01-01

    Roč. 43, č. 5 (2013), s. 479-499 ISSN 0038-0644 R&D Projects: GA ČR GD201/09/H057 EU Projects: European Commission(XE) ASCENS 257414 Grant - others:GA AV ČR(CZ) GAP103/11/1489 Program:FP7 Institutional research plan: CEZ:AV0Z10300504 Keywords : code generation * domain specific languages * models reuse * extensible languages * specification * program synthesis Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.148, year: 2013

  15. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)]

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis to compare with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic analysis framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  16. Design specifications for ASME B and PV Code Section III nuclear class 1 piping

    International Nuclear Information System (INIS)

    Richardson, J.A.

    1978-01-01

    ASME B and PV Code Section III regulations for nuclear piping require that a comprehensive Design Specification be developed to ensure that the design and installation of the piping meet all code requirements. The intent of this paper is to describe the code requirements, discuss the implementation of these requirements in a typical Class 1 piping design specification, and report on recent piping failures in operating light water nuclear power plants in the US. (author)

  17. Building codes: An often overlooked determinant of health.

    Science.gov (United States)

    Chauvin, James; Pauls, Jake; Strobl, Linda

    2016-05-01

    Although the vast majority of the world's population spends most of their time in buildings, building codes are not often thought of as 'determinants of health'. The standards that govern the design, construction, and use of buildings affect our health, security, safety, and well-being. This is true for dwellings, schools, and universities, shopping centers, places of recreation, places of worship, health-care facilities, and workplaces. We urge proactive engagement by the global public health community in developing these codes, and in the design and implementation of health protection and health promotion activities intended to reduce the risk of injury, disability, and death, particularly when due to poor building code adoption/adaption, application, and enforcement.

  18. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, which should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group, and feedback from the collaboration was taken into account in the "current" version.

  19. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  20. Determining the appropriate code in a South African business ...

    African Journals Online (AJOL)

    Determining the appropriate code in a South African business environment. ... Southern African Linguistics and Applied Language Studies ... would be perceived to enhance the quality of the interaction between client and service provider.

  1. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is

  2. The computer code SEURBNUK/EURDYN (Release 1). Input and output specification

    International Nuclear Information System (INIS)

    Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK/EURDYN is an extension of SEURBNUK-2, a two-dimensional, axisymmetric, Eulerian, finite difference containment code, in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. These codes are designed to model the hydrodynamic development in time of a hypothetical core disruptive accident (HCDA) in a fast breeder reactor. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations, with information on output facilities and advice to help users avoid some common difficulties. (UK)

  3. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Moses, Alan M.; Zhang, Zhaolei

    2015-01-01

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  4. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun

    2015-11-02

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  5. Product code optimization for determinate state LDPC decoding in robust image transmission.

    Science.gov (United States)

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2006-08-01

    We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.

  6. A human-specific de novo protein-coding gene associated with human brain functions.

    Directory of Open Access Journals (Sweden)

    Chuan-Yun Li

    2010-03-01

    To understand whether any human-specific new genes may be associated with human brain functions, we computationally screened the genetic vulnerability factors identified through Genome-Wide Association Studies and linkage analyses of nicotine addiction and found one human-specific de novo protein-coding gene, FLJ33706 (alternative gene symbol C20orf203). Cross-species analysis revealed interesting evolutionary paths of how this gene had originated from noncoding DNA sequences: insertion of repeat elements, especially Alu, contributed to the formation of the first coding exon and six standard splice junctions on the branch leading to humans and chimpanzees, and two subsequent substitutions in the human lineage escaped two stop codons and created an open reading frame of 194 amino acids. We experimentally verified FLJ33706's mRNA and protein expression in the brain. Real-Time PCR in multiple tissues demonstrated that FLJ33706 was most abundantly expressed in brain. Human polymorphism data suggested that FLJ33706 encodes a protein under purifying selection. A specifically designed antibody detected its protein expression across human cortex, cerebellum and midbrain. An immunohistochemistry study in normal human brain cortex revealed the localization of FLJ33706 protein in neurons. Elevated expression of FLJ33706 was detected in Alzheimer's brain samples, suggesting a role for this novel gene in the human-specific pathogenesis of Alzheimer's disease. FLJ33706 provided the strongest evidence so far that human-specific de novo genes can have protein-coding potential and differential protein expression, and be involved in human brain functions.

  7. Disease-Specific Trends of Comorbidity Coding and Implications for Risk Adjustment in Hospital Administrative Data.

    Science.gov (United States)

    Nimptsch, Ulrike

    2016-06-01

    To investigate changes in comorbidity coding after the introduction of diagnosis related group (DRG) based prospective payment and whether trends differ regarding specific comorbidities. Nationwide administrative data (DRG statistics) from German acute care hospitals from 2005 to 2012. Observational study to analyze trends in comorbidity coding in patients hospitalized for common primary diseases and the effects on comorbidity-related risk of in-hospital death. Comorbidity coding was operationalized by Elixhauser diagnosis groups. The analyses focused on adult patients hospitalized for the primary diseases of heart failure, stroke, and pneumonia, as well as hip fracture. When focusing on the total frequency of diagnosis groups per record, an increase in depth of coding was observed. Between-hospital variations in depth of coding were present throughout the observation period. Specific comorbidity increases were observed in 15 of the 31 diagnosis groups, and decreases were observed for 11 groups. In patients hospitalized for heart failure, shifts of comorbidity-related risk of in-hospital death occurred in nine diagnosis groups, of which eight were directed toward the null. Comorbidity-adjusted outcomes in longitudinal administrative data analyses may be biased by nonconstant risk over time, changes in completeness of coding, and between-hospital variations in coding. Accounting for such issues is important when the respective observation period coincides with changes in the reimbursement system or other conditions that are likely to alter clinical coding practice. © Health Research and Educational Trust.

  8. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.

  9. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  10. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system

  11. Evidence for gene-specific rather than transcription rate-dependent histone H3 exchange in yeast coding regions.

    Science.gov (United States)

    Gat-Viks, Irit; Vingron, Martin

    2009-02-01

    In eukaryotic organisms, histones are dynamically exchanged independently of DNA replication. Recent reports show that different coding regions differ in their amount of replication-independent histone H3 exchange. The current paradigm is that this histone exchange variability among coding regions is a consequence of transcription rate. Here we put forward the idea that this variability might be also modulated in a gene-specific manner independently of transcription rate. To that end, we study transcription rate-independent replication-independent coding region histone H3 exchange. We term such events relative exchange. Our genome-wide analysis shows conclusively that in yeast, relative exchange is a novel consistent feature of coding regions. Outside of replication, each coding region has a characteristic pattern of histone H3 exchange that is either higher or lower than what was expected by its RNAPII transcription rate alone. Histone H3 exchange in coding regions might be a way to add or remove certain histone modifications that are important for transcription elongation. Therefore, our results that gene-specific coding region histone H3 exchange is decoupled from transcription rate might hint at a new epigenetic mechanism of transcription regulation.

  12. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    Science.gov (United States)

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
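
    The code-theoretic selection criterion quoted in this record, minimizing the ratio of the maximum to the minimum pairwise Hamming distance, is easy to compute for a candidate code. The snippet below is a toy illustration with an invented length-6, weight-3 constant-weight code, not one of the codes analyzed in the paper.

```python
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length binary strings."""
    return sum(x != y for x, y in zip(a, b))

def distance_ratio(codewords):
    """Max/min pairwise Hamming distance; the criterion above favours
    codes that minimise this ratio (1.0 is the best possible value)."""
    dists = [hamming(a, b) for a, b in combinations(codewords, 2)]
    return max(dists) / min(dists)

# Invented constant-weight code: length 6, weight 3, four codewords.
code = ["111000", "100110", "010101", "001011"]
print(distance_ratio(code))   # -> 1.0, every pair is at distance 4
```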

  13. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in the modern society and both individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.

  14. The PP1 binding code: a molecular-lego strategy that governs specificity.

    Science.gov (United States)

    Heroes, Ewald; Lesage, Bart; Görnemann, Janina; Beullens, Monique; Van Meervelt, Luc; Bollen, Mathieu

    2013-01-01

    Ser/Thr protein phosphatase 1 (PP1) is a single-domain hub protein with nearly 200 validated interactors in vertebrates. PP1-interacting proteins (PIPs) are ubiquitously expressed but show an exceptional diversity in brain, testis and white blood cells. The binding of PIPs is mainly mediated by short motifs that dock to surface grooves of PP1. Although PIPs often contain variants of the same PP1 binding motifs, they differ in the number and combination of docking sites. This molecular-lego strategy for binding to PP1 creates holoenzymes with unique properties. The PP1 binding code can be described as specific, universal, degenerate, nonexclusive and dynamic. PIPs control associated PP1 by interference with substrate recruitment or access to the active site. In addition, some PIPs have a subcellular targeting domain that promotes dephosphorylation by increasing the local concentration of PP1. The diversity of the PP1 interactome and the properties of the PP1 binding code account for the exquisite specificity of PP1 in vivo. © 2012 The Authors Journal compilation © 2012 FEBS.

  15. [How can the coding quality in the DRG system be determined?].

    Science.gov (United States)

    Kahlmeyer, A; Volkmer, B

    2014-01-01

    The permanent adjustments since 2003 to the G-DRG system have made the system even less understandable, so that many users have the feeling of feeding data into a black box which gives them a result without them being able to actively use the system itself. While chief physicians, senior physicians, and nursing managers are responsible to management for the results of the billing, they are in most cases not involved in the steps of DRG coding and billing. From this situation, a common question arises: "How well does my department code?" This uncertainty is exploited by many commercial vendors, who offer a wide variety of approaches for DRG optimization. The goal of this work is to provide advice as to how coding quality can be determined.

  16. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, is developed to calculate the fusion reaction rates in terms of the specific reactivity ⟨σv⟩, which is the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant in cases of plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide the interpolated values of cross sections
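
    For orientation, the reactivity integral that such a module evaluates can be written for a Maxwellian plasma as ⟨σv⟩ = (8/πμ)^(1/2) (kT)^(-3/2) ∫ σ(E) E exp(−E/kT) dE. The sketch below evaluates this with a simple trapezoidal quadrature (one of the quadrature options the record mentions); the pointwise cross-section table is entirely invented and does not represent STEEP4 input.

```python
import numpy as np

def maxwellian_sigma_v(energy_keV, sigma_barn, kT_keV, reduced_mass_amu):
    """<sigma v> (m^3/s) for a Maxwellian plasma from pointwise sigma(E)."""
    amu, keV = 1.66054e-27, 1.60218e-16          # kg, J
    mu = reduced_mass_amu * amu
    E, kT = energy_keV * keV, kT_keV * keV
    sigma = sigma_barn * 1e-28                   # m^2
    integrand = sigma * E * np.exp(-E / kT)
    # Trapezoidal quadrature over the pointwise grid.
    integral = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E)))
    return np.sqrt(8.0 / (np.pi * mu)) * kT ** -1.5 * integral

# Invented pointwise cross-section grid (keV, barn) with a made-up resonance.
E_grid = np.linspace(1.0, 500.0, 400)
sigma_grid = 5.0 * np.exp(-((E_grid - 64.0) / 40.0) ** 2)
print(maxwellian_sigma_v(E_grid, sigma_grid, kT_keV=10.0, reduced_mass_amu=1.2))
```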

  17. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used which is developed independently of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  18. Correlated sampling added to the specific purpose Monte Carlo code McPNL for neutron lifetime log responses

    International Nuclear Information System (INIS)

    Mickael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    The specific purpose neutron lifetime oil well logging simulation code, McPNL, has been rewritten for greater user-friendliness and faster execution. Correlated sampling has been added to the code to enable studies of relative changes in the tool response caused by environmental changes. The absolute responses calculated by the code have been benchmarked against laboratory test pit data. The relative responses from correlated sampling are not directly benchmarked, but they are validated using experimental and theoretical results
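
    The variance-reduction idea behind correlated sampling can be shown with a toy model: score the base case and the perturbed case on the same random histories, so that the statistical noise largely cancels in the estimated relative change. The transport "model" below is purely invented and has nothing to do with the McPNL physics.

```python
import random

def survives(uniforms, absorption):
    """Toy score: a particle survives 10 collisions if no uniform random
    number falls below the absorption probability."""
    return all(u >= absorption for u in uniforms)

# Draw the random histories once and reuse them for both cases (correlated).
rng = random.Random(12345)
histories = [[rng.random() for _ in range(10)] for _ in range(20_000)]

base = sum(survives(h, 0.050) for h in histories) / len(histories)
perturbed = sum(survives(h, 0.055) for h in histories) / len(histories)
print("correlated estimate of the relative change:", (perturbed - base) / base)
```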

  19. Determination of Solution Accuracy of Numerical Schemes as Part of Code and Calculation Verification

    Energy Technology Data Exchange (ETDEWEB)

    Blottner, F.G.; Lopez, A.R.

    1998-10-01

    This investigation is concerned with the accuracy of numerical schemes for solving partial differential equations used in science and engineering simulation codes. Richardson extrapolation methods for steady and unsteady problems with structured meshes are presented as part of the verification procedure to determine code and calculation accuracy. The local truncation error determination of a numerical difference scheme is shown to be a significant component of the verification procedure as it determines the consistency of the numerical scheme, the order of the numerical scheme, and the restrictions on the mesh variation with a non-uniform mesh. Generation of a series of co-located, refined meshes with the appropriate variation of mesh cell size is investigated and is another important component of the verification procedure. The importance of mesh refinement studies is shown to be more significant than just a procedure to determine solution accuracy. It is suggested that mesh refinement techniques can be developed to determine consistency of numerical schemes and to determine if governing equations are well posed. The present investigation provides further insight into the conditions and procedures required to effectively use Richardson extrapolation with mesh refinement studies to achieve confidence that simulation codes are producing accurate numerical solutions.
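
    The Richardson-extrapolation step referred to above reduces, for three systematically refined meshes with refinement ratio r, to an observed order p = ln[(f3 − f2)/(f2 − f1)]/ln r and an extrapolated estimate f1 + (f1 − f2)/(r^p − 1). A minimal sketch with invented grid solutions:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from fine (f1), medium (f2), coarse (f3)."""
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def richardson_extrapolate(f1, f2, r, p):
    """Estimate of the zero-mesh-spacing solution from the two finest grids."""
    return f1 + (f1 - f2) / (r ** p - 1.0)

# Invented grid solutions (not from the report), refinement ratio r = 2.
f1, f2, f3, r = 0.97050, 0.96854, 0.96178, 2.0
p = observed_order(f1, f2, f3, r)
extrapolated = richardson_extrapolate(f1, f2, r, p)
print(f"observed order = {p:.2f}, extrapolated value = {extrapolated:.5f}")
```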

  20. Administrative database code accuracy did not vary notably with changes in disease prevalence.

    Science.gov (United States)

    van Walraven, Carl; English, Shane; Austin, Peter C

    2016-11-01

    Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly by disease prevalence. This study determined whether the accuracy of diagnostic codes varied by disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies having varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably within clinically sensible disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases. Copyright © 2016 Elsevier Inc. All rights reserved.
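
    The distinction drawn in this record is easy to reproduce: sensitivity and specificity are properties of the code itself, while the predictive values depend on prevalence through Bayes' rule. The numbers below are hypothetical, not values from the study.

```python
def predictive_values(sens, spec, prevalence):
    """Positive and negative predictive value from fixed sensitivity,
    specificity and disease prevalence (Bayes' rule)."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return ppv, npv

for prev in (0.01, 0.05, 0.20):
    ppv, npv = predictive_values(sens=0.80, spec=0.95, prevalence=prev)
    print(f"prevalence={prev:.2f}  PPV={ppv:.2f}  NPV={npv:.2f}")
```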

  1. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which gamma-ray yields are determined. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from the decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given
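
    The data-reduction step this record describes amounts to correcting a fitted net peak area for detector efficiency and gamma emission probability before comparing it with a nuclide library. The miniature example below uses an invented peak area, efficiency and two-line "library"; it is not TPASS output.

```python
def activity_bq(net_counts, efficiency, emission_prob, live_time_s):
    """Gamma-ray yield (decays/s) inferred from a fitted full-energy peak."""
    return net_counts / (efficiency * emission_prob * live_time_s)

# Tiny illustrative library: energy (keV) -> (nuclide, emission probability).
library = {661.7: ("Cs-137", 0.851), 1332.5: ("Co-60", 0.9998)}

peak_energy, net_area, eff, t_live = 661.7, 15_400, 0.012, 3600.0  # invented
nuclide, p_gamma = library[peak_energy]
print(nuclide, round(activity_bq(net_area, eff, p_gamma, t_live)), "Bq")
```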

  2. Advanced thermionic reactor systems design code

    International Nuclear Information System (INIS)

    Lewis, B.R.; Pawlowski, R.A.; Greek, K.J.; Klein, A.C.

    1991-01-01

    An overall systems design code is under development to model an advanced in-core thermionic nuclear reactor system for space applications at power levels of 10 to 50 kWe. The design code is written in an object-oriented programming environment that allows the use of a series of design modules, each of which is responsible for the determination of specific system parameters. The code modules include a neutronics and core criticality module, a core thermal hydraulics module, a thermionic fuel element performance module, a radiation shielding module, a module for waste heat transfer and rejection, and modules for power conditioning and control. The neutronics and core criticality module determines critical core size, core lifetime, and shutdown margins using the criticality calculation capability of the Monte Carlo Neutron and Photon Transport Code System (MCNP). The remaining modules utilize results of the MCNP analysis along with FORTRAN programming to predict the overall system performance

  3. Development of computer code for determining prediction parameters of radionuclide migration in soil layer

    International Nuclear Information System (INIS)

    Ogawa, Hiromichi; Ohnuki, Toshihiko

    1986-07-01

    A computer code (MIGSTEM-FIT) has been developed to determine the prediction parameters of radionuclide migration in a soil layer, such as the retardation factor, water flow velocity and dispersion coefficient, from the concentration distribution of the radionuclide in the soil layer or in the effluent. In this code, the solution of the predicting equation for radionuclide migration is compared with the measured concentration distribution, and the most adequate parameter values can be determined by the flexible tolerance method. The validity of the finite difference method, which is one of the methods used to solve the predicting equation, was confirmed by comparison with the analytical solution, and the validity of the fitting method was confirmed by fitting a concentration distribution calculated from known parameters. From the examination of errors, it was found that the error of the parameters obtained using this code was smaller than that of the measured concentration distribution. (author)
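
    The fitting idea described in this record can be sketched as follows: adjust the migration parameters until a predicted concentration profile matches a measured one. In the sketch below the model is a simplified one-dimensional advection-dispersion solution for an instantaneous source, Nelder-Mead stands in for the flexible tolerance method, and both the model and the "measured" data are illustrative assumptions rather than anything from MIGSTEM-FIT.

```python
import numpy as np
from scipy.optimize import minimize

def profile(x, t, R, v, D, m0=1.0):
    """Concentration vs depth x at time t for retardation factor R,
    pore-water velocity v and dispersion coefficient D (instantaneous source)."""
    veff, Deff = v / R, D / R
    return (m0 / R) / np.sqrt(4.0 * np.pi * Deff * t) * \
        np.exp(-(x - veff * t) ** 2 / (4.0 * Deff * t))

x = np.linspace(0.0, 50.0, 60)                       # depth grid, cm
t = 30.0                                             # elapsed time, days
true = profile(x, t, R=5.0, v=2.0, D=1.5)            # "unknown" parameters
measured = true + np.random.default_rng(0).normal(0.0, 0.001, x.size)

def misfit(p):
    R, v, D = p
    if R <= 0 or v <= 0 or D <= 0:
        return 1e9                                   # crude positivity constraint
    return float(np.sum((profile(x, t, R, v, D) - measured) ** 2))

result = minimize(misfit, x0=[3.0, 1.5, 1.0], method="Nelder-Mead")
print(result.x)   # fitted (R, v, D)
```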

  4. The computer code SEURBNUK/EURDYN (release 1). Input and output specifications

    International Nuclear Information System (INIS)

    Smith, B.L.; Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK-2 is a two-dimensional, axisymmetric, Eulerian, finite difference containment code developed initially by AWRE Aldermaston, AEE Winfrith and JRC-Ispra, and more recently by AEEW, JRC and EIR Wuerenlingen. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method which itself is an extension of the MAC algorithm. SEURBNUK has a finite difference thin shell treatment for vessels and internal structures of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. SEURBNUK/EURDYN is an extension of SEURBNUK-2 in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. This has been achieved by coupling the finite element code EURDYN with SEURBNUK-2, allowing the use of conical shell elements and axisymmetric triangular elements. Within the code, the equations of motion for the structures are solved quite separately from those for the fluid, and the timestep for the fluid can be an integer multiple of that for the structures. The interaction of the structures with the fluid is then considered as a modification to the coefficients in the pressure equations, the modifications naturally depending on the behaviour of the structures within the fluid cell. The code is limited to dealing with a single fluid, the coolant, and the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations. After explaining the output facilities, information is included to help users avoid some common pitfalls. (author)

  5. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and the implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document are a section describing the near-field environmental conditions for this waste package modeling, a description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  6. Determining the specific electric resistance of rock

    Energy Technology Data Exchange (ETDEWEB)

    Persad' ko, V.Ia.

    1982-01-01

    Data are presented on perfecting the method of laboratory determination of the specific electric resistance of a rock formation. The average error in determining the specific electric resistance of the core at various locations is no more than two percent with low resistance values (2-5 ohms).

  7. Code conforming determination of cumulative usage factors for general elastic-plastic finite element analyses

    International Nuclear Information System (INIS)

    Rudolph, Juergen; Goetz, Andreas; Hilpert, Roland

    2012-01-01

    The procedures of fatigue analyses of several relevant nuclear and conventional design codes (ASME, KTA, EN, AD) for power plant components differentiate between an elastic, simplified elastic-plastic and elastic-plastic fatigue check. As a rule, operational load levels will exclude the purely elastic fatigue check. The application of the code procedure of the simplified elastic-plastic fatigue check is common practice. Nevertheless, resulting cumulative usage factors may be overly conservative, mainly due to high code-based plastification penalty factors Ke. As a consequence, the more complex and still code-conforming general elastic-plastic fatigue analysis methodology based on non-linear finite element analysis (FEA) is applied for fatigue design as an alternative. The requirements of the FEA and the material law to be applied have to be clarified in a first step. Current design codes only give rough guidelines on these relevant items. While the procedure for the simplified elastic-plastic fatigue analysis and the associated code passages are based on stress related cycle counting and the determination of pseudo elastic equivalent stress ranges, an adaptation to elastic-plastic strains and strain ranges is required for the elastic-plastic fatigue check. The associated requirements are explained in detail in the paper. If the established and implemented evaluation mechanism (cycle counting according to the peak-and-valley or the rainflow method, calculation of stress ranges from arbitrary load-time histories and determination of cumulative usage factors based on all load events) is to be retained, a conversion of elastic-plastic strains and strain ranges into pseudo elastic stress ranges is required. The algorithm to be applied is described in the paper. It has to be implemented in the sense of an extended post-processing operation of FEA, e.g. by APDL scripts in ANSYS®. Variations of principal stress (strain) directions during the loading
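
    The conversion and bookkeeping described in this record can be sketched in a few lines: an elastic-plastic equivalent strain range is turned into a pseudo-elastic stress range via the elastic modulus, and the partial usages of all load events are summed (Miner's rule). The fatigue-curve function, modulus and cycle counts below are hypothetical stand-ins, not the code fatigue curves or the algorithm of the paper.

```python
# Minimal sketch of the post-processing step described above: convert
# elastic-plastic equivalent strain ranges from FEA into pseudo-elastic
# stress ranges (S = E * delta_eps) and accumulate a usage factor with
# Miner's rule.  All numerical values are illustrative assumptions.

E_MODULUS = 200_000.0          # MPa, assumed elastic modulus

def pseudo_stress_range(strain_range):
    """Pseudo-elastic stress range from an elastic-plastic strain range."""
    return E_MODULUS * strain_range

def allowable_cycles(stress_range_mpa):
    """Hypothetical design fatigue curve N(S); a real analysis would
    interpolate the applicable code fatigue curve for the material."""
    return 1.0e12 * stress_range_mpa ** -3.0

events = [                     # (equivalent strain range, number of cycles)
    (0.0040, 150),
    (0.0025, 1200),
    (0.0012, 20000),
]

usage = sum(n / allowable_cycles(pseudo_stress_range(d_eps))
            for d_eps, n in events)
print(f"cumulative usage factor U = {usage:.3f}")
```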

  8. Analysis of genetic code ambiguity arising from nematode-specific misacylated tRNAs.

    Directory of Open Access Journals (Sweden)

    Kiyofumi Hamashima

    Full Text Available The faithful translation of the genetic code requires the highly accurate aminoacylation of transfer RNAs (tRNAs). However, it has been shown that nematode-specific V-arm-containing tRNAs (nev-tRNAs) are misacylated with leucine in vitro in a manner that transgresses the genetic code. nev-tRNA(Gly) (CCC) and nev-tRNA(Ile) (UAU), which are the major nev-tRNA isotypes, could theoretically decode the glycine (GGG) codon and isoleucine (AUA) codon as leucine, causing GGG and AUA codon ambiguity in nematode cells. To test this hypothesis, we investigated the functionality of nev-tRNAs and their impact on the proteome of Caenorhabditis elegans. Analysis of the nucleotide sequences in the 3' end regions of the nev-tRNAs showed that they had matured correctly, with the addition of CCA, which is a crucial posttranscriptional modification required for tRNA aminoacylation. The nuclear export of nev-tRNAs was confirmed with an analysis of their subcellular localization. These results show that nev-tRNAs are processed to their mature forms like common tRNAs and are available for translation. However, a whole-cell proteome analysis found no detectable level of nev-tRNA-induced mistranslation in C. elegans cells, suggesting that the genetic code is not ambiguous, at least under normal growth conditions. Our findings indicate that the translational fidelity of the nematode genetic code is strictly maintained, contrary to our expectations, although deviant tRNAs with misacylation properties are highly conserved in the nematode genome.

  9. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the determining factors influencing the prominent forms of code switching and code mixing in question. The form of this research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School Ponorogo. Based on the analysis and discussion stated in the previous chapter, the forms of code mixing and code switching in learning activities in Al Mawaddah Boarding School lie in the use of the Javanese, Arabic, English and Indonesian languages, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, sourcing from the original language and its variations, and sourcing from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the speech partner (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah boarding school, regarding the rules and characteristic variations in the language of teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz / ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  10. Gender specific determinants of goitre

    International Nuclear Information System (INIS)

    Jamshid, F.; Kerstin, C.; Elena, G.; Wilhelm, O.; Karl, W.; Hwe, M.

    2004-01-01

    Despite the strong implications of differences between females and males in the risk of goitre, gender-specific issues have not been extensively addressed in investigations of goitre prevalence. The objective of our analysis was to investigate the gender-specific determinants of goitre. Methods: A total of 853 healthy employees from 4 institutions in the western part of Germany, aged between 18 and 68 years, were examined by ultrasound of the neck to determine the thyroid volume between April 2001 and April 2002. Information on sex, age, daily use of iodised salt, history of goitre in first-degree relatives, type and amount of smoking, oral contraceptives and number of pregnancies was assessed by standardised questionnaires. Gender-specific predictors of goitre prevalence were assessed by multivariate logistic regression. Results: The overall prevalence of goitre among study subjects was 23.9% (204/853). Goitre was present in 80 out of 370 females (21.6%) vs. 124/483 (25.7%) in males. In general, smoking (p<0.0001), increasing age (p<0.0001) and lack of daily intake of iodised salt (p=0.004) were associated with goitre prevalence, but not sex (p=0.4) and family history of goitre (p=0.2). In the 370 females, parity (p=0.004) and lack of daily intake of iodised salt (p=0.01) were the major determinants of goitre, whereas age (p=0.2), oral contraceptives (p=0.8), family history of goitre (p=0.3), and smoking (p=0.1) did not affect the goitre prevalence. In the 483 males, smoking (p<0.0001) and age (p<0.001) affected the goitre prevalence, but not family history of goitre (p=0.4), and iodine status just failed to reach the significance level (p=0.08) in this analysis. Conclusions: Gender-specific determinants of goitre are parity and iodine status in females, and smoking and increasing age in males. (authors)
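
    The predictors in this record were obtained by multivariate logistic regression. A minimal sketch of such a model on synthetic data is given below; the covariates, coefficients and data are invented for illustration and do not reproduce the study.

    # Illustrative sketch: multivariate logistic regression of goitre status on
    # hypothetical covariates, using statsmodels; the data below are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 400
    df = pd.DataFrame({
        "age": rng.integers(18, 69, n),
        "smoker": rng.integers(0, 2, n),
        "iodised_salt": rng.integers(0, 2, n),
    })
    # Synthetic outcome loosely tied to the covariates, for demonstration only.
    logit = -3.0 + 0.03 * df["age"] + 0.8 * df["smoker"] - 0.5 * df["iodised_salt"]
    df["goitre"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(df[["age", "smoker", "iodised_salt"]])
    model = sm.Logit(df["goitre"], X).fit(disp=False)
    print(model.summary())           # coefficients and p-values
    print(np.exp(model.params))      # odds ratios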

  11. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies

  12. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
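
    The abstract describes choosing high-fidelity evaluation points sequentially according to the information they contribute to the low-fidelity model parameters. The sketch below illustrates one crude way to score candidate design conditions by the expected reduction in posterior parameter covariance; the model form, noise level, and scoring proxy are all assumptions for illustration and are not the framework used by the authors.

    # Illustrative sketch: rank candidate design conditions by a crude expected
    # information-gain proxy (reduction in log-determinant of the weighted
    # parameter covariance). Model, priors and noise level are hypothetical.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    def low_fidelity(theta, x):
        """Hypothetical low-fidelity response: theta = (amplitude, decay rate)."""
        return theta[0] * np.exp(-theta[1] * x)

    prior = np.column_stack([rng.normal(1.0, 0.2, 5000), rng.normal(0.5, 0.1, 5000)])
    true_theta = np.array([1.1, 0.45])   # stands in for the high-fidelity code
    sigma = 0.05                          # assumed observation noise

    def info_gain_proxy(x, n_rep=20):
        gains = []
        prior_logdet = np.linalg.slogdet(np.cov(prior.T))[1]
        for _ in range(n_rep):
            # Simulated "high-fidelity" observation at design condition x.
            y = low_fidelity(true_theta, x) + rng.normal(0.0, sigma)
            w = norm.pdf(y, loc=low_fidelity(prior.T, x), scale=sigma)
            w = w / w.sum()
            post_logdet = np.linalg.slogdet(np.cov(prior.T, aweights=w))[1]
            gains.append(0.5 * (prior_logdet - post_logdet))
        return float(np.mean(gains))

    candidates = np.linspace(0.5, 5.0, 10)
    scores = {round(float(x), 2): info_gain_proxy(x) for x in candidates}
    best = max(scores, key=scores.get)
    print(scores)
    print("next high-fidelity evaluation at x =", best)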

  13. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  14. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  15. Divergent evolutionary rates in vertebrate and mammalian specific conserved non-coding elements (CNEs) in echolocating mammals.

    Science.gov (United States)

    Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J

    2014-12-19

    The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically within one family of echolocating bats that utilise

  16. Numerical code to determine the particle trapping region in the LISA machine

    International Nuclear Information System (INIS)

    Azevedo, M.T. de; Raposo, C.C. de; Tomimura, A.

    1984-01-01

    A numerical code is constructed to determine the trapping region in machines like LISA. The variable magnetic field is two-dimensional and is coupled to the Runge-Kutta integration through a Tchebichev polynomial. Various particle orbits, including particle interactions, were analysed. Besides this, a strong electric field is introduced to see the possible effects occurring inside the plasma. (Author) [pt
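
    The record couples a two-dimensional magnetic field to a Runge-Kutta integrator to decide whether particle orbits remain trapped. A minimal sketch of that idea is shown below, using a fixed-step RK4 integrator for a charged particle in a purely z-directed field Bz(x, y); the field model, particle parameters, and trapping criterion are illustrative assumptions, not those of the LISA study.

    # Illustrative sketch: RK4 integration of a charged particle in a 2-D
    # magnetic field B = (0, 0, Bz(x, y)); the orbit is called "trapped" if it
    # stays inside a circular domain for the whole integration. All parameters
    # are hypothetical.
    import numpy as np

    Q_OVER_M = 1.0       # charge-to-mass ratio (assumed units)
    R_DOMAIN = 1.0       # radius of the confinement region

    def bz(x, y):
        """Hypothetical field: stronger towards the edge of the domain."""
        return 1.0 + 4.0 * (x * x + y * y)

    def deriv(state):
        x, y, vx, vy = state
        b = bz(x, y)
        # Lorentz acceleration for B along z: a = (q/m) * (v x B)
        return np.array([vx, vy, Q_OVER_M * vy * b, -Q_OVER_M * vx * b])

    def rk4_step(state, dt):
        k1 = deriv(state)
        k2 = deriv(state + 0.5 * dt * k1)
        k3 = deriv(state + 0.5 * dt * k2)
        k4 = deriv(state + dt * k3)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    def is_trapped(state, dt=1e-3, steps=20000):
        for _ in range(steps):
            state = rk4_step(state, dt)
            if state[0] ** 2 + state[1] ** 2 > R_DOMAIN ** 2:
                return False
        return True

    print(is_trapped(np.array([0.1, 0.0, 0.0, 0.5])))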

  17. Inheritance-mode specific pathogenicity prioritization (ISPP) for human protein coding genes.

    Science.gov (United States)

    Hsu, Jacob Shujui; Kwan, Johnny S H; Pan, Zhicheng; Garcia-Barcelo, Maria-Mercè; Sham, Pak Chung; Li, Miaoxin

    2016-10-15

    Exome sequencing studies have facilitated the detection of causal genetic variants in yet-unsolved Mendelian diseases. However, the identification of disease causal genes among a list of candidates in an exome sequencing study is still not fully settled, and it is often difficult to prioritize candidate genes for follow-up studies. The inheritance mode provides crucial information for understanding Mendelian diseases, but none of the existing gene prioritization tools fully utilize this information. We examined the characteristics of Mendelian disease genes under different inheritance modes. The results suggest that Mendelian disease genes with an autosomal dominant (AD) inheritance mode are more sensitive to haploinsufficiency and de novo mutations, whereas autosomal recessive (AR) genes have significantly more non-synonymous variants and regulatory transcript isoforms. In addition, the X-linked (XL) Mendelian disease genes have fewer non-synonymous and synonymous variants. As a result, we derived a new scoring system for prioritizing candidate genes for Mendelian diseases according to the inheritance mode. Our scoring system assigned to each annotated protein-coding gene (N = 18,859) three pathogenic scores according to the inheritance mode (AD, AR and XL). This inheritance mode-specific framework achieved higher accuracy (area under curve = 0.84) in XL mode. The inheritance-mode specific pathogenicity prioritization (ISPP) outperformed other well-known methods including Haploinsufficiency, Recessive, Network centrality, Genic Intolerance, Gene Damage Index and Gene Constraint scores. This systematic study suggests that genes manifesting disease inheritance modes tend to have unique characteristics. ISPP is included in KGGSeq v1.0 (http://grass.cgs.hku.hk/limx/kggseq/), and source code is available from (https://github.com/jacobhsu35/ISPP.git). Contact: mxli@hku.hk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author

  18. DISCURSIVE ACTUALIZATION OF ETHNO-LINGUOCULTURAL CODE IN ENGLISH GLUTTONY

    Directory of Open Access Journals (Sweden)

    Nikishkova Mariya Sergeevna

    2014-11-01

    Full Text Available The article presents an overview of linguistic research on the gastronomic / gluttony communicative environment as an ethnocultural phenomenon from the standpoint of conceptology, discourse study and linguosemiotics. The authors study linguosemiotic encoding / decoding in the English gastronomic (gluttony) discourse. The peculiarities of the "immersion" of gastronomic gluttonyms into everyday communication are studied. Anglophone ethnic specifics are revealed, and different ways of gluttony text formation (including precedent texts) are investigated. The linguosemiotic parameters of ethnocultural (anglophone) gastronomic coded communication are established, and their discursive characteristics are identified. It is determined that in English gastronomic communication the discursive actualization of the ethno-linguocultural code has a dynamic nature; the constitutive features of gastronomic discourse have symbolic (semiotic) basics and are connected with such semiotic categories as code, encoding, and decoding. It was found that food is semiotic in its origin and represents the cultural code. It was revealed that the semiosis of the English gastronomic text is regularly filled with the codes of traditional "English-likeness" (ethnic term by Roland Barthes) expressed by gluttonyms. The "nationality" code is detected through the names of products specific to certain areas; the national identity of the ethnic code is also highlighted by ways of dish garnishing and serving, and by typical characteristics of particular local preparation methods. The authors analyze the "lingualization" of food images having an ambivalent character, determined, firstly, by food signs (gluttonyms) which structure the common space of gastronomic discourse and provide it with an ethnic linguocultural food source; secondly, by the immersion of formed images into a specific ethnic code that is decoded as the gastronomic discourse unfolds. The precedent texts accumulate ethnic information supplying an adequate gastronomic worldview

  19. Use of allele-specific FAIRE to determine functional regulatory polymorphism using large-scale genotyping arrays.

    Directory of Open Access Journals (Sweden)

    Andrew J P Smith

    Full Text Available Following the widespread use of genome-wide association studies (GWAS), focus is turning towards identification of causal variants rather than simply genetic markers of diseases and traits. As a step towards a high-throughput method to identify genome-wide, non-coding, functional regulatory variants, we describe the technique of allele-specific FAIRE, utilising large-scale genotyping technology (FAIRE-gen) to determine allelic effects on chromatin accessibility and regulatory potential. FAIRE-gen was explored using lymphoblastoid cells and the 50,000 SNP Illumina CVD BeadChip. The technique identified an allele-specific regulatory polymorphism within NR1H3 (coding for LXR-α), rs7120118, coinciding with a previously GWAS-identified SNP for HDL-C levels. This finding was confirmed using FAIRE-gen with the 200,000 SNP Illumina Metabochip and verified with the established method of TaqMan allelic discrimination. Examination of this SNP in two prospective Caucasian cohorts comprising 15,000 individuals confirmed the association with HDL-C levels (combined beta = 0.016; p = 0.0006), and analysis of gene expression identified an allelic association with LXR-α expression in heart tissue. Using increasingly comprehensive genotyping chips and distinct tissues for examination, FAIRE-gen has the potential to aid the identification of many causal SNPs associated with disease from GWAS.

  20. Hazardous Waste Code Determination for First/Second-Stage Sludge Waste Stream (IDCs 001, 002, 800)

    International Nuclear Information System (INIS)

    Arbon, R.E.

    2001-01-01

    This document, Hazardous Waste Code Determination for the First/Second-Stage Sludge Waste Stream, summarizes the efforts performed at the Idaho National Engineering and Environmental Laboratory (INEEL) to make a hazardous waste code determination on Item Description Codes (IDCs) 001, 002, and 800 drums. This characterization effort included a thorough review of acceptable knowledge (AK), physical characterization, waste form sampling, chemical analyses, and headspace gas data. This effort included an assessment of pre-Waste Analysis Plan (WAP) solidified sampling and analysis data (referred to as preliminary data). Seventy-five First/Second-Stage Sludge Drums, provided in Table 1-1, have been subjected to core sampling and analysis using the requirements defined in the Quality Assurance Program Plan (QAPP). Based on WAP-defined statistical reduction of the preliminary data, a sample size of five was calculated. That is, five additional drums should be core sampled and analyzed. A total of seven drums were sampled, analyzed, and validated in compliance with the WAP criteria. The pre-WAP data (taken under the QAPP) correlated very well with the WAP-compliant drum data. As a result, no additional sampling is required. Based upon the information summarized in this document, an accurate hazardous waste determination has been made for the First/Second-Stage Sludge Waste Stream.

  1. Specificity Protein (Sp) Transcription Factors and Metformin Regulate Expression of the Long Non-coding RNA HULC

    Science.gov (United States)

    There is evidence that specificity protein 1 (Sp1) transcription factor (TF) regulates expression of long non-coding RNAs (lncRNAs) in hepatocellular carcinoma (HCC) cells. RNA interference (RNAi) studies showed that among several lncRNAs expressed in HepG2, SNU-449 and SK-Hep-1...

  2. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
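
    The record describes the DLL's job as: take inputs from GoldSim, write an input file, run the external application, and read outputs back. The actual interface is a compiled DLL driven by an instructions file; purely as an illustration of that write-run-read flow, a sketch might look like the following (the file layout, executable name and output format are assumptions, not the DLLExternalCode specification).

    # Illustrative sketch of the write-input / run-external-code / read-output
    # pattern described for DLLExternalCode. The file layout, executable name
    # ("external_solver") and output format are hypothetical.
    import subprocess
    from pathlib import Path

    def run_external_code(inputs, workdir="run01", exe="external_solver"):
        work = Path(workdir)
        work.mkdir(exist_ok=True)

        # 1. Write the external application's input file from the supplied values.
        infile = work / "case.inp"
        infile.write_text("\n".join(f"{name} = {value}" for name, value in inputs.items()))

        # 2. Launch the external code and wait for it to finish.
        subprocess.run([exe, infile.name], cwd=work, check=True)

        # 3. Read the outputs the external code wrote, one "name value" per line.
        outputs = {}
        for line in (work / "case.out").read_text().splitlines():
            name, value = line.split()
            outputs[name] = float(value)
        return outputs

    # Example call (requires the hypothetical solver to exist on the PATH):
    # results = run_external_code({"flow_rate": 2.5, "temperature": 300.0})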

  3. Determination of allergen specificity by heavy chains in grass pollen allergen-specific IgE antibodies.

    Science.gov (United States)

    Gadermaier, Elisabeth; Flicker, Sabine; Lupinek, Christian; Steinberger, Peter; Valenta, Rudolf

    2013-04-01

    Affinity and clonality of allergen-specific IgE antibodies are important determinants for the magnitude of IgE-mediated allergic inflammation. We sought to analyze the contribution of heavy and light chains of human allergen-specific IgE antibodies for allergen specificity and to test whether promiscuous pairing of heavy and light chains with different allergen specificity allows binding and might affect affinity. Ten IgE Fabs specific for 3 non-cross-reactive major timothy grass pollen allergens (Phl p 1, Phl p 2, and Phl p 5) obtained by means of combinatorial cloning from patients with grass pollen allergy were used to construct stable recombinant single chain variable fragments (ScFvs) representing the original Fabs and shuffled ScFvs in which heavy chains were recombined with light chains from IgE Fabs with specificity for other allergens by using the pCANTAB 5 E expression system. Possible ancestor genes for the heavy chain and light chain variable region-encoding genes were determined by using sequence comparison with the ImMunoGeneTics database, and their chromosomal locations were determined. Recombinant ScFvs were tested for allergen specificity and epitope recognition by means of direct and sandwich ELISA, and affinity by using surface plasmon resonance experiments. The shuffling experiments demonstrate that promiscuous pairing of heavy and light chains is possible and maintains allergen specificity, which is mainly determined by the heavy chains. ScFvs consisting of different heavy and light chains exhibited different affinities and even epitope specificity for the corresponding allergen. Our results indicate that allergen specificity of allergen-specific IgE is mainly determined by the heavy chains. Different heavy and light chain pairings in allergen-specific IgE antibodies affect affinity and epitope specificity and thus might influence clinical reactivity to allergens. Copyright © 2012 American Academy of Allergy, Asthma & Immunology. Published by

  4. On-line monitoring and inservice inspection in codes

    International Nuclear Information System (INIS)

    Bartonicek, J.; Zaiss, W.; Bath, H.R.

    1999-01-01

    The relevant regulatory codes determine the ISI tasks and the time intervals for recurrent component testing for the evaluation of operation-induced damage or ageing, in order to ensure component integrity on the basis of the latest available quality data. In-service quality monitoring is carried out through on-line monitoring and recurrent testing. The requirements defined by the engineering codes elaborated by various institutions are comparable, with the KTA nuclear engineering and safety codes being the most complete provisions for quality evaluation and assurance after different, defined service periods. German conventional codes for assuring component integrity provide exclusively for recurrent inspection regimes (mainly pressure tests and optical testing). The requirements defined in the KTA codes, however, have always demanded more specific inspections relying on recurrent testing as well as on-line monitoring. Foreign codes for ensuring component integrity concentrate on NDE tasks at regular time intervals, with the time intervals and scope of testing activities being defined on the basis of the ASME code, Section XI. (orig./CB) [de

  5. Comprehensive Identification of Long Non-coding RNAs in Purified Cell Types from the Brain Reveals Functional LncRNA in OPC Fate Determination.

    Directory of Open Access Journals (Sweden)

    Xiaomin Dong

    2015-12-01

    Full Text Available Long non-coding RNAs (lncRNAs) (> 200 bp) play crucial roles in transcriptional regulation during numerous biological processes. However, it is challenging to comprehensively identify lncRNAs, because they are often expressed at low levels and with more cell-type specificity than are protein-coding genes. In the present study, we performed ab initio transcriptome reconstruction using eight purified cell populations from mouse cortex and detected more than 5000 lncRNAs. Predicting the functions of lncRNAs using cell-type specific data revealed their potential functional roles in Central Nervous System (CNS) development. We performed motif searches in ENCODE DNase I digital footprint data and Mouse ENCODE promoters to infer transcription factor (TF) occupancy. By integrating TF binding and cell-type specific transcriptomic data, we constructed a novel framework that is useful for systematically identifying lncRNAs that are potentially essential for brain cell fate determination. Based on this integrative analysis, we identified lncRNAs that are regulated during Oligodendrocyte Precursor Cell (OPC) differentiation from Neural Stem Cells (NSCs) and that are likely to be involved in oligodendrogenesis. The top candidate, lnc-OPC, shows highly specific expression in OPCs and remarkable sequence conservation among placental mammals. Interestingly, lnc-OPC is significantly up-regulated in glial progenitors from experimental autoimmune encephalomyelitis (EAE) mouse models compared to wild-type mice. OLIG2-binding sites in the upstream regulatory region of lnc-OPC were identified by ChIP (chromatin immunoprecipitation)-Sequencing and validated by luciferase assays. Loss-of-function experiments confirmed that lnc-OPC plays a functional role in OPC genesis. Overall, our results substantiated the role of lncRNA in OPC fate determination and provided an unprecedented data source for future functional investigations in CNS cell types. We present our datasets and

  6. Determination of Major and Minor Elements in the Code River Sediments

    International Nuclear Information System (INIS)

    Sri Murniasih; Sukirno; Bambang Irianto

    2007-01-01

    An analysis of major and minor elements in the Code river sediments has been carried out. The aim of this research is to determine the concentration of major and minor elements in the Code river sediments from upstream to downstream. The instrument used was an X-ray fluorescence spectrometer with a Si(Li) detector. The results show that the major elements were Fe (1.66 ± 0.1% - 4.20 ± 0.7%) and Ca (4.43 ± 0.6% - 9.08 ± 1.3%), while the minor elements were Ba (178.791 ± 21.1 ppm - 616.56 ± 59.4 ppm), Sr (148.22 ± 21.9 ppm - 410.25 ± 30.5 ppm), and Zr (9.71 ± 1.1 ppm - 22.11 ± 3.4 ppm). The ANOVA method (significance level α = 0.05) was used for the statistical test. It showed that there was a significant influence of the sampling location on the concentration of major and minor elements in the sediment samples. (author)
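
    The record applies an analysis of variance at α = 0.05 to test whether sampling location affects element concentrations. A minimal sketch of a one-way ANOVA on hypothetical concentration data, using SciPy, is shown below; the numbers are invented and do not reproduce the study's measurements.

    # Illustrative sketch: one-way ANOVA testing whether mean Fe concentration
    # differs between sampling locations; the concentration values are synthetic.
    from scipy.stats import f_oneway

    upstream   = [1.70, 1.66, 1.82, 1.75]   # % Fe, hypothetical replicates
    midstream  = [2.90, 3.10, 3.05, 2.85]
    downstream = [4.10, 4.25, 3.95, 4.20]

    f_stat, p_value = f_oneway(upstream, midstream, downstream)
    print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
    if p_value < 0.05:
        print("Sampling location has a significant effect at alpha = 0.05")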

  7. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  8. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  9. Determining coding CpG islands by identifying regions significant for pattern statistics on Markov chains.

    Science.gov (United States)

    Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior

    2011-09-23

    Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
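
    The method counts CpGs in candidate regions within coding exons, computes tail probabilities under a Markov chain, and then greedily selects islands. The sketch below shows only the counting-and-greedy-selection skeleton on a toy sequence, with a simple CpG-count threshold standing in for the Markov-chain tail probability; it is not the authors' algorithm or statistical criterion.

    # Illustrative sketch: score fixed-length windows of an exon sequence by CpG
    # count and greedily keep the highest-scoring non-overlapping windows.
    # The sequence, window length and threshold are hypothetical, and the raw
    # CpG count is only a stand-in for the Markov-chain tail probability.
    def cpg_count(seq: str) -> int:
        return sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "CG")

    def candidate_windows(seq: str, width: int = 20):
        for start in range(0, len(seq) - width + 1):
            yield start, start + width, cpg_count(seq[start:start + width])

    def greedy_islands(seq: str, width: int = 20, min_cpg: int = 3):
        picked = []
        # Consider the most CpG-rich windows first; keep those that do not overlap.
        for start, end, score in sorted(candidate_windows(seq, width),
                                        key=lambda w: w[2], reverse=True):
            if score < min_cpg:
                break
            if all(end <= s or start >= e for s, e, _ in picked):
                picked.append((start, end, score))
        return sorted(picked)

    exon = "ATGCGCGTACGCGGATCCGCGCGTATATATCGCGCGATCGATTTTACGCGA"
    print(greedy_islands(exon))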

  10. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  11. Capsid coding sequences of foot-and-mouth disease viruses are determinants of pathogenicity in pigs.

    Science.gov (United States)

    Lohse, Louise; Jackson, Terry; Bøtner, Anette; Belsham, Graham J

    2012-05-24

    The surface exposed capsid proteins, VP1, VP2 and VP3, of foot-and-mouth disease virus (FMDV) determine its antigenicity and the ability of the virus to interact with host-cell receptors. Hence, modification of these structural proteins may alter the properties of the virus. In the present study we compared the pathogenicity of different FMDVs in young pigs. In total 32 pigs, 7-weeks-old, were exposed to virus, either by direct inoculation or through contact with inoculated pigs, using cell culture adapted (O1K B64), chimeric (O1K/A-TUR and O1K/O-UKG) or field strain (O-UKG/34/2001) viruses. The O1K B64 virus and the two chimeric viruses are identical to each other except for the capsid coding region. Animals exposed to O1K B64 did not exhibit signs of disease, while pigs exposed to each of the other viruses showed typical clinical signs of foot-and-mouth disease (FMD). All pigs infected with the O1K/O-UKG chimera or the field strain (O-UKG/34/2001) developed fulminant disease. Furthermore, 3 of 4 in-contact pigs exposed to the O1K/O-UKG virus died in the acute phase of infection, likely from myocardial infection. However, in the group exposed to the O1K/A-TUR chimeric virus, only 1 pig showed symptoms of disease within the time frame of the experiment (10 days). All pigs that developed clinical disease showed a high level of viral RNA in serum and infected pigs that survived the acute phase of infection developed a serotype specific antibody response. It is concluded that the capsid coding sequences are determinants of FMDV pathogenicity in pigs.

  12. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, the four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.
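
    For context, the entanglement-assisted quantum Singleton bound referred to here is commonly stated as follows for an [[n, k, d; c]] entanglement-assisted code (quoted in its usual textbook form, not taken from this paper):

    % Entanglement-assisted quantum Singleton bound for an [[n, k, d; c]] code,
    % where c is the number of pre-shared maximally entangled pairs; codes
    % meeting it with equality are entanglement-assisted quantum MDS codes.
    \[
      2(d - 1) \le n - k + c ,
      \qquad\text{equivalently}\qquad
      d \le \frac{n - k + c + 2}{2}.
    \]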

  13. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
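
    Sensitivity and specificity in this record are computed against chart review as the gold standard. A minimal reminder of how those measures follow from a two-by-two table is sketched below; the true-positive count of 11 out of 44 overdose visits is consistent with the 25.0% sensitivity quoted above, while the remaining counts are invented for illustration.

    # Illustrative sketch: sensitivity, specificity and PPV of a diagnosis-code
    # flag against a chart-review gold standard. Counts other than the 44 true
    # overdose visits are hypothetical.
    def performance(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)   # flagged visits among true overdose visits
        specificity = tn / (tn + fp)   # unflagged visits among non-overdose visits
        ppv = tp / (tp + fp)           # true overdoses among flagged visits
        return sensitivity, specificity, ppv

    # e.g. 11 of 44 true overdose visits carried an opioid-poisoning code, and
    # 5 of the remaining 3159 visits were flagged despite no overdose (invented).
    sens, spec, ppv = performance(tp=11, fp=5, fn=33, tn=3154)
    print(f"sensitivity={sens:.1%} specificity={spec:.1%} ppv={ppv:.1%}")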

  14. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, new families of QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  15. Principle of the determination of neutron multiplication coefficients by the Monte Carlo method. Application. Description of a code for ibm 360-75; Principe de la determination des coefficients de multiplication neutronique par methode de Monte-Carlo. Application. Description d'un code pour IBM 360-75

    Energy Technology Data Exchange (ETDEWEB)

    Moreau, J; Parisot, B [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-07-01

    The determination of neutron multiplication coefficients by the Monte Carlo method can be carried out in different ways; these are successively examined and compared here. From this comparison, a fast code for particularly complex geometries is derived; it makes use of multi-group isotropic cross sections. The performances of this code are illustrated by some examples. (author) [French] La determination des coefficients de multiplication neutronique par methode de Monte Carlo peut se faire par differentes voies, elles sont successivement examinees et comparees. On en deduit un code rapide pour des geometries particulierement complexes, il utilise des sections efficaces multigroupes isotropes. Les performances de ce code sont demontrees par quelques exemples. (auteur)
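
    Purely as a toy illustration of the generation-to-generation idea behind Monte Carlo multiplication-coefficient estimates (and in no way the multigroup method of the report), one can follow a neutron population over successive fission generations and average the population ratio:

    # Toy illustration: estimate a multiplication coefficient as the average
    # ratio of neutron populations in successive generations. The interaction
    # probability and fission yield below are hypothetical single-group values.
    import random

    random.seed(1)
    P_FISSION = 0.40   # probability a neutron causes fission; otherwise it is
                       # captured or leaks out and produces no offspring (assumed)
    NU = 2.5           # mean neutrons released per fission (assumed)

    def next_generation(n_neutrons):
        born = 0
        for _ in range(n_neutrons):
            if random.random() < P_FISSION:
                # sample an integer number of offspring with mean NU
                born += int(NU) + (random.random() < NU - int(NU))
        return born

    population = 10_000
    ratios = []
    for _ in range(50):
        new_population = next_generation(population)
        ratios.append(new_population / population)
        population = new_population
        if population == 0:        # chain died out; stop before dividing by zero
            break
    print("estimated k =", sum(ratios) / len(ratios))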

  16. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  17. A model code on co-determination and CSR : The Netherlands: A bottom-up approach

    NARCIS (Netherlands)

    Lambooy, T.E.

    2011-01-01

    This article discusses the works council’s role in the determination of a company’s CSR strategy and the implementation thereof throughout the organisation. The association of the works councils of multinational companies with a base in the Netherlands has recently developed a ‘Model Code on

  18. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  19. Geometrical modification transfer between specific meshes of each coupled physical codes. Application to the Jules Horowitz research reactor experimental devices

    International Nuclear Information System (INIS)

    Duplex, B.

    2011-01-01

    The CEA develops and uses scientific software, called physical codes, in various physical disciplines to optimize installation and experimentation costs. During a study, several physical phenomena interact, so a code coupling and some data exchanges between different physical codes are required. Each physical code computes on a particular geometry, usually represented by a mesh composed of thousands to millions of elements. This PhD Thesis focuses on the geometrical modification transfer between specific meshes of each coupled physical code. First, it presents a physical code coupling method where deformations are computed by one of these codes. Next, it discusses the establishment of a model, common to different physical codes, grouping all the shared data. Finally, it covers the deformation transfers between meshes of the same geometry or adjacent geometries. Geometrical modifications are discrete data because they are based on a mesh. In order to permit every code to access deformations and to transfer them, a continuous representation is computed. Two functions are developed, one with a global support, and the other with a local support. Both functions combine a simplification method and a radial basis function network. A whole use case is dedicated to the Jules Horowitz reactor. The effect of differential dilatations on experimental device cooling is studied. (author) [fr
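
    The thesis combines a mesh simplification step with a radial basis function network to obtain a continuous representation of discrete mesh deformations. The sketch below shows only the radial-basis-function interpolation part on made-up 2-D data (Gaussian kernel, direct linear solve); the simplification step, kernel choice, and data are assumptions for illustration, not the method of the thesis.

    # Illustrative sketch: interpolate nodal displacements known at a few mesh
    # nodes to arbitrary points with a Gaussian radial basis function network.
    # Node positions, displacements and the kernel width are hypothetical.
    import numpy as np

    def gaussian_kernel(r, eps=1.0):
        return np.exp(-(eps * r) ** 2)

    def fit_rbf(centers, values, eps=1.0):
        """Solve for RBF weights so the interpolant matches 'values' at 'centers'."""
        dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
        return np.linalg.solve(gaussian_kernel(dists, eps), values)

    def evaluate_rbf(points, centers, weights, eps=1.0):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)
        return gaussian_kernel(dists, eps) @ weights

    # Displacements (dx, dy) known at a handful of source-mesh nodes:
    nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
    disp = np.array([[0.00, 0.00], [0.02, 0.00], [0.00, 0.01], [0.02, 0.01], [0.01, 0.005]])

    w = fit_rbf(nodes, disp)
    targets = np.array([[0.25, 0.75], [0.9, 0.1]])   # nodes of another mesh
    print(evaluate_rbf(targets, nodes, w))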

  20. Quantum Codes From Negacyclic Codes over Group Ring (Fq + υFq)G

    International Nuclear Information System (INIS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-01-01

    In this paper, we determine self-dual and self-orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq)G. By taking a suitable Gray image of these codes we obtain many good parameter quantum error-correcting codes over Fq. (paper)

  1. ALOAD - a code to determine the concentrated forces equivalent with a distributed pressure field for a FEM analysis

    Directory of Open Access Journals (Sweden)

    Nicolae APOSTOLESCU

    2010-12-01

    Full Text Available The main objective of this paper is to describe a code for calculating an equivalent system of concentrated loads for a FEM analysis. The tables from the Aerodynamic Department contain the pressure field for a whole bearing surface, and integrated quantities both for the whole surface and for the fixed and mobile parts. Usually in a FEM analysis the external loads are introduced as concentrated loads equivalent to the distributed pressure field. These concentrated forces can also be used in static tests. Commercial codes provide solutions for this problem, but what we intend to develop is a code adapted to the user's specific needs.
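
    As an illustration of turning a distributed pressure field into an equivalent set of concentrated nodal loads (the general idea behind such a tool, not the ALOAD implementation itself), the sketch below lumps each element's pressure resultant equally onto its nodes; the mesh connectivity, areas, pressures, and the equal-lumping rule are assumptions.

    # Illustrative sketch: convert element pressures into equivalent concentrated
    # nodal forces by lumping each element resultant (p * area) equally onto the
    # element's nodes. Mesh connectivity, areas and pressures are hypothetical.
    from collections import defaultdict

    # element id -> (node ids, element area [m^2], pressure [Pa])
    elements = {
        1: ((1, 2, 5), 0.020, 1500.0),
        2: ((2, 3, 5), 0.018, 1800.0),
        3: ((3, 4, 5), 0.022, 1650.0),
    }

    nodal_force = defaultdict(float)
    for nodes, area, pressure in elements.values():
        share = pressure * area / len(nodes)       # equal split of the resultant
        for node in nodes:
            nodal_force[node] += share

    total = sum(p * a for _, a, p in elements.values())
    print("nodal forces [N]:", dict(nodal_force))
    print("check: sum of nodal forces =", round(sum(nodal_force.values()), 6),
          "= integrated pressure resultant =", round(total, 6))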

  2. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant application. An education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  3. Use of AERIN code for determining internal doses of transuranic isotopes

    International Nuclear Information System (INIS)

    King, W.C.

    1980-01-01

    The AERIN computer code is a mathematical expression of the ICRP Lung Model. The code was developed at the Lawrence Livermore National Laboratory to compute the body organ burdens and absorbed radiation doses resulting from the inhalation of transuranic isotopes and to predict the amount of activity excreted in the urine and feces as a function of time. Over forty cases of internal exposure have been studied using the AERIN code. The code, as modified, has proven to be extremely versatile. The case studies presented demonstrate the excellent correlation that can be obtained between code predictions and observed bioassay data. In one case study a discrepancy was observed between an in vivo count of the whole body and the application of the code using urine and fecal data as input. The discrepancy was resolved by in vivo skull counts that showed the code had predicted the correct skeletal burden

  4. QR Code: An Interactive Mobile Advertising Tool

    Directory of Open Access Journals (Sweden)

    Ela Sibel Bayrak Meydanoglu

    2013-10-01

    Full Text Available Easy and rapid interaction between consumers and marketers enabled by mobile technology prompted an increase in the usage of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is the QR code (Quick Response Code). Interactive advertising brings some advantages to the companies that apply it. For example, interaction with consumers provides significant information about consumers' preferences. Marketers can use information obtained from consumers for various marketing activities such as customizing advertisement messages, determining the target audience, and improving future products and services. QR codes used in marketing campaigns can provide links to specific websites in which, through various tools (e.g. questionnaires, voting), information about the needs and wants of customers is collected. The aim of this basic research is to illustrate the contribution of QR codes to the realization of the advantages gained by interactive advertising.
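
    As a small illustration of generating a campaign QR code that points to a survey page (using the third-party Python package qrcode; the URL and filename are placeholders, not from this record):

    # Illustrative sketch: generate a QR code image linking to a hypothetical
    # campaign survey page, using the third-party "qrcode" package
    # (pip install qrcode[pil]).
    import qrcode

    img = qrcode.make("https://example.com/campaign-survey")  # placeholder URL
    img.save("campaign_qr.png")                               # scanned codes open the survey
    print("QR code written to campaign_qr.png")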

  5. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  6. Absorbed dose determination in high energy photon beams using new IAEA TRS - 398 Code of Practice

    International Nuclear Information System (INIS)

    Suriyapee, S.; Srimanoroath, S.; Jumpangern, C.

    2002-01-01

    The absorbed dose calibration of the 6 and 10 MV X-ray beams from the Varian Clinac 1800 at King Chulalongkorn Memorial Hospital, Bangkok, Thailand was performed using a 0.6 cc cylindrical chamber NE2571 Serial No. 1633 with graphite wall and Delrin build-up cap and an Ionex Dosemaster NE 2590 Serial No. 223. The absorbed dose determination followed the IAEA code of practice TRS-277. The new IAEA code of practice TRS-398 has been studied to compare its results with those of IAEA TRS-277.

  7. A test of the IAEA code of practice for absorbed dose determination in photon and electron beams

    International Nuclear Information System (INIS)

    Leitner, A.; Tiefenboeck, W.; Witzani, J.; Strachotinsky, C.

    1990-12-01

    The IAEA Code of Practice TRS 277 gives recommendations for absorbed dose determination in high energy photon and electron beams based on the use of ionisation chambers calibrated in terms of exposure or air kerma. The scope of the present work was to test the Code for 60 Co gamma radiation and for several radiation qualities at four different types of electron accelerators and to compare the ionisation chamber dosimetry with ferrous sulphate dosimetry. The results show agreement between the two methods within about one per cent for all the investigated qualities. In addition the response of the TLD capsules of the IAEA/WHO TL dosimetry service has been determined. (Authors) 5 refs., 9 tabs., 3 figs

  8. Lightweight Detection of Android-specific Code Smells : The aDoctor Project

    NARCIS (Netherlands)

    Palomba, F.; Di Nucci, D.; Panichella, A.; Zaidman, A.E.; De Lucia, Andrea; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Code smells are symptoms of poor design solutions applied by programmers during the development of software systems. While the research community devoted a lot of effort to studying and devising approaches for detecting the traditional code smells defined by Fowler, little knowledge and support

  9. Country-specific determinants of world university rankings

    OpenAIRE

    Pietrucha, Jacek

    2017-01-01

    This paper examines country-specific factors that affect the three most influential world university rankings (the Academic Ranking of World Universities, the QS World University Ranking, and the Times Higher Education World University Ranking). We run a cross sectional regression that covers 42–71 countries (depending on the ranking and data availability). We show that the position of universities from a country in the ranking is determined by the following country-specific variables: econom...

  10. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  11. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  12. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  13. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.
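
    The statement that toric codes are homological stabilizer codes on 4-valent graphs can be illustrated with a short sketch (not taken from the cited paper) that builds the star and plaquette stabilizers of Kitaev's toric code on a periodic L x L square lattice and checks that every X-type and Z-type stabilizer pair overlaps on an even number of qubits, hence commutes.

```python
# Sketch: stabilizers of Kitaev's toric code on an L x L periodic square
# lattice with qubits on edges. Illustrative only; not from the cited paper.
L = 3

def h(i, j):  # index of the horizontal edge leaving vertex (i, j) rightwards
    return (i % L) * L + (j % L)

def v(i, j):  # index of the vertical edge leaving vertex (i, j) downwards
    return L * L + (i % L) * L + (j % L)

stars = []        # X-type stabilizers, one per vertex (4-valent graph)
plaquettes = []   # Z-type stabilizers, one per face
for i in range(L):
    for j in range(L):
        stars.append({h(i, j), h(i, j - 1), v(i, j), v(i - 1, j)})
        plaquettes.append({h(i, j), h(i + 1, j), v(i, j), v(i, j + 1)})

# An X-type and a Z-type stabilizer commute iff they share an even number
# of qubits; for the toric code the overlap is always 0 or 2.
assert all(len(s & p) % 2 == 0 for s in stars for p in plaquettes)
print(len(stars), "stars,", len(plaquettes), "plaquettes,", 2 * L * L, "qubits")
```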

  14. Determination of the NPP Krsko reactor core safety limits using the COBRA-III-C code

    International Nuclear Information System (INIS)

    Lajtman, S.; Feretic, D.; Debrecin, N.

    1989-01-01

    This paper presents the NPP Krsko reactor core safety limits determined by the COBRA-III-C code, along with the methodology used. The reactor core safety limits determination is part of the reactor protection limits procedure. The results obtained were compared to the safety limits presented in the NPP Krsko FSAR. The COBRA-III-C NPP Krsko design core steady-state thermal-hydraulics calculation, used as the basis for the safety limits calculation, is presented as well. (author)

  15. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  16. Confidence level in the calculations of HCDA consequences using large codes

    International Nuclear Information System (INIS)

    Nguyen, D.H.; Wilburn, N.P.

    1979-01-01

    The probabilistic approach to nuclear reactor safety is playing an increasingly significant role. For the liquid-metal fast breeder reactor (LMFBR) in particular, the ultimate application of this approach could be to determine the probability of achieving the goal of a specific line-of-assurance (LOA). Meanwhile, a more pressing problem is that of quantifying the uncertainty in a consequence calculated for a hypothetical core disruptive accident (HCDA) using large codes. Such uncertainty arises from imperfect modeling of the phenomenology and/or from inaccuracy in input data. A method is presented to determine the confidence level in consequences calculated by a large computer code, given the known uncertainties in the input variables. A particular application was made to the initial time of pin failure in a transient overpower HCDA calculated by the code MELT-IIIA in order to demonstrate the method. A probability distribution function (pdf) for the time of failure was first constructed; the confidence level for predicting this failure parameter within a desired range was then determined.
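
    The idea of propagating known input uncertainties through a large code and reading off a confidence level can be sketched generically as below; the surrogate function merely stands in for a code such as MELT-IIIA, and the input distributions and range are assumptions for illustration.

```python
# Sketch of the uncertainty-propagation approach described above: sample the
# uncertain inputs, run the consequence model (here a surrogate), build an
# empirical distribution of the output, and read off a confidence level.
import numpy as np

rng = np.random.default_rng(1)

def surrogate_failure_time(power_ramp, gap_conductance):
    # Hypothetical stand-in for a code-calculated time of pin failure (s).
    return 10.0 + 2.0 / power_ramp + 0.5 * gap_conductance

n = 20_000
ramp = rng.normal(1.0, 0.1, n)   # assumed uncertainty of input variable 1
cond = rng.normal(1.0, 0.2, n)   # assumed uncertainty of input variable 2
t_fail = surrogate_failure_time(ramp, cond)

lo, hi = 11.5, 13.0              # desired range for the failure parameter
confidence = np.mean((t_fail >= lo) & (t_fail <= hi))
print(f"P({lo} s <= t_fail <= {hi} s) = {confidence:.3f}")
```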

  17. Data model description for the DESCARTES and CIDER codes

    International Nuclear Information System (INIS)

    Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.; Eslinger, P.W.

    1993-01-01

    The primary objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. One of the major objectives of the HEDR Project is to develop several computer codes to model the airborne releases, transport, and environmental accumulation of radionuclides resulting from Hanford operations from 1944 through 1972. In July 1992, the HEDR Project Manager determined that the computer codes being developed (DESCARTES, calculation of environmental accumulation from airborne releases, and CIDER, dose calculations from environmental accumulation) were not sufficient to create accurate models. A team of HEDR staff members developed a plan to ensure that the computer codes would meet HEDR Project goals. The plan consists of five tasks: (1) code requirements definition, (2) scoping studies, (3) design specifications, (4) benchmarking, and (5) data modeling. This report defines the data requirements for the DESCARTES and CIDER codes.

  18. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to under-estimate the peak cladding temperature (PCT), with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled by using the original PSI model of the code. The flow transition between the DFFB and IAFB is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed for the original and modified MARS codes for the Flecht-Seaset test and the RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A space grid effect, however, is clearly seen with the modified version of the MARS code. (author)

  19. Determining mode excitations of vacuum electronics devices via three-dimensional simulations using the SOS code

    Science.gov (United States)

    Warren, Gary

    1988-01-01

    The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and separately to compute the transient behavior of the same devices. A code, DOT, is created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes in a coupled-cavity traveling-wave tube (CCTWT) section excited by a beam are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined whether the transient waves are forward or backward waves for each case. Finally, the hot-test mode frequencies of the CCTWT section are computed.

  20. HYDROCOIN [HYDROlogic COde INtercomparison] Level 1: Benchmarking and verification test results with CFEST [Coupled Fluid, Energy, and Solute Transport] code: Draft report

    International Nuclear Information System (INIS)

    Yabusaki, S.; Cole, C.; Monti, A.M.; Gupta, S.K.

    1987-04-01

    Part of the safety analysis is evaluating groundwater flow through the repository and the host rock to the accessible environment by developing mathematical or analytical models and numerical computer codes describing the flow mechanisms. This need led to the establishment of an international project called HYDROCOIN (HYDROlogic COde INtercomparison) organized by the Swedish Nuclear Power Inspectorate, a forum for discussing techniques and strategies in subsurface hydrologic modeling. The major objective of the present effort, HYDROCOIN Level 1, is determining the numerical accuracy of the computer codes. The definition of each case includes the input parameters, the governing equations, the output specifications, and the format. The Coupled Fluid, Energy, and Solute Transport (CFEST) code was applied to solve cases 1, 2, 4, 5, and 7; the Finite Element Three-Dimensional Groundwater (FE3DGW) Flow Model was used to solve case 6. Case 3 has been ignored because unsaturated flow is not pertinent to SRP. This report presents the Level 1 results furnished by the project teams. The numerical accuracy of the codes is determined by (1) comparing the computational results with analytical solutions for cases that have analytical solutions (namely cases 1 and 4), and (2) intercomparing results from codes for cases which do not have analytical solutions (cases 2, 5, 6, and 7). Cases 1, 2, 6, and 7 relate to flow analyses, whereas cases 4 and 5 require nonlinear solutions. 7 refs., 71 figs., 9 tabs

  1. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...
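
    The defining invariance is easy to check by brute force for a small example; the sketch below (with made-up generators and r = s = 3) tests whether the simultaneous cyclic shift of both coordinate blocks maps the code onto itself.

```python
# Sketch: brute-force test of the Z2-double cyclic property for a small
# binary code. The generators and block lengths are hypothetical examples.
from itertools import product

r, s = 3, 3
generators = [(1, 0, 1, 1, 1, 0), (0, 1, 1, 0, 1, 1)]

def span(gens):
    code = set()
    for coeffs in product((0, 1), repeat=len(gens)):
        word = tuple(sum(c * g[i] for c, g in zip(coeffs, gens)) % 2
                     for i in range(r + s))
        code.add(word)
    return code

def double_shift(word):
    left, right = word[:r], word[r:]
    return (left[-1],) + left[:-1] + (right[-1],) + right[:-1]

code = span(generators)
print("Z2-double cyclic:", all(double_shift(c) in code for c in code))
```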

  2. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
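
    The two properties discussed above can be tested directly on any trinucleotide set: self-complementarity by closure under reverse complement, and circularity via acyclicity of the associated directed graph, following the graph-theoretic characterization the abstract refers to. The example set below is a small hypothetical code, not the code X identified in genes.

```python
# Sketch: self-complementarity and (graph-based) circularity tests for a
# trinucleotide code X. The acyclicity criterion follows the cited
# graph-theoretic framework; the example set is hypothetical.
import networkx as nx

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(t):
    return t.translate(COMP)[::-1]

def is_self_complementary(X):
    return all(revcomp(t) in X for t in X)

def code_graph(X):
    g = nx.DiGraph()
    for t in X:                   # each trinucleotide b1 b2 b3 contributes
        g.add_edge(t[0], t[1:])   # an edge b1 -> b2b3
        g.add_edge(t[:2], t[2])   # and an edge b1b2 -> b3
    return g

X = {"AAC", "GTT", "GAC", "GTC"}  # hypothetical example set
print("self-complementary:", is_self_complementary(X))
print("circular (graph is acyclic):",
      nx.is_directed_acyclic_graph(code_graph(X)))
```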

  3. 21 CFR 864.9320 - Copper sulfate solution for specific gravity determinations.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 (2010-04-01): Copper sulfate solution for specific gravity... Establishments That Manufacture Blood and Blood Products § 864.9320 Copper sulfate solution for specific gravity determinations. (a) Identification. A copper sulfate solution for specific gravity determinations is a device...

  4. Concentration of acrylamide in a polyacrylamide gel affects VP4 gene coding assignment of group A equine rotavirus strains with P[12] specificity

    Science.gov (United States)

    2010-01-01

    Background It is universally acknowledged that genome segment 4 of group A rotavirus, the major etiologic agent of severe diarrhea in infants and neonatal farm animals, encodes outer capsid neutralization and protective antigen VP4. Results To determine which genome segment of three group A equine rotavirus strains (H-2, FI-14 and FI-23) with P[12] specificity encodes the VP4, we analyzed dsRNAs of strains H-2, FI-14 and FI-23 as well as their reassortants by polyacrylamide gel electrophoresis (PAGE) at varying concentrations of acrylamide. The relative position of the VP4 gene of the three equine P[12] strains varied (either genome segment 3 or 4) depending upon the concentration of acrylamide. The VP4 gene bearing P[3], P[4], P[6], P[7], P[8] or P[18] specificity did not exhibit this phenomenon when the PAGE running conditions were varied. Conclusions The concentration of acrylamide in a PAGE gel affected VP4 gene coding assignment of equine rotavirus strains bearing P[12] specificity. PMID:20573245

  5. Radiological impact assessment in Malaysia using RESRAD computer code

    International Nuclear Information System (INIS)

    Syed Hakimi Sakuma Syed Ahmad; Khairuddin Mohamad Kontol; Razali Hamzah

    1999-01-01

    Radiological Impact Assessment (RIA) can be conducted in Malaysia by using the RESRAD computer code developed by Argonne National Laboratory, U.S.A. The code can perform analyses to derive site-specific guidelines for allowable residual concentrations of radionuclides in soil. Concepts of the RIA in the context of waste management concerns in Malaysia and some regulatory information are presented, and the status of data collection is assessed. Appropriate use scenarios and site-specific parameters are used as much as possible so as to be realistic, to reasonably ensure that individual dose limits and/or constraints will be achieved. A case study has been conducted to fulfil Atomic Energy Licensing Board (AELB) requirements, which oblige the operator to carry out a radiological impact assessment for all proposed disposals. This is to demonstrate that no member of the public will be exposed to more than 1 mSv/year from all activities. Results obtained from the analyses show that the RESRAD computer code is able to calculate doses, risks, and guideline values. Sensitivity analysis by the computer code shows that the parameters used as input are justified, which improves the confidence of the public and the AELB in the results of the analysis. The computer code can also be used to conduct an initial screening assessment in order to determine a proper disposal site. (Author)

  6. A regulatory code for neuron-specific odor receptor expression.

    Directory of Open Access Journals (Sweden)

    Anandasankar Ray

    2008-05-01

    Olfactory receptor neurons (ORNs) must select, from a large repertoire, which odor receptors to express. In Drosophila, most ORNs express one of 60 Or genes, and most Or genes are expressed in a single ORN class in a process that produces a stereotyped receptor-to-neuron map. The construction of this map poses a problem of receptor gene regulation that is remarkable in its dimension and about which little is known. By using a phylogenetic approach and the genome sequences of 12 Drosophila species, we systematically identified regulatory elements that are evolutionarily conserved and specific for individual Or genes of the maxillary palp. Genetic analysis of these elements supports a model in which each receptor gene contains a zip code, consisting of elements that act positively to promote expression in a subset of ORN classes, and elements that restrict expression to a single ORN class. We identified a transcription factor, Scalloped, that mediates repression. Some elements are used in other chemosensory organs, and some are conserved upstream of axon-guidance genes. Surprisingly, the odor response spectra and organization of maxillary palp ORNs have been extremely well-conserved for tens of millions of years, even though the amino acid sequences of the receptors are not highly conserved. These results, taken together, define the logic by which individual ORNs in the maxillary palp select which odor receptors to express.

  7. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
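
    For orientation, the classical Blahut-Arimoto iteration for an ordinary rate-distortion function is sketched below; the action-dependent extension proposed in the paper builds on this scheme but is not reproduced here.

```python
# Sketch: classical Blahut-Arimoto iteration for a rate-distortion function.
# Illustrative only; the action-dependent variant of the paper is not shown.
import numpy as np

def blahut_arimoto(p_x, d, beta, iters=500):
    """p_x: source pmf, d: distortion matrix d[x, xhat], beta: slope parameter."""
    q = np.full(d.shape[1], 1.0 / d.shape[1])      # reproduction pmf
    for _ in range(iters):
        w = q * np.exp(-beta * d)                  # unnormalised test channel
        phi = w / w.sum(axis=1, keepdims=True)     # phi[x, xhat] = p(xhat | x)
        q = p_x @ phi                              # updated reproduction pmf
    D = np.sum(p_x[:, None] * phi * d)             # distortion at convergence
    R = np.sum(p_x[:, None] * phi * np.log2(phi / q))   # rate in bits
    return R, D

# Binary source with Hamming distortion
p_x = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
print(blahut_arimoto(p_x, d, beta=3.0))
```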

  8. DATING: A computer code for determining allowable temperatures for dry storage of spent fuel in inert and nitrogen gases

    International Nuclear Information System (INIS)

    Simonen, E.P.; Gilbert, E.R.

    1988-12-01

    The DATING (Determining Allowable Temperatures in Inert and Nitrogen Gases) code can be used to calculate allowable initial temperatures for dry storage of light-water-reactor spent fuel. The calculations are based on the life fraction rule using both measured data and mechanistic equations as reported by Chin et al. (1986). The code is written in FORTRAN and utilizes an efficient numerical integration method for rapid calculations on IBM-compatible personal computers. This report documents the technical basis for the DATING calculations, describes the computational method and code statements, and includes a user's guide with examples. The software for the DATING code is available through the National Energy Software Center operated by Argonne National Laboratory, Argonne, Illinois 60439. 5 refs., 8 figs., 5 tabs
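
    The life fraction rule that DATING evaluates can be sketched generically: accumulate dt / t_rupture(T(t)) over the storage period and find the largest initial temperature for which the accumulated fraction stays below one. The temperature history and the Arrhenius-type rupture correlation below are illustrative assumptions, not the Chin et al. (1986) equations used by the code.

```python
# Sketch of a life-fraction-rule calculation: all correlations and constants
# are illustrative placeholders, not the equations used in DATING.
import numpy as np

def temperature_K(t_yr, T0_K, tau_yr=15.0, T_ambient_K=320.0):
    # assumed exponential decay of cladding temperature with decay heat
    return T_ambient_K + (T0_K - T_ambient_K) * np.exp(-t_yr / tau_yr)

def time_to_rupture_yr(T_K, A=3.0e-10, Q_over_R=1.5e4):
    # hypothetical Arrhenius-type creep-rupture correlation
    return A * np.exp(Q_over_R / T_K)

def life_fraction(T0_K, horizon_yr=40.0, steps=4000):
    t = np.linspace(0.0, horizon_yr, steps)
    return np.trapz(1.0 / time_to_rupture_yr(temperature_K(t, T0_K)), t)

lo, hi = 400.0, 700.0            # bracket for the allowable initial temperature
for _ in range(60):              # bisection on the accumulated damage fraction
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if life_fraction(mid) <= 1.0 else (lo, mid)
print(f"allowable initial temperature ~ {lo:.1f} K (illustrative)")
```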

  9. Acquisition and evolution of plant pathogenesis-associated gene clusters and candidate determinants of tissue-specificity in xanthomonas.

    Directory of Open Access Journals (Sweden)

    Hong Lu

    Xanthomonas is a large genus of plant-associated and plant-pathogenic bacteria. Collectively, members cause diseases on over 392 plant species. Individually, they exhibit marked host- and tissue-specificity. The determinants of this specificity are unknown. To assess potential contributions to host- and tissue-specificity, pathogenesis-associated gene clusters were compared across genomes of eight Xanthomonas strains representing vascular or non-vascular pathogens of rice, brassicas, pepper and tomato, and citrus. The gum cluster for extracellular polysaccharide is conserved except for gumN and sequences downstream. The xcs and xps clusters for type II secretion are conserved, except in the rice pathogens, in which xcs is missing. In the otherwise conserved hrp cluster, sequences flanking the core genes for type III secretion vary with respect to insertion sequence element and putative effector gene content. Variation at the rpf (regulation of pathogenicity factors) cluster is more pronounced, though genes with established functional relevance are conserved. A cluster for synthesis of lipopolysaccharide varies highly, suggesting multiple horizontal gene transfers and reassortments, but this variation does not correlate with host- or tissue-specificity. Phylogenetic trees based on amino acid alignments of gum, xps, xcs, hrp, and rpf cluster products generally reflect strain phylogeny. However, amino acid residues at four positions correlate with tissue specificity, revealing hpaA and xpsD as candidate determinants. Examination of genome sequences of xanthomonads Xylella fastidiosa and Stenotrophomonas maltophilia revealed that the hrp, gum, and xcs clusters are recent acquisitions in the Xanthomonas lineage. Our results provide insight into the ancestral Xanthomonas genome and indicate that differentiation with respect to host- and tissue-specificity involved not major modifications or wholesale exchange of clusters, but subtle changes in a small

  10. MOCARS: a Monte Carlo code for determining the distribution and simulation limits

    International Nuclear Information System (INIS)

    Matthews, S.D.

    1977-07-01

    MOCARS is a computer program designed for the INEL CDC 76-173 operating system to determine the distribution and simulation limits for a function by Monte Carlo techniques. The code randomly samples data from any of the 12 user-specified distributions and then either evaluates the cut set system unavailability or a user-specified function with the sample data. After the data are ordered, the values at various quantiles and the associated confidence bounds are calculated for output. Also available for output on microfilm are the frequency and cumulative distribution histograms from the sample data. 29 figures, 4 tables
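
    The ordering and quantile step can be illustrated with a short, generic sketch: draw a sample, evaluate a user-specified function, sort the results, and attach a distribution-free confidence bound from order statistics. This is not the MOCARS source; the distribution and function are arbitrary examples.

```python
# Sketch: empirical quantile of a simulated function value with a
# distribution-free upper confidence bound from order statistics.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(7)
n = 1000
x = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # user-specified distribution
y = np.sort(1.0 - np.exp(-x))                    # user-specified function, ordered

q, conf = 0.95, 0.95
point = y[int(np.ceil(q * n)) - 1]               # empirical 95th percentile
k = int(binom.ppf(conf, n, q)) + 1               # order-statistic index for the bound
upper = y[min(k, n) - 1]
print(f"95th percentile ~ {point:.4f}, {conf:.0%} upper bound ~ {upper:.4f}")
```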

  11. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code in a way that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  12. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified

  13. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  14. Conservation and sex-specific splicing of the doublesex gene

    Indian Academy of Sciences (India)

    Genetic control of sex determination in insects has been best characterized in Drosophila melanogaster, where the master gene Sxl codes for RNA that is spliced in a sex-specific manner to produce a functional protein only in females. SXL regulates the sex-specific splicing of transformer (tra) RNA which, in turn, regulates the ...

  15. Improvement of Secret Image Invisibility in Circulation Image with Dyadic Wavelet Based Data Hiding with Run-Length Coded Secret Images of Which Location of Codes are Determined with Random Number

    OpenAIRE

    Kohei Arai; Yuji Yamada

    2011-01-01

    An attempt is made to improve the invisibility of secret images in circulation images using dyadic-wavelet-based data hiding with run-length-coded secret images whose code locations are determined by random numbers. Through experiments, it is confirmed that the secret images are almost invisible in the circulation images. The robustness of the proposed data hiding method against data compression of the circulation images is also discussed. Data hiding performance in terms of invisibility of secret images...

  16. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.

    1979-01-01

    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed
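
    The automatic peak location and fitting steps can be illustrated with generic tools; the sketch below applies standard peak finding and a Gaussian-plus-background fit to a synthetic spectrum and is not taken from CERPI or CEREL.

```python
# Sketch: automatic peak location and Gaussian fitting on a synthetic
# Ge(Li) spectrum. Generic illustration; not the CERPI/CEREL source.
import numpy as np
from scipy.signal import find_peaks
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
channels = np.arange(4096)
model = 50.0 * np.exp(-channels / 2000.0)                           # continuum
model += 400.0 * np.exp(-0.5 * ((channels - 1332) / 3.0) ** 2)      # photopeak
spectrum = rng.poisson(model).astype(float)

peaks, _ = find_peaks(spectrum, prominence=100)                     # peak location

def gauss(x, area, centroid, sigma, bkg):
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(
        -0.5 * ((x - centroid) / sigma) ** 2) + bkg

for p in peaks:                                                     # peak fitting
    lo, hi = p - 15, p + 16
    popt, _ = curve_fit(gauss, channels[lo:hi], spectrum[lo:hi],
                        p0=[1000.0, float(p), 3.0, spectrum[lo]])
    area, centroid, sigma, bkg = popt
    print(f"centroid = {centroid:.1f} ch, net area = {area:.0f} counts")
```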

  17. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made of pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflicts with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report.

  18. DETERMINATION OF THE SPECIFIC GROWTH RATE ON ...

    African Journals Online (AJOL)

    Sewage generation is one of the pressing problems Nigerians encounter on a daily basis, mostly in urbanized areas where factories and industries are located. This paper is aimed at determining the specific growth rate “K” of biological activities in cassava wastewater during degradation, using the Michaelis-Menten equation.

  19. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  20. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most widely recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally, the specific algorithm applied in fuel depletion calculations is outlined. (author)

  1. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  2. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  3. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    The TASS 1.0 code has been developed at KAERI for initial and reload non-LOCA safety analyses for the PWRs operating as well as under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). The malfunctions of the control systems, components, and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analyses for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  4. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal neutron activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, C.

    1978-01-01

    A description is given of a computer code which automatically analyses gamma-ray spectra obtained with Ge(Li) detectors. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification and calculation of masses and errors. Finally, the results obtained with our computer code for a lunar sample are reported and briefly discussed.

  5. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D Division of Électricité de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  6. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they have generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing enhancer is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  7. Code Shift: Grid Specifications and Dynamic Wind Turbine Models

    DEFF Research Database (Denmark)

    Ackermann, Thomas; Ellis, Abraham; Fortmann, Jens

    2013-01-01

    Grid codes (GCs) and dynamic wind turbine (WT) models are key tools to allow increasing renewable energy penetration without challenging security of supply. In this article, the state of the art and the further development of both tools are discussed, focusing on the European and North American e...

  8. Conceptual Approach to Forming the Basic Code of Neo-Industrial Development of a Region

    Directory of Open Access Journals (Sweden)

    Elena Leonidovna Andreeva

    2017-09-01

    In the article, the authors propose the conceptual fundamentals of the “code approach” to the regional neo-industrial development. The purpose of the research is to reveal the essence of the transition to a new type of industrial and economic relations through a prism of “genetic codes” of the region. We consider these codes as a system of the “racial memory” of a territory, which determines the specificity and features of neo-industrialization realization. We substantiated the hypothesis about the influence of the “genetic codes” of the region on the effectiveness of the neo-industrialization. We have defined the participants, or else the carriers of the codes in the transformation of regional inheritance for the stimulation of the neoindustrial development of region’s economy. The subject matter of the research is the distinctive features of the functioning of the determinative region’s codes. Their content determines the socio-economic specificity of the region and the features of innovative, informational, value-based and competence-based development of the territory. The determinative codes generate the dynamic codes of the region, which are understood as their derivatives. They have a high probability of occurrence, higher speed of development and distribution, internal forces that make possible the self-development of the region. The scientific contribution is the substantiation of the basic code of the regional neo-industrial development. It represents the evolutionary accumulation of the rapid changes of its innovative, informational, value-based and competence-based codes stimulating the generation and implementation of new ideas regarding to economic entities adapted to the historical and cultural conditions. The article presents the code model of neo-industrial development of the region described by formulas. We applied the system analysis methods, historical and civilization approaches, evolutionary and

  9. ASME Code requirements for multi-canister overpack design and fabrication

    International Nuclear Information System (INIS)

    SMITH, K.E.

    1998-01-01

    The baseline requirements for the design and fabrication of the MCO include the application of the technical requirements of the ASME Code, Section III, Subsection NB for containment and Section III, Subsection NG for criticality control. ASME Code administrative requirements, which have not historically been applied at the Hanford site and which have not been required by the US Nuclear Regulatory Commission (NRC) for licensed spent fuel casks/canisters, were not invoked for the MCO. As a result of recommendations made by an ASME Code consultant in response to DNFSB staff concerns regarding ASME Code application, the SNF Project will be making the following modifications: issue an ASME Code Design Specification and Design Report, certified by a Registered Professional Engineer; require the MCO fabricator to hold ASME Section III or Section VIII, Division 2 accreditation; and use ASME Authorized Inspectors for MCO fabrication. Incorporation of these modifications will ensure that the MCO is designed and fabricated in accordance with the ASME Code. Code Stamping has not been a requirement at the Hanford site, nor for NRC-licensed spent fuel casks/canisters, but will be considered if determined to be economically justified.

  10. Country-specific determinants of world university rankings.

    Science.gov (United States)

    Pietrucha, Jacek

    2018-01-01

    This paper examines country-specific factors that affect the three most influential world university rankings (the Academic Ranking of World Universities, the QS World University Ranking, and the Times Higher Education World University Ranking). We run a cross sectional regression that covers 42-71 countries (depending on the ranking and data availability). We show that the position of universities from a country in the ranking is determined by the following country-specific variables: economic potential of the country, research and development expenditure, long-term political stability (freedom from war, occupation, coups and major changes in the political system), and institutional variables, including government effectiveness.

  11. Determining Attitudes of Postgraduate Students towards Scientific Research and Codes of Conduct, Supported by Digital Script

    Science.gov (United States)

    Tavukcu, Tahir

    2016-01-01

    This research aims to determine the effect of digital script support on the attitudes of postgraduate students towards scientific research and codes of conduct. It is a quantitative study designed according to a pre-test and post-test model with experiment and control groups. In both groups, lessons…

  12. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
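
    The stabilized linear inversion step can be sketched generically as a damped (Tikhonov) least-squares solve; the sensitivity matrix, data, and damping value below are made up and do not come from the SEARCH/TREND/INVERT/AVERAGE codes.

```python
# Sketch: damped (Tikhonov) least-squares inversion of synthetic gravity
# data for a model vector m. Illustrative only; not the described codes.
import numpy as np

rng = np.random.default_rng(3)
n_data, n_model = 60, 40
G = rng.normal(size=(n_data, n_model))          # linearised forward operator
m_true = np.sin(np.linspace(0, 3 * np.pi, n_model))
d = G @ m_true + rng.normal(scale=0.05, size=n_data)

alpha = 0.5                                     # stabilisation (damping) weight
A = np.vstack([G, alpha * np.eye(n_model)])     # augmented system:
b = np.concatenate([d, np.zeros(n_model)])      #   min ||Gm - d||^2 + alpha^2 ||m||^2
m_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("relative model misfit:",
      np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))
```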

  13. Non-coding RNAs in the Ovarian Follicle

    Directory of Open Access Journals (Sweden)

    Rosalia Battaglia

    2017-05-01

    The mammalian ovarian follicle is the complex reproductive unit comprising the germ cell, somatic cells (Cumulus and Granulosa cells), and follicular fluid (FF): paracrine communication among the different cell types through FF ensures the development of a mature oocyte ready for fertilization. This paper is focused on non-coding RNAs in ovarian follicles and their predicted role in the pathways involved in oocyte growth and maturation. We determined the expression profiles of microRNAs in human oocytes and FF by high-throughput analysis and identified 267 microRNAs in FF and 176 in oocytes. Most of these were FF microRNAs, while 9 were oocyte specific. By bioinformatic analysis, independently performed on FF and oocyte microRNAs, we identified the most significant Biological Processes and the pathways regulated by their validated targets. We found many pathways shared between the two compartments and some specific for oocyte microRNAs. Moreover, we found 41 long non-coding RNAs able to interact with oocyte microRNAs and potentially involved in the regulation of folliculogenesis. These data are important in basic reproductive research and could also be useful for clinical applications. In fact, the characterization of non-coding RNAs in ovarian follicles could improve reproductive disease diagnosis, provide biomarkers of oocyte quality in Assisted Reproductive Treatment, and allow the development of therapies for infertility disorders.

  14. Critical lengths of error events in convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1994-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  15. Critical Lengths of Error Events in Convolutional Codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Andersen, Jakob Dahl

    1998-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  16. CODE's new solar radiation pressure model for GNSS orbit determination

    Science.gov (United States)

    Arnold, D.; Meindl, M.; Beutler, G.; Dach, R.; Schaer, S.; Lutz, S.; Prange, L.; Sośnica, K.; Mervart, L.; Jäggi, A.

    2015-08-01

    The Empirical CODE Orbit Model (ECOM) of the Center for Orbit Determination in Europe (CODE), which was developed in the early 1990s, is widely used in the International GNSS Service (IGS) community. For a rather long time, spurious spectral lines have been known to exist in geophysical parameters, in particular in the Earth Rotation Parameters (ERPs) and in the estimated geocenter coordinates, which could recently be attributed to the ECOM. These effects have grown gradually with the increasing influence of the GLONASS system in recent years in the CODE analysis, which has been based on a rigorous combination of GPS and GLONASS since May 2003. In a first step we show that the problems associated with the ECOM are to the largest extent caused by the GLONASS, which was reaching full deployment by the end of 2011. GPS-only, GLONASS-only, and combined GPS/GLONASS solutions using the observations in the years 2009-2011 of a global network of 92 combined GPS/GLONASS receivers were analyzed for this purpose. In a second step we review direct solar radiation pressure (SRP) models for GNSS satellites. We demonstrate that only even-order short-period harmonic perturbations acting along the direction Sun-satellite occur for GPS and GLONASS satellites, and only odd-order perturbations acting along the direction perpendicular to both the Sun-satellite vector and the spacecraft's solar panel axis. Based on this insight we assess in the third step the performance of four candidate orbit models for the future ECOM. The geocenter coordinates, the ERP differences w.r.t. the IERS 08 C04 series of ERPs, the misclosures for the midnight epochs of the daily orbital arcs, and scale parameters of Helmert transformations for station coordinates serve as quality criteria. The old and updated ECOM are validated in addition with satellite laser ranging (SLR) observations and by comparing the orbits to those of the IGS and other analysis centers. Based on all tests, we present a new extended ECOM which
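
    The structure described above (even-order harmonics along the Sun-satellite direction D and odd-order harmonics along B) can be written down as a short parameterisation sketch; the truncation orders and coefficient values below are assumptions for illustration, not the coefficients estimated by CODE.

```python
# Sketch of an extended-ECOM-style SRP parameterisation in the Sun-oriented
# D/Y/B frame. Truncation orders and coefficients are illustrative only.
import numpy as np

def ecom_acceleration(du, c):
    """du: argument of latitude of the satellite relative to the Sun (rad).
    c: dict of empirical coefficients (m/s^2). Returns (a_D, a_Y, a_B)."""
    a_D = (c["D0"]
           + c["D2c"] * np.cos(2 * du) + c["D2s"] * np.sin(2 * du)
           + c["D4c"] * np.cos(4 * du) + c["D4s"] * np.sin(4 * du))
    a_Y = c["Y0"]
    a_B = c["B0"] + c["B1c"] * np.cos(du) + c["B1s"] * np.sin(du)
    return np.array([a_D, a_Y, a_B])

coeff = dict(D0=-9.0e-8, D2c=1.0e-9, D2s=0.0, D4c=5.0e-10, D4s=0.0,
             Y0=5.0e-10, B0=1.0e-9, B1c=2.0e-10, B1s=0.0)
print(ecom_acceleration(np.pi / 3, coeff))
```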

  17. Quantitative software-reliability analysis of computer codes relevant to nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.

    1981-12-01

    This report presents the results of the first year of an ongoing research program to determine the probability of failure characteristics of computer codes relevant to nuclear safety. An introduction to both qualitative and quantitative aspects of nuclear software is given. A mathematical framework is presented which will enable the a priori prediction of the probability of failure characteristics of a code given the proper specification of its properties. The framework consists of four parts: (1) a classification system for software errors and code failures; (2) probabilistic modeling for selected reliability characteristics; (3) multivariate regression analyses to establish predictive relationships among reliability characteristics and generic code property and development parameters; and (4) the associated information base. Preliminary data of the type needed to support the modeling and the predictions of this program are described. Illustrations of the use of the modeling are given but the results so obtained, as well as all results of code failure probabilities presented herein, are based on data which at this point are preliminary, incomplete, and possibly non-representative of codes relevant to nuclear safety

  18. Analysis of preservice inspection relief requests and recommendations for ASME code changes

    International Nuclear Information System (INIS)

    Cook, J.F.

    1985-05-01

    NRC regulations require that preservice inspection (PSI) of nuclear plants be performed in accordance with referenced editions and addenda of Division 1 rules of Section XI, ''Rules for Inservice Inspection of Nuclear Power Plant Components'', of the ASME Boiler and Pressure Vessel Code (ASME Code). The regulations permit applicants to request and obtain relief from the NRC from specific ASME Code requirements that are determined to be impractical. Applicant requests for relief from preservice inspection (PSI) requirements were compiled and analyzed. From this data, covering a total of 178 relief requests, common problems with examination requirements were identified. Changes to examination requirements to solve selected problems are proposed. By following later ASME Code requirements, 46 out of the 178 relief requests can be eliminated. Implementing proposed Code changes would eliminate another 25 relief requests, leaving 107 relief requests out of the original 178 relief requests covered by this survey

  19. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    output includes a plot of the MAAP calculation and the plant data. For the large integral experiments, a major part, but not all, of the MAAP code is needed. These use an experiment-specific benchmark routine that includes all of the information and boundary conditions for performing the calculation, as well as the information of which parts of MAAP are unnecessary and can be 'bypassed'. Lastly, the separate effects tests only require a few MAAP routines. These are exercised through their own specific benchmark routine that includes the experiment-specific information and boundary conditions. This benchmark routine calls the appropriate MAAP routines from the source code, performs the calculations, including integration where necessary, and provides the comparison between the MAAP calculation and the experimental observations. (author)

  20. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy machinery of algebraic geometry.
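
    A familiar member of this family is the binary Reed-Muller code viewed as an evaluation code: monomials of bounded degree are evaluated at all points of F_2^m. The sketch below builds such a generator matrix and only illustrates the evaluation-code idea, not the order-function construction itself.

```python
# Sketch: generator matrix of the binary Reed-Muller code RM(r, m) built as
# an evaluation code (monomials of degree <= r evaluated on F_2^m).
import numpy as np
from itertools import combinations, product

def reed_muller_generator(r, m):
    points = list(product((0, 1), repeat=m))          # all points of F_2^m
    rows = []
    for deg in range(r + 1):
        for vars_ in combinations(range(m), deg):     # monomial x_{i1}...x_{id}
            rows.append([int(all(p[i] for i in vars_)) for p in points])
    return np.array(rows)

G = reed_muller_generator(r=1, m=3)                   # RM(1, 3): an [8, 4, 4] code
print(G)
# the monomial rows are linearly independent over F_2, so dim = number of rows
print("length =", G.shape[1], ", dimension =", G.shape[0])
```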

  1. A molecular-gap device for specific determination of mercury ions

    Science.gov (United States)

    Guo, Zheng; Liu, Zhong-Gang; Yao, Xian-Zhi; Zhang, Kai-Sheng; Chen, Xing; Liu, Jin-Huai; Huang, Xing-Jiu

    2013-11-01

    Specific determination/monitoring of trace mercury ions (Hg2+) in environmental water is of significant importance for drinking safety. Complementarily to conventional inductively coupled plasma mass spectrometry and atomic emission/absorption spectroscopy, several methods, i.e., electrochemical, fluorescent, colorimetric, and surface enhanced Raman scattering approaches, have been developed recently. Despite great success, many inevitably encounter the interferences from other metal ions besides the complicated procedures and sophisticated equipments. Here we present a molecular-gap device for specific determination of trace Hg2+ in both standardized solutions and environmental samples based on conductivity-modulated glutathione dimer. Through a self-assembling technique, a thin film of glutathione monolayer capped Au nanoparticles is introduced into 2.5 μm-gap-electrodes, forming numerous double molecular layer gaps. Notably, the fabricated molecular-gap device shows a specific response toward Hg2+ with a low detection limit actually measured down to 1 nM. Theoretical calculations demonstrate that the specific sensing mechanism greatly depends on the electron transport ability of glutathione dimer bridged by heavy metal ions, which is determined by its frontier molecular orbital, not the binding energy.

  2. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes is determined.

  3. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Science.gov (United States)

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    Clinical coding systems have been developed to translate real-world healthcare information such as prescriptions, diagnoses and procedures into standardized codes appropriate for use in large healthcare datasets. Due to the lack of information on coding system characteristics and insufficient uniformity in coding practices, there is a growing need for better understanding of coding systems and their use in pharmacoepidemiology and observational real world data research. To determine: 1) the number of available coding systems and their characteristics, 2) which pharmacoepidemiology databases they are adopted in, 3) what outcomes and exposures can be identified from each coding system, and 4) how robust they are with respect to consistency and validity in pharmacoepidemiology and observational database studies. Electronic literature database and unpublished literature searches, as well as hand searching of relevant journals, were conducted to identify eligible articles discussing characteristics and applications of coding systems in use and published in the English language between 1986 and 2016. Characteristics considered included type of information captured by codes, clinical setting(s) of use, adoption by a pharmacoepidemiology database, region, and available mappings. Applications articles describing the use and validity of specific codes, code lists, or algorithms were also included. Data extraction was performed independently by two reviewers and a narrative synthesis was performed. A total of 897 unique articles and 57 coding systems were identified, 17% of which included country-specific modifications or multiple versions. Procedures (55%), diagnoses (36%), drugs (38%), and site of disease (39%) were most commonly and directly captured by these coding systems. The systems were used to capture information from the following clinical settings: inpatient (63%), ambulatory (55%), emergency department (ED, 34%), and pharmacy (13%). More than half of all coding

  4. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants have been built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME Code which reflect Subsection NF are identified and compared with later Code editions and addenda, especially up to and including the 1974 ASME Code used as the basis for the plant qualification. The concern of the regulatory agencies is that if later code allowables and provisions are adopted, it is possible to reduce the safety margins of the construction code. Areas of concern are highlighted and the specific changes of later codes are discerned, adoption of which would not sacrifice the intended safety margins of the codes to which plants are licensed.

  5. Tandem Mirror Reactor Systems Code (Version I)

    International Nuclear Information System (INIS)

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  6. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
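
    The core point, that a handful of observed errors in a large simulation still leaves considerable statistical uncertainty, can be illustrated with an exact binomial (Clopper-Pearson) upper confidence bound. The sketch below is a minimal illustration of that general idea, not a reproduction of the method developed in the report; the trial counts are hypothetical.

        # Illustrative sketch: exact upper confidence bound on the decoding error
        # probability when only k errors are observed in n simulated trials.
        from scipy.stats import beta

        def upper_bound(k, n, confidence=0.95):
            """Clopper-Pearson upper confidence bound on the per-trial error probability."""
            if k >= n:
                return 1.0
            return beta.ppf(confidence, k + 1, n - k)

        # Example: two decoding errors observed in one million trials.
        print(upper_bound(2, 1_000_000))   # ~6.3e-6, well above the naive estimate 2e-6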

  7. Practical analysis of specificity-determining residues in protein families.

    Science.gov (United States)

    Chagoyen, Mónica; García-Martín, Juan A; Pazos, Florencio

    2016-03-01

    Determining the residues that are important for the molecular activity of a protein is a topic of broad interest in biomedicine and biotechnology. This knowledge can help in understanding the protein's molecular mechanism as well as in fine-tuning its natural function, eventually with biotechnological or therapeutic implications. Some of the protein residues are essential for the function common to all members of a family of proteins, while others explain the particular specificities of certain subfamilies (like binding to different substrates or cofactors and distinct binding affinities). Owing to the difficulty in experimentally determining them, a number of computational methods have been developed to detect these functional residues, generally known as 'specificity-determining positions' (or SDPs), from a collection of homologous protein sequences. These methods are mature enough to be routinely used by molecular biologists in directing experiments aimed at getting insight into the functional specificity of a family of proteins and eventually modifying it. In this review, we summarize some of the recent discoveries achieved through SDP computational identification in a number of relevant protein families, as well as the main approaches and software tools available to perform this type of analysis. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. A method for scientific code coupling in a distributed environment; Une methodologie pour le couplage de codes scientifiques en environnement distribue

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, C; Beaucourt, D; Chen, O; Nicolas, G; Peniguel, C; Rascle, P; Richard, N; Thai Van, D; Yessayan, A

    1994-12-01

    This guide book deals with the coupling of big scientific codes. First, the context is introduced: big scientific codes devoted to a specific discipline are coming to maturity, while there are more and more needs for multi-disciplinary studies. Then we describe different kinds of code coupling and an example of code coupling: the 3D thermal-hydraulic code THYC and the 3D neutronics code COCCINELLE. With this example we identify the problems to be solved to realize a coupling. We present the different numerical methods usable for the resolution of coupling terms. This leads to two kinds of coupling: with weak coupling, explicit methods can be used, whereas strong coupling requires implicit methods. In both cases, we analyze the link with the way the codes are parallelized. For the translation of data from one code to another, we define the notion of a Standard Coupling Interface based on a general data structure. This general structure constitutes an intermediary between the codes, thus allowing the codes a relative independence from a specific coupling. The proposed method for the implementation of a coupling leads to a simultaneous run of the different codes, while they exchange data. Two kinds of data communication with message exchange are proposed: direct communication between codes using the PVM product (Parallel Virtual Machine) and indirect communication through a coupling tool. This second way, with a general code coupling tool, is based on a coupling method, and we strongly recommend its use. The method is based on the two following principles: re-usability, which means few modifications to existing codes, and the definition of a code usable for coupling, which separates the design of a code usable for coupling from the realization of a specific coupling. This coupling tool, available from the beginning of 1994, is described in general terms. (authors). figs., tabs.

  9. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
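
    As a concrete illustration of what a quantitative comparison (rather than a statement such as ''agreement is within 10%'') might look like, the sketch below ranks two hypothetical codes against the same experimental data using a normalized root-mean-square deviation. This figure of merit and the numbers are illustrative assumptions, not the specific method proposed in the paper.

        import numpy as np

        def normalized_rms_error(predicted, measured):
            """RMS deviation of code predictions from experiment, normalized by the
            measured values; smaller values indicate better agreement."""
            predicted = np.asarray(predicted, dtype=float)
            measured = np.asarray(measured, dtype=float)
            return np.sqrt(np.mean(((predicted - measured) / measured) ** 2))

        measured = [1.02, 0.98, 1.10, 0.95]
        print(normalized_rms_error([1.00, 1.05, 1.00, 1.00], measured))  # code A
        print(normalized_rms_error([1.10, 0.90, 1.25, 0.80], measured))  # code B (worse agreement)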

  10. Development of the Computer Code to Determine an Individual Radionuclides in the Rad-wastes Container for Ulchin Units 3 and 4

    Energy Technology Data Exchange (ETDEWEB)

    Kang, D.W.; Chi, J.H.; Goh, E.O. [Korea Electric Power Research Institute, Taejon (Korea)

    2001-07-01

    A computer program, RASSAY, was developed to evaluate accurately the activities of various nuclides in the rad-waste container for Ulchin units 3 and 4. This is the final report of the project, ''Development of the Computer Code to Determine an Individual Radionuclides in the Rad-wastes Container for Ulchin Units 3 and 4'', and includes the following: 1) Structure of the computer code, RASSAY 2) An example of surface dose calculation by computer simulation using the MCNP code 3) Methods of sampling and activity measurement of various rad-wastes. (author). 21 refs., 35 figs., 6 tabs.

  11. Technical Specifications of Structural Health Monitoring for Highway Bridges: New Chinese Structural Health Monitoring Code

    Directory of Open Access Journals (Sweden)

    Fernando Moreu

    2018-03-01

    Governments and professional groups related to civil engineering write and publish standards and codes to protect the safety of critical infrastructure. In recent decades, countries have developed codes and standards for structural health monitoring (SHM). During this same period, rapid growth in the Chinese economy has led to massive development of civil engineering infrastructure design and construction projects. In 2016, the Ministry of Transportation of the People’s Republic of China published a new design code for SHM systems for large highway bridges. This document is the first technical SHM code by a national government that enforces sensor installation on highway bridges. This paper summarizes the existing international technical SHM codes for various countries and compares them with the new SHM code required by the Chinese Ministry of Transportation. This paper outlines the contents of the new Chinese SHM code and explains its relevance for the safety and management of large bridges in China, introducing key definitions of the Chinese–United States SHM vocabulary and their technical significance. Finally, this paper discusses the implications for the design and implementation of future SHM codes, with suggestions for similar efforts in the United States and other countries.

  12. System Design Description for the TMAD Code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System

  13. Triple-Frequency Code-Phase Combination Determination: A Comparison with the Hatch-Melbourne-Wübbena Combination Using BDS Signals

    Directory of Open Access Journals (Sweden)

    Chenlong Deng

    2018-02-01

    Considering the influence of the ionosphere, troposphere, and other systematic errors on double-differenced ambiguity resolution (AR), we present an optimal triple-frequency code-phase combination determination method driven by both the model and the real data. The new method makes full use of triple-frequency code measurements (especially the low-noise code on the B3 signal) to minimize the total noise level and achieve the largest AR success rate (model-driven) under different ionosphere residual situations (data-driven), thus speeding up the AR by directly rounding. With the triple-frequency Beidou Navigation Satellite System (BDS) data collected at five stations from a continuously-operating reference station network in Guangdong Province of China, different testing scenarios are defined (a medium baseline, whose distance is between 20 km and 50 km; a medium-long baseline, whose distance is between 50 km and 100 km; and a long baseline, whose distance is larger than 100 km). The efficiency of the optimal code-phase combination on the AR success rate was compared with that of the geometry-free and ionosphere-free (GIF) combination and the Hatch-Melbourne-Wübbena (HMW) combination. Results show that the optimal combinations can always achieve better results than the HMW combination with B2 and B3 signals, especially when the satellite elevation angle is larger than 45°. For the wide-lane AR which aims to obtain decimeter-level kinematic positioning service, the standard deviation (STD) of ambiguity residuals for the suboptimal combination is only about 0.2 cycles, and the AR success rate by directly rounding can be up to 99%. Compared with the HMW combinations using B1 and B2 signals and using B1 and B3 signals, the suboptimal combination achieves the best results in all baselines, with an overall improvement of about 40% and 20%, respectively. Additionally, the STD difference between the optimal and the GIF code-phase combinations decreases
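
    One ingredient of choosing such a combination is how observation noise propagates through the combination coefficients. The sketch below shows only that generic propagation step, assuming independent noise per signal; the coefficients and noise levels are hypothetical and are not the optimal values derived in the paper.

        import math

        def combination_noise(coeffs, sigmas):
            """Standard deviation of a linear combination sum(c_i * obs_i),
            assuming independent zero-mean noise on each signal."""
            return math.sqrt(sum((c * s) ** 2 for c, s in zip(coeffs, sigmas)))

        # Hypothetical code noise (metres) on three BDS signals, with the B3 code
        # assumed to be the least noisy, as noted in the abstract above.
        sigma_code = [0.30, 0.30, 0.10]
        print(combination_noise([1.0, -2.0, 1.0], sigma_code))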

  14. Determination of specific activity of phosphorus-32 labelled o-phosphoric acid

    International Nuclear Information System (INIS)

    Sane, S.U.

    2015-01-01

    Phosphorus-32 is one of the important radioisotopes used in therapeutic nuclear medicine. This work was aimed at developing a fast and sensitive procedure to determine trace amounts of 32P, which is present in various acidic chemical forms, thereby enabling its specific activity to be determined. The method utilizes ammonium molybdate and metol for complexing with phosphorus in the presence of sulphuric acid, and the complex was measured using a UV-VIS spectrophotometer. The phosphate and molybdate ions form a stable complex which turns blue (molybdenum blue) by reduction with sulphuric acid. The absorbance of the complex thus formed was measured at 700 nm. Five batches of 32P produced were analyzed using the procedure and the specific activity was determined. It was found that the radioactivity of 32P did not interfere with the absorbance measurements and the method could be successfully adopted for the determination of the specific activity of 32P. The method can also be used to find the chemical purity of radioactive phosphorus (32P) in quality control analysis. (author)
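
    The arithmetic implied by the procedure is a calibration curve of absorbance at 700 nm against phosphorus concentration, followed by division of the measured 32P activity by the phosphorus mass found in the sample. The sketch below illustrates that calculation with entirely hypothetical calibration and sample values.

        import numpy as np

        # Hypothetical calibration standards: phosphorus concentration (ug/mL)
        # versus absorbance of the molybdenum-blue complex at 700 nm.
        conc_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
        abs_std = np.array([0.00, 0.11, 0.22, 0.45, 0.88])
        slope, intercept = np.polyfit(conc_std, abs_std, 1)   # linear calibration A = m*c + b

        def phosphorus_conc(absorbance):
            """Phosphorus concentration (ug/mL) read off the calibration line."""
            return (absorbance - intercept) / slope

        sample_abs = 0.33        # hypothetical measured absorbance of the sample
        sample_volume_ml = 10.0
        activity_mbq = 185.0     # hypothetical measured 32P activity of the same sample

        p_mass_ug = phosphorus_conc(sample_abs) * sample_volume_ml
        print(activity_mbq / p_mass_ug, "MBq per microgram of phosphorus")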

  15. Comparison of design margin for core shroud in between design and construction code and fitness-for-service code

    International Nuclear Information System (INIS)

    Dozaki, Koji

    2007-01-01

    Structural design methods for the core shroud of a BWR are specified in the JSME Design and Construction Code, like the ASME Boiler and Pressure Vessel Code Sec. III, as part of the core support structure. Design margins are defined according to the combination of the structural design method selected and the service limit considered. Basically, those margins in the JSME Code were determined after ASME Sec. III. Designers can select the so-called twice-slope method for core shroud design among those design methods. On the other hand, flaw evaluation rules have been established for the core shroud in the JSME Fitness-for-Service Code. The twice-slope method is also adopted for fracture evaluation in that code, even when the core shroud contains a flaw. Design margins were determined as structural factors separately from the Design and Construction Code. As a natural consequence, there is a difference in design margins between the two codes. In this paper, the design margin in the Fitness-for-Service Code is shown to be conservative based on experimental evidence. A comparison of design margins between the two codes is discussed. (author)

  16. The nuclear codes and guidelines

    International Nuclear Information System (INIS)

    Sonter, M.

    1984-01-01

    This paper considers problems faced by the mining industry when implementing the nuclear codes of practice. Errors of interpretation are likely. A major criticism is that the guidelines to the codes must be seen as recommendations only. They are not regulations. Specific clauses in the guidelines are criticised.

  17. A rapid challenge protocol for determination of non-specific bronchial responsiveness

    DEFF Research Database (Denmark)

    Madsen, F; Nielsen, N H; Holstein-Rathlou, N H

    1986-01-01

    A rapid method for determination of non-specific bronchial hyperreactivity was developed. Resistance to breathing was determined by a modified expiratory airway interrupter technique and combined with a dosimeter-controlled nebulizer which made continuous determination of response possible during challenge. The patients inhaled histamine chloride 8 mg/ml at every eighth breath until resistance to breathing (Rt) was increased by 60%. The number of inhalations (NI) or the provocative concentration (PC60-Rt) of histamine increasing Rt by 60% were determined in 68 patients. The new method correlated... hyperreactivity since individual dose titration is easily performed, and the method could be valuable in epidemiological and occupational surveys as well.

  18. Histone modification profiles are predictive for tissue/cell-type specific expression of both protein-coding and microRNA genes

    Directory of Open Access Journals (Sweden)

    Zhang Michael Q

    2011-05-01

    Background: Gene expression is regulated at both the DNA sequence level and through modification of chromatin. However, the effect of chromatin on tissue/cell-type specific gene regulation (TCSR) is largely unknown. In this paper, we present a method to elucidate the relationship between histone modification/variation (HMV) and TCSR. Results: A classifier for differentiating CD4+ T cell-specific genes from housekeeping genes using HMV data was built. We found HMV in both promoter and gene body regions to be predictive of genes which are targets of TCSR. For example, the histone modification types H3K4me3 and H3K27ac were identified as the most predictive for CpG-related promoters, whereas H3K4me3 and H3K79me3 were the most predictive for nonCpG-related promoters. However, genes targeted by TCSR can be predicted using other types of HMVs as well. Such redundancy implies that multiple types of underlying regulatory elements, such as enhancers or intragenic alternative promoters, which can regulate gene expression in a tissue/cell-type specific fashion, may be marked by the HMVs. Finally, we show that the predictive power of HMV for TCSR is not limited to protein-coding genes in CD4+ T cells, as we successfully predicted TCSR targeted genes in muscle cells, as well as microRNA genes with expression specific to CD4+ T cells, by the same classifier which was trained on HMV data of protein-coding genes in CD4+ T cells. Conclusion: We have begun to understand the HMV patterns that guide gene expression in both a tissue/cell-type specific and a ubiquitous manner.
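
    The classifier described is, in essence, a supervised model that takes per-gene histone modification signals as features and predicts whether a gene is tissue/cell-type specific or housekeeping. The sketch below shows that general setup with synthetic placeholder data and a logistic regression standing in for whichever model the authors actually used.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Synthetic placeholder data: rows are genes, columns are histone-modification
        # signals (e.g. H3K4me3, H3K27ac, H3K79me3) over promoter and gene body regions.
        n_genes, n_marks = 200, 6
        X = rng.normal(size=(n_genes, n_marks))
        y = rng.integers(0, 2, size=n_genes)   # 1 = tissue/cell-type specific, 0 = housekeeping

        clf = LogisticRegression(max_iter=1000)
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())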

  19. Validation of coupled Relap5-3D code in the analysis of RBMK-1500 specific transients

    International Nuclear Information System (INIS)

    Evaldas, Bubelis; Algirdas, Kaliatka; Eugenijus, Uspuras

    2003-01-01

    This paper deals with the modelling of RBMK-1500 specific transients taking place at Ignalina NPP. These transients include: measurements of void and fast power reactivity coefficients, change of graphite cooling conditions, and reactor power reduction transients. The simulation of these transients was performed using a RELAP5-3D code model of the RBMK-1500 reactor. At the Ignalina NPP, void and fast power reactivity coefficients are measured on a regular basis and, based on the total reactor power, reactivity, control and protection system control rod positions, and the main circulation circuit parameter changes during the experiments, the actual values of these reactivity coefficients are determined. The graphite temperature reactivity coefficient at the plant is determined by changing the graphite cooling conditions in the reactor cavity. This type of transient is unique and important for validating the model of the gap between the fuel channel and the graphite bricks. The measurement results obtained during this transient allowed the thermal conductivity coefficient for this gap to be determined and the graphite temperature reactivity feedback model to be validated. Reactor power reduction is a regular operational procedure during the entire lifetime of the reactor. In all cases it starts with either a scram or a power reduction signal activated by the reactor control and protection system or by an operator. The obtained calculation results demonstrate reasonable agreement with Ignalina NPP measured data. The behaviour of the individual MCC thermal-hydraulic parameters, as well as the physical processes occurring in the primary circuit of the RBMK-1500 reactor, is predicted reasonably well. Reasonable agreement between the measured and calculated total reactor power change in time demonstrates correct modelling of the neutronic processes taking place in the RBMK-1500 reactor core. And finally, the performed validation of the RELAP5-3D model of Ignalina NPP RBMK-1500

  20. SPEAR-FCODE-GAMMA functional specifications. Final report

    International Nuclear Information System (INIS)

    Fiero, I.B.

    1983-03-01

    SPEAR FCODE GAMMA (SFG), a conceptual fuel-performance code for use in licensing analyses, has been defined and characterized as a set of functional specifications. The potential licensing-related applications of SFG are established and discussed. General code specifications including regulatory, interface, hardware application, code model and software, and operational specifications are discussed. The code input and output information including data requirements as well as formatting aspects are detailed. Finally, the SFG code-accuracy guidelines are established and the validation process is described

  1. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document

    Science.gov (United States)

    Nicholson, Mark; Markley, F.; Seidewitz, E.

    1988-01-01

    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.

  2. Certification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Toffer, H.; Crowe, R.D.; Ades, M.J.

    1990-05-01

    A certification plan for computer codes used in Safety Analyses and Probabilistic Risk Assessment (PRA) for the operation of the Savannah River Site (SRS) reactors has been prepared. An action matrix, checklists, and a time schedule have been included in the plan. These items identify what is required to achieve certification of the codes. A list of Safety Analysis and Probabilistic Risk Assessment (SA&PRA) computer codes covered by the certification plan has been assembled. A description of each of the codes was provided in Reference 4. The action matrix for the configuration control plan identifies code-specific requirements that need to be met to achieve the certification plan's objectives. The checklist covers the specific procedures that are required to support the configuration control effort and supplement the software life cycle procedures based on QAP 20-1 (Reference 7). A qualification checklist for users establishes the minimum prerequisites and training for achieving levels of proficiency in using configuration-controlled codes for critical parameter calculations.

  3. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  4. Determination of site-specific glycan heterogeneity on glycoproteins

    DEFF Research Database (Denmark)

    Kolarich, Daniel; Jensen, Pia Hønnerup; Altmann, Friedrich

    2012-01-01

    and the determination of site-specific glycan heterogeneity. The described workflow takes approximately 3-5 d, including sample preparation and data analysis. The data obtained from analyzing released glycans of rHuEPO and IgG, described in the second protocol of this series (10.1038/nprot.2012.063), provide...

  5. Some specifics considering the urban territories river discharge determination

    Directory of Open Access Journals (Sweden)

    Chilikova-Lubomirova Mila

    2018-01-01

    Urban territories are areas with a significant anthropogenic influence on the natural environment. As a result, most of the existing natural conditions have been modified, including the natural forms of river beds and floodplains. To serve human safety, comfort and needs, while keeping ecosystems functioning healthily, various artificial structures have also been created. The process depends on a good understanding of, and good quality data on, the existing conditions and river flow behaviour, which are interconnected and relevant to the determination of river discharge and the description of its variations - a key issue for the design of river structures, the mitigation of water extremes and the maintenance of healthy ecosystems. For this purpose, various contact measurements and monitoring procedures are implemented. To clarify the process, this paper presents some specifics of river discharge determination in urban territories and the possibility of creating related monitoring networks. It focuses on the most commonly used methods, their specifics and the possible challenges for practical application. The main specifics of creating and implementing related decision support systems are also presented. The main purpose is to disseminate this state of the art in support of decision makers and professionals in the area.
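
    As background, one widely used contact-measurement approach for determining river discharge is the mid-section velocity-area method, in which the channel cross-section is divided into verticals with measured depth and mean velocity. The sketch below implements that textbook method with hypothetical gauging data; it is not necessarily one of the specific methods discussed in the paper.

        def midsection_discharge(stations, depths, velocities):
            """Discharge (m^3/s) by the mid-section velocity-area method.
            stations: distances across the channel (m); depths (m); mean velocities (m/s)."""
            q = 0.0
            for i in range(len(stations)):
                left = stations[i] - (stations[i - 1] if i > 0 else stations[0])
                right = (stations[i + 1] if i < len(stations) - 1 else stations[-1]) - stations[i]
                q += 0.5 * (left + right) * depths[i] * velocities[i]
            return q

        # Hypothetical gauging: five verticals across a small urban channel.
        print(midsection_discharge([0, 2, 4, 6, 8],
                                   [0.0, 0.6, 0.9, 0.5, 0.0],
                                   [0.0, 0.4, 0.7, 0.3, 0.0]))   # ~2.0 m^3/s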

  6. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal-axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)
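
    One of the comparison forms mentioned, the azimuth averaged bin plot, simply groups a measured or calculated load signal by rotor azimuth and averages within each bin. The sketch below shows that binning step on a synthetic load signal; it is not taken from the FAST Code itself.

        import numpy as np

        def azimuth_average(azimuth_deg, load, n_bins=36):
            """Average a load signal into rotor-azimuth bins (degrees)."""
            bins = np.linspace(0.0, 360.0, n_bins + 1)
            idx = np.digitize(np.mod(azimuth_deg, 360.0), bins) - 1
            centers = 0.5 * (bins[:-1] + bins[1:])
            means = np.array([load[idx == b].mean() if np.any(idx == b) else np.nan
                              for b in range(n_bins)])
            return centers, means

        # Synthetic example: a once-per-revolution flapwise load variation plus noise.
        az = np.linspace(0.0, 360.0 * 50, 5000) % 360.0
        flap = 10.0 + 3.0 * np.cos(np.radians(az)) + np.random.normal(0.0, 0.5, az.size)
        centers, means = azimuth_average(az, flap)
        print(np.round(means, 2))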

  7. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.

  8. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  9. Development of FBR integrity system code. Basic concept

    International Nuclear Information System (INIS)

    Asayama, Tai

    2001-05-01

    For fast breeder reactors to be commercialized, they must be more reliable, safer, and at the same time economically competitive with future light water reactors. Innovation of the elevated temperature structural design standard is necessary to achieve this goal. The most powerful way is to enlarge the scope of the structural integrity code to cover items other than the design evaluation addressed in existing codes. Items that must be newly covered are prerequisites of design, fabrication, examination, operation and maintenance, etc. This allows designers to choose the most economical combination of design variations to achieve the specific reliability that is needed for a particular component. By designing components according to this concept, a cost-minimum design of the whole plant can be realized. By determining the reliability that must be achieved for a component using risk technologies, further economic improvement can be expected by avoiding excessive quality. Recognizing the necessity for codes based on this new concept, the development of the 'FBR integrity system code' began in 2000. Research and development will last 10 years. This development requires the basic logistics and system, as well as the technologies that materialize the concept. Original logistics and systems must be developed, because no existing research is available inside or outside Japan. This report presents the results of the work done in the first year regarding the basic idea, methodology, and structure of the code. (author)

  10. Codes in the codons: construction of a codon/amino acid periodic table and a study of the nature of specific nucleic acid-protein interactions.

    Science.gov (United States)

    Benyo, B; Biro, J C; Benyo, Z

    2004-01-01

    The theory of "codon-amino acid coevolution" was first proposed by Woese in 1967. It suggests that there is a stereochemical matching - that is, affinity - between amino acids and certain of the base triplet sequences that code for those amino acids. We have constructed a common periodic table of codons and amino acids, where the nucleic acid table showed perfect axial symmetry for codons and the corresponding amino acid table also displayed periodicity regarding the biochemical properties (charge and hydrophobicity) of the 20 amino acids and the position of the stop signals. The table indicates that the middle (2/sup nd/) amino acid in the codon has a prominent role in determining some of the structural features of the amino acids. The possibility that physical contact between codons and amino acids might exist was tested on restriction enzymes. Many recognition site-like sequences were found in the coding sequences of these enzymes and as many as 73 examples of codon-amino acid co-location were observed in the 7 known 3D structures (December 2003) of endonuclease-nucleic acid complexes. These results indicate that the smallest possible units of specific nucleic acid-protein interaction are indeed the stereochemically compatible codons and amino acids.

  11. Coding theory and cryptography the essentials

    CERN Document Server

    Hankerson, DC; Leonard, DA; Phelps, KT; Rodger, CA; Wall, JR; Wall, J R

    2000-01-01

    Containing data on number theory, encryption schemes, and cyclic codes, this highly successful textbook, proven by the authors in a popular two-quarter course, presents coding theory, construction, encoding, and decoding of specific code families in an "easy-to-use" manner appropriate for students with only a basic background in mathematics, offering revised and updated material on the Berlekamp-Massey decoding algorithm and convolutional codes. Introducing the mathematics as it is needed and providing exercises with solutions, this edition includes an extensive section on cryptography, desig

  12. Improving the accuracy of operation coding in surgical discharge summaries

    Science.gov (United States)

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  13. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the Mac- Williams' identities based on the probability of undetected error, and two important tools for algebraic decoding-namely, the finite field Fourier transform and the Euclidean algorithm f...

  14. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  15. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  16. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  17. Capsid coding sequences of foot-and-mouth disease viruses are determinants of pathogenicity in pigs

    DEFF Research Database (Denmark)

    Lohse, Louise; Jackson, Terry; Bøtner, Anette

    2012-01-01

    The surface exposed capsid proteins, VP1, VP2 and VP3, of foot-and-mouth disease virus (FMDV) determine its antigenicity and the ability of the virus to interact with host-cell receptors. Hence, modification of these structural proteins may alter the properties of the virus. In the present study we...... compared the pathogenicity of different FMDVs in young pigs. In total 32 pigs, 7-weeks-old, were exposed to virus, either by direct inoculation or through contact with inoculated pigs, using cell culture adapted (O1K B64), chimeric (O1K/A-TUR and O1K/O-UKG) or field strain (O-UKG/34/2001) viruses. The O1K...... coding sequences are determinants of FMDV pathogenicity in pigs....

  18. Determination of the costs of the nuclear desalination using the DEEP code from IAEA

    International Nuclear Information System (INIS)

    Ramirez S, J.R.; Palacios H, J.C.; Alonso V, G.

    2005-01-01

    The desalination of seawater is becoming an important solution for meeting the drinking water demands of population centers with very limited water resources, as is the case in some Arab countries and arid regions of the planet, where desalination plants powered by fossil fuels or nuclear energy have been installed. Since the desalination of seawater is a process that consumes a great deal of thermal and/or electric energy, it is necessary to quantify the costs of the energy supply and of the desalination plant for the different options and technologies, in order to find the most appropriate one for the specific conditions of the region where desalination is planned. In this report the three most promising desalination technologies are described and, by means of the DEEP code, the costs of water and energy production are evaluated using different types of nuclear power reactors as the thermal source. According to DEEP, the electricity generation costs for the reactors considered are around 40 USD/MWh. With these electricity generation costs, DEEP yields drinking water production costs of around 1 USD/m3. (Author)
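
    In the same spirit as the DEEP evaluation, a levelized water cost can be sketched as annualized capital plus operation and maintenance plus energy, divided by annual water production. The simplified calculation below uses entirely hypothetical plant parameters together with the roughly 40 USD/MWh electricity cost quoted above; DEEP itself uses a considerably more detailed model.

        def capital_recovery_factor(rate, years):
            return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

        def levelized_water_cost(capital_usd, om_usd_per_year,
                                 electricity_kwh_per_m3, electricity_usd_per_mwh,
                                 water_m3_per_year, discount_rate=0.07, lifetime_years=30):
            """Very simplified levelized cost of desalted water (USD/m^3)."""
            annual_capital = capital_usd * capital_recovery_factor(discount_rate, lifetime_years)
            annual_energy = electricity_kwh_per_m3 * water_m3_per_year * electricity_usd_per_mwh / 1000.0
            return (annual_capital + om_usd_per_year + annual_energy) / water_m3_per_year

        # Hypothetical reverse-osmosis plant (100,000 m^3/day) buying power at 40 USD/MWh.
        print(levelized_water_cost(capital_usd=150e6, om_usd_per_year=8e6,
                                   electricity_kwh_per_m3=4.0, electricity_usd_per_mwh=40.0,
                                   water_m3_per_year=100_000 * 365))   # ~0.7 USD/m^3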

  19. On Coding Non-Contiguous Letter Combinations

    Directory of Open Access Journals (Sweden)

    Frédéric Dandurand

    2011-06-01

    Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations) and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity.
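
    The underlying measure, how strongly a given ordered letter pair constrains word identity, can be illustrated with a toy lexicon: the fewer words contain a pair, the more informative that pair is, whether or not its letters are contiguous. The sketch below uses a hypothetical word list and a simple word-count measure; it is not the authors' corpus or their exact information-theoretic quantity.

        from itertools import combinations
        from collections import defaultdict

        lexicon = ["trail", "trial", "train", "grain", "brain", "basin", "raisin"]

        def ordered_pairs(word):
            """All ordered letter pairs in a word, flagged as contiguous or not."""
            return {(word[i], word[j], j - i == 1) for i, j in combinations(range(len(word)), 2)}

        # Count, for each pair, how many words contain it: fewer words = more constraining.
        containing = defaultdict(set)
        for word in lexicon:
            for a, b, contiguous in ordered_pairs(word):
                containing[(a, b, contiguous)].add(word)

        for (a, b, contiguous), words in sorted(containing.items(), key=lambda kv: len(kv[1]))[:5]:
            kind = "contiguous" if contiguous else "non-contiguous"
            print(f"{a}{b} ({kind}) occurs in {len(words)} word(s): {sorted(words)}")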

  20. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  1. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  2. Comparison of rate one-half, equivalent constraint length 24, binary convolutional codes for use with sequential decoding on the deep-space channel

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    Virtually all previously-suggested rate 1/2 binary convolutional codes with KE = 24 are compared. Their distance properties are given; and their performance, both in computation and in error probability, with sequential decoding on the deep-space channel is determined by simulation. Recommendations are made both for the choice of a specific KE = 24 code as well as for codes to be included in future coding standards for the deep-space channel. A new result given in this report is a method for determining the statistical significance of error probability data when the error probability is so small that it is not feasible to perform enough decoding simulations to obtain more than a very small number of decoding errors.

  3. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  4. Computer code for quantitative ALARA evaluations

    International Nuclear Information System (INIS)

    Voilleque, P.G.

    1984-01-01

    A FORTRAN computer code has been developed to simplify the determination of whether dose reduction actions meet the as low as is reasonably achievable (ALARA) criterion. The calculations are based on the methodology developed for the Atomic Industrial Forum. The code is used for analyses of eight types of dose reduction actions, characterized as follows: reduce dose rate, reduce job frequency, reduce productive working time, reduce crew size, increase administrative dose limit for the task, and increase the workers' time utilization and dose utilization through (a) improved working conditions, (b) basic skill training, or (c) refresher training for special skills. For each type of action, two analysis modes are available. The first is a generic analysis in which the program computes potential benefits (in dollars) for a range of possible improvements, e.g., for a range of lower dose rates. Generic analyses are most useful in the planning stage and for evaluating the general feasibility of alternative approaches. The second is a specific analysis in which the potential annual benefits of a specific level of improvement and the annual implementation cost are compared. The potential benefits reflect savings in operational and societal costs that can be realized if occupational radiation doses are reduced. Because the potential benefits depend upon many variables which characterize the job, the workplace, and the workers, there is no unique relationship between the potential dollar savings and the dose savings. The computer code permits rapid quantitative analyses of alternatives and is a tool that supplements the health physicist's professional judgment. The program output provides a rational basis for decision-making and a record of the assumptions employed
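
    The kind of comparison the code automates can be reduced, in its simplest form, to the annual monetary value of the collective dose avoided by an action versus the annual cost of implementing it. The sketch below illustrates only that arithmetic; the dollars-per-person-rem value and the job parameters are hypothetical inputs, not values from the Atomic Industrial Forum methodology.

        def annual_dose_benefit(dose_rate_before, dose_rate_after,
                                hours_per_job, jobs_per_year, crew_size,
                                usd_per_person_rem=10_000.0):
            """Monetary value (USD/yr) of collective dose avoided by a dose-rate reduction.
            Dose rates are in rem/h; the USD-per-person-rem figure is a hypothetical input."""
            person_rem_saved = ((dose_rate_before - dose_rate_after)
                                * hours_per_job * jobs_per_year * crew_size)
            return person_rem_saved * usd_per_person_rem

        benefit = annual_dose_benefit(0.050, 0.020, hours_per_job=8, jobs_per_year=12, crew_size=3)
        implementation_cost = 40_000.0   # hypothetical annual cost of added shielding
        print(benefit, "USD/yr benefit vs", implementation_cost, "USD/yr cost")
        # Here the benefit (86,400 USD/yr) exceeds the cost, so the action passes this simple screen.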

  5. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  6. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  7. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  8. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  9. Great Expectations: Is there Evidence for Predictive Coding in Auditory Cortex?

    Science.gov (United States)

    Heilbron, Micha; Chait, Maria

    2017-08-04

    Predictive coding is possibly one of the most influential, comprehensive, and controversial theories of neural function. While proponents praise its explanatory potential, critics object that key tenets of the theory are untested or even untestable. The present article critically examines existing evidence for predictive coding in the auditory modality. Specifically, we identify five key assumptions of the theory and evaluate each in the light of animal, human and modeling studies of auditory pattern processing. For the first two assumptions - that neural responses are shaped by expectations and that these expectations are hierarchically organized - animal and human studies provide compelling evidence. The anticipatory, predictive nature of these expectations also enjoys empirical support, especially from studies on unexpected stimulus omission. However, for the existence of separate error and prediction neurons, a key assumption of the theory, evidence is lacking. More work exists on the proposed oscillatory signatures of predictive coding, and on the relation between attention and precision. However, results on these latter two assumptions are mixed or contradictory. Looking to the future, more collaboration between human and animal studies, aided by model-based analyses, will be needed to test specific assumptions and implementations of predictive coding - and, as such, help determine whether this popular grand theory can fulfill its expectations. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  10. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    Science.gov (United States)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  11. Cost reducing code implementation strategies

    International Nuclear Information System (INIS)

    Kurtz, Randall L.; Griswold, Michael E.; Jones, Gary C.; Daley, Thomas J.

    1995-01-01

    Sargent and Lundy's Code consulting experience reveals a wide variety of approaches toward implementing the requirements of various nuclear Codes and Standards. This paper will describe various Code implementation strategies which assure that Code requirements are fully met in a practical and cost-effective manner. Applications to be discussed include the following: new construction; repair, replacement and modifications; and assessments and life extensions. Lessons learned and illustrative examples will be included. Preferred strategies and specific recommendations will also be addressed. Sargent and Lundy appreciates the opportunity provided by the Korea Atomic Industrial Forum and Korean Nuclear Society to share our ideas and enhance global cooperation through the exchange of information and views on relevant topics

  12. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes to determine software quality metrics and to improve code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential for individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce the complexity, achieve higher functional independence, and improve the code vectorization. (orig.)

  13. Code of practice for clinical proton dosimetry

    International Nuclear Information System (INIS)

    Vynckier, S.

    1991-01-01

    The objective of this document is to make recommendations for the determination of absorbed dose to tissue for clinical proton beams and to achieve uniformity in proton dosimetry. A Code of Practice (CoP) has been chosen, providing specific guidelines for the choice of the detector and the method of determination of absorbed dose for proton beams only. This CoP is confined specifically to the determination of absorbed dose and is not concerned with the biological effects of proton beams. It is recommended that dosimeters be calibrated by comparison with a calorimeter. If this is not available, a Faraday cup or, alternatively, an ionization chamber with a 60Co calibration factor should be used. Physical parameters for determining the dose from tissue-equivalent ionization chamber measurements are given together with a worksheet. It is recommended that calibrations be carried out in water at the centre of the spread-out Bragg peak and that dose distributions be measured in a water phantom. It is estimated that the error in the calibrations will be less than ±5 per cent (1 S.D.) in all cases. Adoption and implementation of this CoP will facilitate the exchange of clinical information. (author). 34 refs.; 5 figs.; 5 tabs

  14. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  15. Advantages of Westinghouse BWR control rod drop accidents methodology utilizing integrated POLCA-T code

    International Nuclear Information System (INIS)

    Panayotov, Dobromir

    2008-01-01

    The paper focuses on the activities pursued by Westinghouse in the development and licensing of the POLCA-T code Control Rod Drop Accident (CRDA) Methodology. The comprehensive CRDA methodology, which utilizes the PHOENIX4/POLCA7/POLCA-T calculation chain, foresees a complete cycle-specific analysis. The methodology consists of determination of the candidate control rods (CR) that could cause a significant reactivity excursion if dropped throughout the entire fuel cycle, selection of limiting initial conditions for CRDA transient simulation, and the transient simulation itself. The Westinghouse methodology utilizes state-of-the-art methods. Unnecessary conservatisms in the methodology have been avoided to allow the accurate prediction of margin to design bases. This is mainly achieved by using the POLCA-T code for dynamic CRDA evaluations. The code belongs to the same calculation chain that is used for core design. Thus the very same reactor, core, cycle and fuel data base is used. This also allows reducing the uncertainties of input data and parameters that determine the energy deposition in the fuel. Uncertainty treatment, very selective use of conservatisms, selection of the initial conditions for limiting case analyses, and incorporation of the licensed fuel performance code models into the POLCA-T code are also among the means of performing realistic CRDA transient analyses. (author)

  16. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibration on a decision theoretical basis is also considered and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown.
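
    The calibration idea summarized above, choosing a partial safety factor so that the reliabilities achieved across a class of structures stay as close as possible to a target level, can be illustrated with a small optimization sketch. Everything in it (the normal load/resistance model, the characteristic-value design equation, the statistics, and the target index) is an illustrative assumption, not taken from the cited work.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import norm

        # Illustrative class of structures: each has its own load statistics and
        # resistance coefficient of variation (all values assumed).
        structures = [
            {"mu_S": 1.0, "sig_S": 0.20, "V_R": 0.10},
            {"mu_S": 1.0, "sig_S": 0.30, "V_R": 0.15},
            {"mu_S": 1.0, "sig_S": 0.40, "V_R": 0.10},
        ]
        beta_target = 3.8  # assumed target reliability index

        def beta_for(gamma, s):
            """Reliability index obtained when a single partial factor gamma is used
            in a simple characteristic-value design equation (normal R and S)."""
            s_k = s["mu_S"] + 1.64 * s["sig_S"]            # characteristic load
            mu_R = gamma * s_k / (1.0 - 1.64 * s["V_R"])   # design fixes the mean resistance
            sig_R = s["V_R"] * mu_R
            return (mu_R - s["mu_S"]) / np.sqrt(sig_R**2 + s["sig_S"]**2)

        def objective(gamma):
            # Sum of squared deviations from the target reliability index.
            return sum((beta_for(gamma, s) - beta_target) ** 2 for s in structures)

        best = minimize_scalar(objective, bounds=(1.0, 3.0), method="bounded")
        print("calibrated partial factor:", round(best.x, 3))
        for s in structures:
            b = beta_for(best.x, s)
            print("  achieved beta:", round(b, 2), " failure probability:", norm.sf(b))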

  17. Combining specificity determining and conserved residues improves functional site prediction

    Directory of Open Access Journals (Sweden)

    Gelfand Mikhail S

    2009-06-01

    Full Text Available Abstract Background Predicting the location of functionally important sites from protein sequence and/or structure is a long-standing problem in computational biology. Most current approaches make use of sequence conservation, assuming that amino acid residues conserved within a protein family are most likely to be functionally important. Most often these approaches do not consider many residues that act to define specific sub-functions within a family, or they make no distinction between residues important for function and those more relevant for maintaining structure (e.g. in the hydrophobic core). Many protein families bind and/or act on a variety of ligands, meaning that conserved residues often only bind a common ligand sub-structure or perform general catalytic activities. Results Here we present a novel method for functional site prediction based on identification of conserved positions, as well as those responsible for determining ligand specificity. We define Specificity-Determining Positions (SDPs) as those occupied by conserved residues within sub-groups of proteins in a family having a common specificity, but that differ between groups, and are thus likely to account for specific recognition events. We benchmark the approach on enzyme families of known 3D structure with bound substrates, and find that in nearly all families residues predicted by SDPsite are in contact with the bound substrate, and that the addition of SDPs significantly improves functional site prediction accuracy. We apply SDPsite to various families of proteins containing known three-dimensional structures, but lacking clear functional annotations, and discuss several illustrative examples. Conclusion The results suggest a better means to predict functional details for the thousands of protein structures determined prior to a clear understanding of molecular function.
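
    As a rough illustration of the SDP idea, positions conserved within each specificity sub-group but differing between sub-groups, the sketch below scans alignment columns for exactly that pattern. The toy alignment, the group labels, and the strict "fully conserved within every group" criterion are assumptions made for the example; SDPsite itself uses a more elaborate statistical procedure.

        # Minimal sketch: flag alignment columns that are conserved within each
        # specificity group but differ between groups (candidate SDPs), and
        # columns conserved across the whole family (candidate functional sites).
        alignment = {           # toy aligned sequences of equal length (assumed data)
            "seq1": "ACDKG",
            "seq2": "ACDKG",
            "seq3": "ACEKG",
            "seq4": "ACEKG",
        }
        groups = {"seq1": "A", "seq2": "A", "seq3": "B", "seq4": "B"}  # assumed specificity groups

        length = len(next(iter(alignment.values())))
        conserved, sdps = [], []
        for col in range(length):
            by_group = {}
            for name, seq in alignment.items():
                by_group.setdefault(groups[name], set()).add(seq[col])
            if all(len(res) == 1 for res in by_group.values()):
                residues = {next(iter(res)) for res in by_group.values()}
                if len(residues) == 1:
                    conserved.append(col)   # same residue in every sequence
                else:
                    sdps.append(col)        # conserved per group, differs between groups

        print("conserved columns:", conserved)  # -> [0, 1, 3, 4]
        print("candidate SDPs:", sdps)          # -> [2]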

  18. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  19. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  20. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies
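
    The quantitative comparison described above, relative root-mean-square differences between code output and analytical or experimental values, fits in a few lines. The normalization chosen below (RMS of the differences divided by the RMS of the reference values) is one common convention and is an assumption; the report may define the metric slightly differently, and the sample numbers are invented.

        import numpy as np

        def relative_rms(computed, reference):
            """Relative RMS difference between code results and reference data."""
            computed = np.asarray(computed, dtype=float)
            reference = np.asarray(reference, dtype=float)
            return np.sqrt(np.mean((computed - reference) ** 2)) / np.sqrt(np.mean(reference ** 2))

        # Example: a computed pressure-head profile vs. an analytical solution (made-up numbers).
        analytical = [0.00, 0.25, 0.50, 0.75, 1.00]
        numerical  = [0.01, 0.24, 0.52, 0.74, 0.99]
        print(f"relative RMS = {relative_rms(numerical, analytical):.3%}")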

  1. Code Calibration as a Decision Problem

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1993-01-01

    Calibration of partial coefficients for a class of structures where no code exists is considered. The partial coefficients are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibration on a decision theoretical basis is discussed. Results from code calibration for rubble mound breakwater designs are shown.

  2. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  3. User Effect on Code Application and Qualification Needs

    International Nuclear Information System (INIS)

    D'Auria, F.; Salah, A.B.

    2008-01-01

    Experience with some code assessment case studies and additional ISPs has shown the dominant effect of the code user on the predicted system behavior. The general findings of the user effect investigations on some of the case studies indicate, specifically, that in addition to user effects there are other reasons which affect the results of the calculations and are hidden under the general title of user effects: the specific characteristics of the experimental facilities, i.e. limitations as far as code assessment is concerned; limitations of the thermal-hydraulic codes used to simulate certain system behavior or phenomena; and limitations due to the code user's interpretation of the experimental database. On the basis of the discussions in this paper, the following conclusions and recommendations can be made: more dialogue appears to be necessary with the experimenters in the planning of code assessment calculations, e.g. ISPs; user guidelines are not complete for the codes, and a lack of sufficient and detailed user guidelines is observed in some of the case studies; more extensive user instruction and training, improved user guidelines, or quality assurance procedures may partially reduce some of the subjective user influence on the calculated results; the discrepancies between experimental data and code predictions are due both to intrinsic code limits and to the so-called user effects, and there is a real need to quantify the percentage of disagreement due to poor utilization of the code and that due to the code itself, especially for uncertainty evaluation studies (e.g. [18]) which do not take the mentioned user effects into account; a focused investigation, based on the results of comparison calculations (e.g. ISPs) and analyzing the experimental data and the results of the specific code in order to evaluate the user effects and the related experimental aspects, should be an integral part of the

  4. Monte Carlo code for neutron radiography

    International Nuclear Information System (INIS)

    Milczarek, Jacek J.; Trzcinski, Andrzej; El-Ghany El Abd, Abd; Czachor, Andrzej

    2005-01-01

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms

  5. Monte Carlo code for neutron radiography

    Energy Technology Data Exchange (ETDEWEB)

    Milczarek, Jacek J. [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)]. E-mail: jjmilcz@cyf.gov.pl; Trzcinski, Andrzej [Institute for Nuclear Studies, Swierk, 05-400 Otwock (Poland); El-Ghany El Abd, Abd [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland); Nuclear Research Center, PC 13759, Cairo (Egypt); Czachor, Andrzej [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)

    2005-04-21

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms.

  6. Non-binary Entanglement-assisted Stabilizer Quantum Codes

    OpenAIRE

    Riguang, Leng; Zhi, Ma

    2011-01-01

    In this paper, we show how to construct non-binary entanglement-assisted stabilizer quantum codes by using pre-shared entanglement between the sender and receiver. We also give an algorithm to determine the circuit for non-binary entanglement-assisted stabilizer quantum codes and some illustrative examples. The codes we constructed do not require the dual-containing constraint, and many non-binary classical codes, like non-binary LDPC codes, which do not satisfy the condition, can be used to c...

  7. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  8. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from
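
    A bare-bones version of hotspot detection, counting mutations in fixed genomic windows and flagging windows with significantly more mutations than a uniform background would predict, is sketched below. The window size, the Poisson background model, and the threshold are illustrative assumptions; the published method additionally uses evolutionary conservation and is applied genome-wide.

        import random
        from collections import Counter
        from scipy.stats import poisson

        def mutation_hotspots(positions, genome_length, window=1000, alpha=1e-3):
            """Flag windows whose mutation count is unexpectedly high under a
            uniform Poisson background estimated from the data themselves."""
            counts = Counter(pos // window for pos in positions)
            n_windows = genome_length // window + 1
            background = len(positions) / n_windows      # expected mutations per window
            hotspots = []
            for win, k in counts.items():
                p = poisson.sf(k - 1, background)        # P(X >= k) under the background rate
                if p < alpha:
                    hotspots.append((win * window, (win + 1) * window, k, p))
            return sorted(hotspots)

        # Toy data: 200 scattered mutations plus a cluster near position 50,000.
        random.seed(0)
        muts = [random.randrange(1_000_000) for _ in range(200)]
        muts += [50_000 + random.randrange(500) for _ in range(30)]
        for start, end, k, p in mutation_hotspots(muts, 1_000_000):
            print(f"hotspot {start}-{end}: {k} mutations (p = {p:.2e})")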

  9. A method for scientific code coupling in a distributed environment

    International Nuclear Information System (INIS)

    Caremoli, C.; Beaucourt, D.; Chen, O.; Nicolas, G.; Peniguel, C.; Rascle, P.; Richard, N.; Thai Van, D.; Yessayan, A.

    1994-12-01

    This guide book deals with the coupling of big scientific codes. First, the context is introduced: big scientific codes devoted to a specific discipline are coming to maturity, and there are more and more needs in terms of multi-discipline studies. Then we describe different kinds of code coupling and an example of code coupling: the 3D thermal-hydraulic code THYC and the 3D neutronics code COCCINELLE. With this example we identify the problems to be solved to realize a coupling. We present the different numerical methods usable for the resolution of coupling terms. This leads us to define two kinds of coupling: with weak coupling, we can use explicit methods, and with strong coupling we need to use implicit methods. In both cases, we analyze the link with the way of parallelizing the code. For the translation of data from one code to another, we define the notion of a Standard Coupling Interface based on a general structure for data. This general structure constitutes an intermediary between the codes, thus allowing a relative independence of the codes from a specific coupling. The proposed method for the implementation of a coupling leads to a simultaneous run of the different codes, while they exchange data. Two kinds of data communication with message exchange are proposed: direct communication between codes with the use of the PVM product (Parallel Virtual Machine) and indirect communication with a coupling tool. This second way, with a general code coupling tool, is based on a coupling method, and we strongly recommend using it. This method is based on the two following principles: re-usability, which means few modifications to existing codes, and the definition of a code usable for coupling, which leads to separating the design of a code usable for coupling from the realization of a specific coupling. This coupling tool, available from the beginning of 1994, is described in general terms. (authors). figs., tabs
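
    The explicit ("weak") coupling scheme described above, two codes advancing in step and trading interface data through a neutral data structure, can be caricatured in a few lines. The two toy update functions below stand in for a thermal-hydraulics and a neutronics solver, and the dictionary plays the role of the Standard Coupling Interface; all names, numbers, and the feedback law are invented, and the real method exchanges data between separate processes via PVM or a coupling tool rather than in-process calls.

        # Toy explicit (weak) coupling: each "code" advances one step using the other
        # code's data from the previous step, exchanged through a shared structure.

        def thermal_step(T, power):
            """Stand-in for a thermal-hydraulics code: temperature relaxes toward power."""
            return T + 0.1 * (power - T)

        def neutronics_step(power, T):
            """Stand-in for a neutronics code: power responds to temperature feedback."""
            return power + 0.05 * (300.0 - T)

        interface = {"T": 280.0, "power": 300.0}   # plays the role of the coupling interface

        for step in range(20):
            # Explicit scheme: both codes read last step's interface data, then both write.
            new_T = thermal_step(interface["T"], interface["power"])
            new_power = neutronics_step(interface["power"], interface["T"])
            interface.update(T=new_T, power=new_power)

        print(interface)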

  10. QR Codes in the Library: Are They Worth the Effort? Analysis of a QR Code Pilot Project

    OpenAIRE

    Wilson, Andrew M.

    2012-01-01

    The literature is filled with potential uses for Quick Response (QR) codes in the library setting, but few library QR code projects have publicized usage statistics. A pilot project carried out in the Eda Kuhn Loeb Music Library of the Harvard College Library sought to determine whether library patrons actually understand and use QR codes. Results and analysis of the pilot project are provided, attempting to answer the question as to whether QR codes are worth the effort for libraries.

  11. QUIL: a chemical equilibrium code

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-02-01

    A chemical equilibrium code QUIL is described, along with two support codes FENG and SURF. QUIL is designed to allow calculations on a wide range of chemical environments, which may include surface phases. QUIL was written specifically to calculate distributions associated with complex equilibria involving fission products in the primary coolant loop of the high-temperature gas-cooled reactor. QUIL depends upon an energy-data library called ELIB. This library is maintained by FENG and SURF. FENG enters into the library all reactions having standard free energies of reaction that are independent of concentration. SURF enters all surface reactions into ELIB. All three codes are interactive codes written to be used from a remote terminal, with paging control provided. Plotted output is also available

  12. Sensitivity and specificity of copper sulphate test in determining ...

    African Journals Online (AJOL)

    Background: The accuracy of the copper sulphate method for the rapid screening of prospective blood donors has been questioned because this rapid screening method may lead to false deferral of truly eligible prospective blood donors. Objective: This study was aimed at determining the sensitivity and specificity of copper ...

  13. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)
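
    The quantitative side of fault tree evaluation that such codes automate, computing the top-event probability from minimal cut sets and basic-event probabilities, reduces in its simplest form to the sketch below. The cut sets and probabilities are invented; the rare-event approximation shown (sum of cut-set probabilities) is the usual first-order estimate, and it slightly overestimates compared with an exact inclusion-exclusion calculation.

        from itertools import combinations

        # Basic-event probabilities (illustrative values) and minimal cut sets.
        p = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 1e-4}
        cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"pump_A", "power"}]

        def cut_set_prob(events):
            prob = 1.0
            for event in events:
                prob *= p[event]
            return prob

        # Rare-event approximation: P(top) ~ sum over minimal cut sets.
        p_rare = sum(cut_set_prob(cs) for cs in cut_sets)

        # Exact value by inclusion-exclusion over unions of cut sets
        # (feasible only for small numbers of cut sets, as here).
        p_exact = 0.0
        for r in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, r):
                union = set().union(*combo)
                p_exact += (-1) ** (r + 1) * cut_set_prob(union)

        print(f"rare-event approximation: {p_rare:.3e}")
        print(f"inclusion-exclusion:      {p_exact:.3e}")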

  14. Analysis of pipe stress using CAESAR II code

    International Nuclear Information System (INIS)

    Sitandung, Y.B.; Bandriyana, B.

    2002-01-01

    This piping stress analysis was performed to determine the stress distribution in a piping system and thereby establish the pipe support configuration. As an example of the analysis, the Gas Exchanger to Warm Separator Line was chosen; the input data were first compiled in a document, i.e. a piping analysis specification, whose contents include pipe characteristics, material properties, operating conditions, guide equipment, and so on. Analysis results such as stress, load, displacement, and the support types used were verified against the requirements of the applicable codes, standards, and regulations for the piping system analyzed. The analysis results (actual loads) remain below the allowable loads, indicating that the piping system is in a safe condition. From the analysis steps that have been carried out, the CAESAR II code fulfills the requirements for use as a piping stress analysis tool for both nuclear and non-nuclear installation piping systems

  15. New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin

    2017-03-01

    This paper concentrates on construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over finite field 𝔽q2 (q ≥ 3 is an odd prime power). By a careful analysis on properties of cyclotomic cosets in defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given according to Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in previous literature.
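
    The cyclotomic-coset bookkeeping behind the defining set T of such BCH codes is easy to reproduce. The sketch below lists q²-cyclotomic cosets modulo n, the objects whose properties the paper analyzes; the particular values of q and n are arbitrary examples, and no claim is made here about the improved designed distances derived in the paper.

        def cyclotomic_coset(s, modulus, base):
            """Cyclotomic coset of s modulo 'modulus' under multiplication by 'base'
            (for the Hermitian case, base = q**2)."""
            coset, x = [], s % modulus
            while x not in coset:
                coset.append(x)
                x = (x * base) % modulus
            return sorted(coset)

        def all_cosets(modulus, base):
            seen, cosets = set(), []
            for s in range(modulus):
                if s not in seen:
                    c = cyclotomic_coset(s, modulus, base)
                    seen.update(c)
                    cosets.append(c)
            return cosets

        # Example (assumed values): q = 3, so base = q**2 = 9; code length n = 40 divides 9**2 - 1.
        q, n = 3, 40
        for c in all_cosets(n, q**2):
            print(c)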

  16. Specific structural probing of plasmid-coded ribosomal RNAs from Escherichia coli

    DEFF Research Database (Denmark)

    Aagaard, C; Rosendahl, G; Dam, M

    1991-01-01

    The preferred method for construction and in vivo expression of mutagenised Escherichia coli ribosomal RNAs (rRNAs) is via high copy number plasmids. Transcription of wild-type rRNA from the seven chromosomal rrn operons in strains harbouring plasmid-coded mutant rRNAs leads to a heterogeneous...

  17. New tools to analyze overlapping coding regions.

    Science.gov (United States)

    Bayegan, Amir H; Garcia-Martin, Juan Antonio; Clote, Peter

    2016-12-13

    Retroviruses transcribe messenger RNA for the overlapping Gag and Gag-Pol polyproteins, by using a programmed -1 ribosomal frameshift which requires a slippery sequence and an immediate downstream stem-loop secondary structure, together called frameshift stimulating signal (FSS). It follows that the molecular evolution of this genomic region of HIV-1 is highly constrained, since the retroviral genome must contain a slippery sequence (sequence constraint), code appropriate peptides in reading frames 0 and 1 (coding requirements), and form a thermodynamically stable stem-loop secondary structure (structure requirement). We describe a unique computational tool, RNAsampleCDS, designed to compute the number of RNA sequences that code two (or more) peptides p,q in overlapping reading frames, that are identical (or have BLOSUM/PAM similarity that exceeds a user-specified value) to the input peptides p,q. RNAsampleCDS then samples a user-specified number of messenger RNAs that code such peptides; alternatively, RNAsampleCDS can exactly compute the position-specific scoring matrix and codon usage bias for all such RNA sequences. Our software allows the user to stipulate overlapping coding requirements for all 6 possible reading frames simultaneously, even allowing IUPAC constraints on RNA sequences and fixing GC-content. We generalize the notion of codon preference index (CPI) to overlapping reading frames, and use RNAsampleCDS to generate control sequences required in the computation of CPI. Moreover, by applying RNAsampleCDS, we are able to quantify the extent to which the overlapping coding requirement in HIV-1 [resp. HCV] contributes to the formation of the stem-loop [resp. double stem-loop] secondary structure known as the frameshift stimulating signal. Using our software, we confirm that certain experimentally determined deleterious HCV mutations occur in positions for which our software RNAsampleCDS and RNAiFold both indicate a single possible nucleotide. We
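
    The core check behind such a tool, whether a nucleotide sequence simultaneously encodes one peptide in reading frame 0 and another in frame +1, is easy to state in code. The sketch below uses the standard genetic code (written in the DNA alphabet for simplicity) and a brute-force enumeration over synonymous codons of the frame-0 peptide; it only illustrates the counting/sampling idea, and the function names and example peptides are invented, not part of RNAsampleCDS.

        from itertools import product

        # Standard genetic code, bases ordered T, C, A, G.
        BASES = "TCAG"
        AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
        CODON = {a + b + c: aa for (a, b, c), aa in zip(product(BASES, repeat=3), AMINO)}
        SYNONYMS = {}
        for codon, aa in CODON.items():
            SYNONYMS.setdefault(aa, []).append(codon)

        def translate(seq, frame):
            s = seq[frame:]
            return "".join(CODON[s[i:i + 3]] for i in range(0, len(s) - 2, 3))

        def sequences_coding_both(p0, p1):
            """All sequences coding peptide p0 in frame 0 and p1 in frame +1.
            Brute force over synonymous codons of p0; fine for short peptides."""
            hits = []
            for codons in product(*(SYNONYMS[aa] for aa in p0)):
                seq = "".join(codons)
                if translate(seq, 1).startswith(p1):
                    hits.append(seq)
            return hits

        # Toy example: which sequences code "MAR" in frame 0 and "WP" in frame +1?
        hits = sequences_coding_both("MAR", "WP")
        print(len(hits), "sequences, e.g.", hits[:3])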

  18. Energy Code Enforcement Training Manual : Covering the Washington State Energy Code and the Ventilation and Indoor Air Quality Code.

    Energy Technology Data Exchange (ETDEWEB)

    Washington State Energy Code Program

    1992-05-01

    This manual is designed to provide building department personnel with specific inspection and plan review skills and information on provisions of the 1991 edition of the Washington State Energy Code (WSEC). It also provides information on provisions of the new stand-alone Ventilation and Indoor Air Quality (VIAQ) Code. The intent of the WSEC is to reduce the amount of energy used by requiring energy-efficient construction. Such conservation reduces energy requirements, and, as a result, reduces the use of finite resources, such as gas or oil. Lowering energy demand helps everyone by keeping electricity costs down. (It is less expensive to use existing electrical capacity efficiently than it is to develop new and additional capacity needed to heat or cool inefficient buildings.) The new VIAQ Code (effective July, 1991) is a natural companion to the energy code. Whether energy-efficient or not, all homes have potential indoor air quality problems. Studies have shown that indoor air is often more polluted than outdoor air. The VIAQ Code provides a means of exchanging stale air for fresh, without compromising energy savings, by setting standards for a controlled ventilation system. It also offers requirements meant to prevent indoor air pollution from building products or radon.

  19. A proto-code of ethics and conduct for European nurse directors.

    Science.gov (United States)

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  20. Duals of Affine Grassmann Codes and Their Relatives

    DEFF Research Database (Denmark)

    Beelen, P.; Ghorpade, S. R.; Hoholdt, T.

    2012-01-01

    Affine Grassmann codes are a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. These codes were introduced in a recent work by Beelen. Here, we consider, more generally, affine Grassmann codes of a given level. We explicitly determine the dual of an affine Grassmann code of any level and compute its minimum distance. Further, we ameliorate the results by Beelen concerning the automorphism group of affine Grassmann codes. Finally, we prove that affine Grassmann codes and their duals have the property that they are linear codes generated by their minimum-weight codewords. This provides a clean analogue of a corresponding result for generalized Reed-Muller codes.

  1. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  2. Roadmap for the Future of Commercial Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hart, Philip R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Jian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-01

    Building energy codes have significantly increased building efficiency over the last 38 years, since the first national energy code was published in 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. The current focus on prescriptive codes has limitations including significant variation in actual energy performance depending on which prescriptive options are chosen, a lack of flexibility for designers and developers, the inability to handle optimization that is specific to building type and use, the inability to account for project-specific energy costs, and the lack of follow-through or accountability after a certificate of occupancy is granted. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. This report provides a high-level review of different formats for commercial building energy codes, including prescriptive, prescriptive packages, capacity constrained, outcome based, and predictive performance approaches. This report also explores a next generation commercial energy code approach that places a greater emphasis on performance-based criteria.

  3. Working memory templates are maintained as feature-specific perceptual codes.

    Science.gov (United States)

    Sreenivasan, Kartik K; Sambhara, Deepak; Jha, Amishi P

    2011-07-01

    Working memory (WM) representations serve as templates that guide behavior, but the neural basis of these templates remains elusive. We tested the hypothesis that WM templates are maintained by biasing activity in sensoriperceptual neurons that code for features of items being held in memory. Neural activity was recorded using event-related potentials (ERPs) as participants viewed a series of faces and responded when a face matched a target face held in WM. Our prediction was that if activity in neurons coding for the features of the target is preferentially weighted during maintenance of the target, then ERP activity evoked by a nontarget probe face should be commensurate with the visual similarity between target and probe. Visual similarity was operationalized as the degree of overlap in visual features between target and probe. A face-sensitive ERP response was modulated by target-probe similarity. Amplitude was largest for probes that were similar to the target, and decreased monotonically as a function of decreasing target-probe similarity. These results indicate that neural activity is weighted in favor of visual features that comprise an actively held memory representation. As such, our findings support the notion that WM templates rely on neural populations involved in forming percepts of memory items.

  4. Experiment-specific analyses in support of code development

    International Nuclear Information System (INIS)

    Ott, L.J.

    1990-01-01

    Experiment-specific models have been developed since 1986 by Oak Ridge National Laboratory Boiling Water Reactor (BWR) severe accident analysis programs for the purpose of BWR experimental planning and optimum interpretation of experimental results. These experiment-specific models have been applied to large integral tests (ergo, experiments) which start from an initial undamaged core state. The tests performed to date in BWR geometry have had significantly different-from-prototypic boundary and experimental conditions because of either normal facility limitations or specific experimental constraints. These experiments (ACRR: DF-4, NRU: FLHT-6, and CORA) were designed to obtain specific phenomenological information such as the degradation and interaction of prototypic components and the effects on melt progression of control-blade materials and channel boxes. Applications of ORNL models specific to the ACRR DF-4 and KfK CORA-16 experiments are discussed and significant findings from the experimental analyses are presented. 32 refs., 16 figs

  5. Visual Coding of Human Bodies: Perceptual Aftereffects Reveal Norm-Based, Opponent Coding of Body Identity

    Science.gov (United States)

    Rhodes, Gillian; Jeffery, Linda; Boeing, Alexandra; Calder, Andrew J.

    2013-01-01

    Despite the discovery of body-selective neural areas in occipitotemporal cortex, little is known about how bodies are visually coded. We used perceptual adaptation to determine how body identity is coded. Brief exposure to a body (e.g., anti-Rose) biased perception toward an identity with opposite properties (Rose). Moreover, the size of this…

  6. On Field Size and Success Probability in Network Coding

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper

    2008-01-01

    Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network coding. These estimates take into account which monomials occur in the support of the determinant of the product of Edmonds matrices. Therefore we finally investigate which monomials can occur in the determinant of the Edmonds matrix.
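
    The success probability of random linear network coding can also be estimated empirically: draw random coding coefficients and check whether the resulting transfer matrix is invertible over the chosen field. The Monte Carlo sketch below does this for a random square matrix over a prime field GF(p), a simplified stand-in for the Edmonds-matrix analysis in the paper; the field sizes, matrix dimension, and trial count are arbitrary, and the modular inverse used restricts the sketch to prime fields.

        import random

        def rank_mod_p(matrix, p):
            """Rank of an integer matrix over GF(p) via Gaussian elimination."""
            m = [row[:] for row in matrix]
            rank, rows, cols = 0, len(m), len(m[0])
            for col in range(cols):
                pivot = next((r for r in range(rank, rows) if m[r][col] % p), None)
                if pivot is None:
                    continue
                m[rank], m[pivot] = m[pivot], m[rank]
                inv = pow(m[rank][col], p - 2, p)      # modular inverse (prime p only)
                m[rank] = [(v * inv) % p for v in m[rank]]
                for r in range(rows):
                    if r != rank and m[r][col] % p:
                        f = m[r][col]
                        m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
                rank += 1
            return rank

        def empirical_success(k, p, trials=5000):
            """Fraction of random k x k matrices over GF(p) that are invertible."""
            ok = sum(rank_mod_p([[random.randrange(p) for _ in range(k)]
                                 for _ in range(k)], p) == k
                     for _ in range(trials))
            return ok / trials

        def exact_success(k, p):
            # Known closed form: prod_{i=1..k} (1 - p**(-i)).
            prob = 1.0
            for i in range(1, k + 1):
                prob *= 1.0 - p ** (-i)
            return prob

        for p in (2, 5, 17):
            print(f"GF({p}): empirical {empirical_success(3, p):.3f}, exact {exact_success(3, p):.3f}")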

  7. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  8. Coding conventions and principles for a National Land-Change Modeling Framework

    Science.gov (United States)

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  9. RADTRAN II: revised computer code to analyze transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.

    1982-10-01

    A revised and updated version of the RADTRAN computer code is presented. This code has the capability to predict the radiological impacts associated with specific schemes of radioactive material shipments and mode specific transport variables

  10. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  11. Multirate Filter Bank Representations of RS and BCH Codes

    Directory of Open Access Journals (Sweden)

    Van Meerbergen Geert

    2008-01-01

    Full Text Available Abstract This paper addresses the use of multirate filter banks in the context of error-correction coding. An in-depth study of these filter banks is presented, motivated by earlier results and applications based on the filter bank representation of Reed-Solomon (RS) codes, such as Soft-In Soft-Out RS-decoding or RS-OFDM. The specific structure of the filter banks (critical subsampling) is an important aspect in these applications. The goal of the paper is twofold. First, the filter bank representation of RS codes is now explained based on polynomial descriptions. This approach allows us to gain new insight in the correspondence between RS codes and filter banks. More specifically, it allows us to show that the inherent periodically time-varying character of a critically subsampled filter bank matches remarkably well with the cyclic properties of RS codes. Secondly, an extension of these techniques toward the more general class of BCH codes is presented. It is demonstrated that a BCH code can be decomposed into a sum of critically subsampled filter banks.

  12. Multirate Filter Bank Representations of RS and BCH Codes

    Directory of Open Access Journals (Sweden)

    Marc Moonen

    2009-01-01

    Full Text Available This paper addresses the use of multirate filter banks in the context of error-correction coding. An in-depth study of these filter banks is presented, motivated by earlier results and applications based on the filter bank representation of Reed-Solomon (RS) codes, such as Soft-In Soft-Out RS-decoding or RS-OFDM. The specific structure of the filter banks (critical subsampling) is an important aspect in these applications. The goal of the paper is twofold. First, the filter bank representation of RS codes is now explained based on polynomial descriptions. This approach allows us to gain new insight in the correspondence between RS codes and filter banks. More specifically, it allows us to show that the inherent periodically time-varying character of a critically subsampled filter bank matches remarkably well with the cyclic properties of RS codes. Secondly, an extension of these techniques toward the more general class of BCH codes is presented. It is demonstrated that a BCH code can be decomposed into a sum of critically subsampled filter banks.

  13. Technical specifications requirements: Automated reasoning applications

    International Nuclear Information System (INIS)

    Lidsky, L.M.; Dobrzeniecki, A.B.

    1990-03-01

    Several software systems were developed and tested to determine what advantages could be gained from explicitly translating complicated regulatory requirements into computerized relationships. The Technical Specifications for US nuclear power plants were chosen as the test-bed application domain, and two analysis systems were developed to monitor plant compliance with operational limits, and track and schedule equipment test and maintenance activities mandated by Technical Specifications. Choosing PROLOG as the computer language to represent these regulatory requirements resulted in a natural match between the semantic structure of the written specifications and the corollary coded rules. Additional research results affirmed the utility of declarative programming styles, explicit management of problem complexity, and attention to the robustness and flexibility of the overall software systems. 5 refs., 2 figs
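
    As a flavor of what "explicitly translating requirements into computerized relationships" looks like, the sketch below encodes one invented limiting-condition-for-operation style rule and checks plant status against it. The original work expressed such relationships in PROLOG; the Python rendering below, and the specific rule, component names, and dates, are illustrative assumptions only.

        from datetime import date, timedelta

        # Invented Technical-Specification-style rule: at least two of three emergency
        # pumps must be operable; an inoperable pump must be restored within 7 days.
        plant_status = {
            "pump_A": {"operable": True,  "inoperable_since": None},
            "pump_B": {"operable": False, "inoperable_since": date(2024, 1, 3)},
            "pump_C": {"operable": True,  "inoperable_since": None},
        }

        def check_lco(status, today, required=2, allowed_outage=timedelta(days=7)):
            operable = [n for n, s in status.items() if s["operable"]]
            findings = []
            if len(operable) < required:
                findings.append(f"LCO not met: only {len(operable)} of {required} pumps operable")
            for name, s in status.items():
                if not s["operable"] and today - s["inoperable_since"] > allowed_outage:
                    findings.append(f"{name}: allowed outage time exceeded")
            return findings or ["in compliance"]

        print(check_lco(plant_status, date(2024, 1, 12)))
        # -> ['pump_B: allowed outage time exceeded']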

  14. Audit of Clinical Coding of Major Head and Neck Operations

    Science.gov (United States)

    Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean

    2009-01-01

    INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period was assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944

  15. Biomass Determination Using Wood Specific Gravity from Increment Cores

    Science.gov (United States)

    Michael C. Wiemann; G. Bruce Williamson

    2013-01-01

    Wood specific gravity (SG) is one of the most important variables used to determine biomass. Measurement of SG is problematic because it requires tedious, and often difficult, sampling of wood from standing trees. Sampling is complicated because the SG usually varies nonrandomly within trees, resulting in systematic errors. Off-center pith and hollow or decayed stems...

  16. An evaluation of the effectiveness of the EPA comply code to demonstrate compliance with radionuclide emission standards at three manufacturing facilities

    International Nuclear Information System (INIS)

    Smith, L.R.; Laferriere, J.R.; Nagy, J.W.

    1991-01-01

    Measurements of airborne radionuclide emissions and associated environmental concentrations were made at, and in the vicinity of, two urban and one suburban facility where radiolabeled chemicals for biomedical research and radiopharmaceuticals are manufactured. Emission, environmental and meteorological measurements were used in the EPA COMPLY code and in environmental assessment models developed specifically for these sites to compare their ability to predict off-site measurements. The models and code were then used to determine the potential dose to hypothetical maximally exposed receptors, and the ability of these methods to demonstrate whether these facilities comply with proposed radionuclide emission standards was assessed. In no case did the models and code seriously underestimate off-site impacts. However, for certain radionuclides and chemical forms, the EPA COMPLY code was found to overestimate off-site impacts by such a large factor as to render its value questionable for determining regulatory compliance. Recommendations are offered for changing the code to enable it to be more serviceable to radionuclide users and regulators

  17. Predictive Bias and Sensitivity in NRC Fuel Performance Codes

    Energy Technology Data Exchange (ETDEWEB)

    Geelhood, Kenneth J.; Luscher, Walter G.; Senor, David J.; Cunningham, Mitchel E.; Lanning, Donald D.; Adkins, Harold E.

    2009-10-01

    The latest versions of the fuel performance codes, FRAPCON-3 and FRAPTRAN were examined to determine if the codes are intrinsically conservative. Each individual model and type of code prediction was examined and compared to the data that was used to develop the model. In addition, a brief literature search was performed to determine if more recent data have become available since the original model development for model comparison.

  18. Species specificity in major urinary proteins by parallel evolution.

    Directory of Open Access Journals (Sweden)

    Darren W Logan

    Full Text Available Species-specific chemosignals, pheromones, regulate social behaviors such as aggression, mating, pup-suckling, territory establishment, and dominance. The identity of these cues remains mostly undetermined and few mammalian pheromones have been identified. Genetically-encoded pheromones are expected to exhibit several different mechanisms for coding: (1) diversity, to enable the signaling of multiple behaviors, (2) dynamic regulation, to indicate age and dominance, and (3) species-specificity. Recently, the major urinary proteins (Mups) have been shown to function themselves as genetically-encoded pheromones to regulate species-specific behavior. Mups are multiple highly related proteins expressed in combinatorial patterns that differ between individuals, gender, and age, which are sufficient to fulfill the first two criteria. We have now characterized and fully annotated the mouse Mup gene content in detail. This has enabled us to further analyze the extent of Mup coding diversity and determine their potential to encode species-specific cues. Our results show that the mouse Mup gene cluster is composed of two subgroups: an older, more divergent class of genes and pseudogenes, and a second class with high sequence identity formed by recent sequential duplications of a single gene/pseudogene pair. Previous work suggests that truncated Mup pseudogenes may encode a family of functional hexapeptides with the potential for pheromone activity. Sequence comparison, however, reveals that they have limited coding potential. Similar analyses of nine other completed genomes find Mup gene expansions in divergent lineages, including those of rat, horse and grey mouse lemur, occurring independently from a single ancestral Mup present in other placental mammals. Our findings illustrate that increasing genomic complexity of the Mup gene family is not evolutionarily isolated, but is instead a recurring mechanism of generating coding diversity consistent with a species-specific

  19. Computer codes for level 1 probabilistic safety assessment

    International Nuclear Information System (INIS)

    1990-06-01

    Probabilistic Safety Assessment (PSA) entails several laborious tasks suitable for computer code assistance. This guide identifies these tasks, presents guidelines for selecting and utilizing computer codes in the conduct of the PSA tasks and for the use of PSA results in safety management, and provides information on available codes suggested or applied in performing PSA in nuclear power plants. The guidance is intended for use by nuclear power plant system engineers, safety and operating personnel, and regulators. Large efforts are made today to provide PC-based software systems and PSA processed information in a way that enables their use as a safety management tool by the nuclear power plant overall management. Guidelines on the characteristics of software needed by management to prepare software that meets their specific needs are also provided. Most of these computer codes are also applicable for PSA of other industrial facilities. The scope of this document is limited to computer codes used for the treatment of internal events. It does not address other codes available mainly for the analysis of external events (e.g. seismic analysis), flood and fire analysis. Codes discussed in the document are those used for probabilistic rather than for phenomenological modelling. It should also be appreciated that these guidelines are not intended to lead the user to the selection of one specific code; they simply provide criteria for the selection. Refs and tabs

  20. What determines the informativeness of firms' explanations for deviations from the Dutch corporate governance code?

    NARCIS (Netherlands)

    Hooghiemstra, R.B.H.

    2012-01-01

    The comply-or-explain principle is a common feature of corporate governance codes. While prior studies investigated compliance with corporate governance codes as well as the effects of compliance on firm behaviour and performance, explanations for deviations from a corporate governance code remain

  1. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    Science.gov (United States)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating and present some of the key architectural approaches being used to
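
    For orientation on the quoted parameters, the short Python sketch below computes the length, dimension, and minimum distance of a general Reed-Muller code RM(r, m); it is an illustrative side calculation, not part of the decoder work described in the report. For r = 3 and m = 6 it yields (64, 42, 8), the parent code of which the proposed (64, 40, 8) code is a subcode.

```python
from math import comb

def rm_parameters(r: int, m: int) -> tuple[int, int, int]:
    """Return (length, dimension, minimum distance) of the Reed-Muller code RM(r, m)."""
    n = 2 ** m                                   # block length
    k = sum(comb(m, i) for i in range(r + 1))    # dimension
    d = 2 ** (m - r)                             # minimum Hamming distance
    return n, k, d

if __name__ == "__main__":
    # RM(3, 6) is the parent code of the proposed (64, 40, 8) subcode.
    print(rm_parameters(3, 6))  # -> (64, 42, 8)
```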

  2. The use of the SRIM code for calculation of radiation damage induced by neutrons

    Science.gov (United States)

    Mohammadi, A.; Hamidi, S.; Asadabad, Mohsen Asadi

    2017-12-01

    Materials subjected to neutron irradiation undergo structural changes driven by the displacement cascades initiated by nuclear reactions. This study discusses a methodology for computing the primary knock-on atom (PKA) information that leads to radiation damage. A program, AMTRACK, has been developed for assessing the PKA information. This software determines the specifications of recoil atoms (using the PTRAC card of the MCNPX code) as well as the kinematics of the interactions. A deterministic method was used to verify the results of MCNPX+AMTRACK. The SRIM (formerly TRIM) code is capable of computing neutron-induced radiation damage. The PKA information extracted by the AMTRACK program can be used as input to the SRIM code for systematic analysis of primary radiation damage. The radiation damage to the reactor pressure vessel of the Bushehr Nuclear Power Plant (BNPP) is then calculated.
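
    The abstract does not spell out the damage model itself, but analyses of this kind ultimately convert each PKA damage energy into a displacement count, most commonly with the NRT formula. The sketch below illustrates only that final step, assuming a hypothetical list of PKA damage energies standing in for AMTRACK output; the threshold displacement energy E_d is material dependent (40 eV is a conventional value for iron and is used here purely as an example).

```python
def nrt_displacements(damage_energy_ev: float, e_d_ev: float = 40.0) -> float:
    """Frenkel pairs predicted by the NRT model for a single PKA.

    damage_energy_ev: damage (nuclear) energy of the PKA in eV.
    e_d_ev: threshold displacement energy of the target material in eV (assumed value).
    """
    if damage_energy_ev < e_d_ev:
        return 0.0                                      # below threshold: no stable displacement
    if damage_energy_ev < 2.0 * e_d_ev / 0.8:
        return 1.0                                      # single-displacement regime
    return 0.8 * damage_energy_ev / (2.0 * e_d_ev)      # cascade regime

if __name__ == "__main__":
    # Hypothetical PKA damage energies (eV) standing in for AMTRACK output.
    pka_energies = [25.0, 1.0e3, 5.0e4, 2.0e5]
    total = sum(nrt_displacements(t) for t in pka_energies)
    print(f"Total displacements from {len(pka_energies)} PKAs: {total:.1f}")
```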

  3. Circular codes revisited: a statistical approach.

    Science.gov (United States)

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has attracted great interest and undergone rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes, with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to answer several questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists great variability among sequences. Second, we focus on that code and explore the role played by the proportions of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored so as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes to reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
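
    To make the notion of covering capability concrete, the Python sketch below computes, for a given trinucleotide set, the fraction of codons it covers in each of the three reading frames of a sequence. The small codon set and sequence in the example are placeholders for illustration only, not the actual 20-trinucleotide code of Arquès and Michel.

```python
def frame_coverage(sequence: str, code: set[str]) -> list[float]:
    """Fraction of trinucleotides belonging to `code` in reading frames 0, 1 and 2."""
    sequence = sequence.upper()
    coverage = []
    for frame in range(3):
        codons = [sequence[i:i + 3] for i in range(frame, len(sequence) - 2, 3)]
        hits = sum(1 for codon in codons if codon in code)
        coverage.append(hits / len(codons) if codons else 0.0)
    return coverage

if __name__ == "__main__":
    # Placeholder codon set and sequence, for illustration only.
    toy_code = {"GAA", "GAT", "CTG", "AAC"}
    seq = "GAAGATCTGAACGAAGATCTGAAC"
    print(frame_coverage(seq, toy_code))
```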

  4. Using Administrative Mental Health Indicators in Heart Failure Outcomes Research: Comparison of Clinical Records and International Classification of Disease Coding.

    Science.gov (United States)

    Bender, Miriam; Smith, Tyler C

    2016-01-01

    Use of mental health indicators in health outcomes research is of growing interest to researchers. This study, as part of a larger research program, quantified agreement between administrative International Classification of Disease (ICD-9) coding for, and "gold standard" clinician documentation of, mental health issues (MHIs) in hospitalized heart failure (HF) patients to determine the validity of mental health administrative data for use in HF outcomes research. A 13% random sample (n = 504) was selected from all unique patients (n = 3,769) hospitalized with a primary HF diagnosis at 4 San Diego County community hospitals during 2009-2012. MHI was defined as ICD-9 discharge diagnostic coding 290-319. Records were audited for clinician documentation of MHI. A total of 43% (n = 216) had mental health clinician documentation; 33% (n = 164) had ICD-9 coding for MHI. The ICD-9 code bundle 290-319 had 0.70 sensitivity, 0.97 specificity, and kappa 0.69 (95% confidence interval 0.61-0.79). More specific ICD-9 MHI code bundles had kappas ranging from 0.44 to 0.82 and sensitivities ranging from 42% to 82%. Agreement between ICD-9 coding and clinician documentation for a broadly defined MHI is substantial, and can validly "rule in" MHI for hospitalized patients with heart failure. More specific MHI code bundles had fair to almost perfect agreement, with a wide range of sensitivities for identifying patients with an MHI. Copyright © 2016 Elsevier Inc. All rights reserved.
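
    For readers who want to reproduce agreement statistics of the kind reported above, the sketch below computes sensitivity, specificity, and Cohen's kappa from a 2x2 table of administrative coding versus clinician documentation. The counts in the example are arbitrary placeholders, not the study data.

```python
def agreement_stats(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Sensitivity, specificity and Cohen's kappa for a 2x2 agreement table.

    tp: coded and documented    fp: coded, not documented
    fn: not coded, documented   tn: neither coded nor documented
    """
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement computed from the marginal totals.
    p_chance = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (p_observed - p_chance) / (1.0 - p_chance)
    return {"sensitivity": sensitivity, "specificity": specificity, "kappa": kappa}

if __name__ == "__main__":
    # Arbitrary placeholder counts, for illustration only.
    print(agreement_stats(tp=40, fp=5, fn=10, tn=45))
```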

  5. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds

  6. The Purine Bias of Coding Sequences is Determined by Physicochemical Constraints on Proteins.

    Science.gov (United States)

    Ponce de Leon, Miguel; de Miranda, Antonio Basilio; Alvarez-Valin, Fernando; Carels, Nicolas

    2014-01-01

    For this report, we analyzed protein secondary structures in relation to the statistics of three nucleotide codon positions. The purpose of this investigation was to find which properties of the ribosome, tRNA or protein level, could explain the purine bias (Rrr) as it is observed in coding DNA. We found that the Rrr pattern is the consequence of a regularity (the codon structure) resulting from physicochemical constraints on proteins and thermodynamic constraints on ribosomal machinery. The physicochemical constraints on proteins mainly come from the hydropathy and molecular weight (MW) of secondary structures as well as the energy cost of amino acid synthesis. These constraints appear through a network of statistical correlations, such as (i) the cost of amino acid synthesis, which is in favor of a higher level of guanine in the first codon position, (ii) the constructive contribution of hydropathy alternation in proteins, (iii) the spatial organization of secondary structure in proteins according to solvent accessibility, (iv) the spatial organization of secondary structure according to amino acid hydropathy, (v) the statistical correlation of MW with protein secondary structures and their overall hydropathy, (vi) the statistical correlation of thymine in the second codon position with hydropathy and the energy cost of amino acid synthesis, and (vii) the statistical correlation of adenine in the second codon position with amino acid complexity and the MW of secondary protein structures. Amino acid physicochemical properties and functional constraints on proteins constitute a code that is translated into a purine bias within the coding DNA via tRNAs. In that sense, the Rrr pattern within coding DNA is the effect of information transfer on nucleotide composition from protein to DNA by selection according to the codon positions. Thus, coding DNA structure and ribosomal machinery co-evolved to minimize the energy cost of protein coding given the functional
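
    The purine bias (Rrr) discussed above is ultimately a statement about nucleotide frequencies at the three codon positions. The Python sketch below tallies the purine (A/G) fraction at each codon position of an in-frame coding sequence; it is an illustrative calculation only, and the example sequence is a placeholder.

```python
def purine_fraction_by_position(cds: str) -> list[float]:
    """Fraction of purines (A or G) at codon positions 1, 2 and 3 of an in-frame CDS."""
    cds = cds.upper()
    counts = [0, 0, 0]
    totals = [0, 0, 0]
    for i in range(0, len(cds) - len(cds) % 3, 3):
        for pos in range(3):
            totals[pos] += 1
            if cds[i + pos] in "AG":
                counts[pos] += 1
    return [c / t if t else 0.0 for c, t in zip(counts, totals)]

if __name__ == "__main__":
    # Placeholder coding sequence, for illustration only.
    print(purine_fraction_by_position("ATGGCTAAAGGTGAAGATCTGTAA"))
```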

  7. A graph model for opportunistic network coding

    KAUST Repository

    Sorour, Sameh

    2015-08-12

    © 2015 IEEE. Recent advancements in the graph-based analysis and solution of instantly decodable network coding (IDNC) have triggered interest in extending them to more complicated opportunistic network coding (ONC) scenarios with a limited increase in complexity. In this paper, we design a simple IDNC-like graph model for a specific subclass of ONC by introducing a more generalized definition of its vertices and the notion of vertex aggregation in order to represent the storage of non-instantly-decodable packets in ONC. Based on this representation, we determine the set of pairwise vertex adjacency conditions that can populate this graph with edges so as to guarantee decodability or aggregation for the vertices of each clique in this graph. We then develop the algorithmic procedures that can be applied on the designed graph model to optimize any performance metric for this ONC subclass. A case study on reducing the completion time shows that the proposed framework improves on the performance of IDNC and gets very close to the optimal performance.
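
    As background for the graph model, the standard IDNC construction places one vertex for every (receiver, wanted packet) pair and connects two vertices whenever the corresponding packets can be combined in a single XOR without harming either receiver. The sketch below builds such a graph with networkx and picks a maximum clique to form one coded transmission; it illustrates plain IDNC only, not the vertex-aggregation extension proposed in the paper, and the 'wants'/'has' side information is hypothetical.

```python
import networkx as nx

def build_idnc_graph(wants: dict[str, set[int]], has: dict[str, set[int]]) -> nx.Graph:
    """IDNC graph: one vertex (receiver, packet) per wanted packet; two vertices are
    adjacent if the packets are identical, or each receiver already has the other packet."""
    g = nx.Graph()
    vertices = [(r, p) for r, missing in wants.items() for p in missing]
    g.add_nodes_from(vertices)
    for i, (r1, p1) in enumerate(vertices):
        for r2, p2 in vertices[i + 1:]:
            if r1 == r2:
                continue
            if p1 == p2 or (p2 in has[r1] and p1 in has[r2]):
                g.add_edge((r1, p1), (r2, p2))
    return g

if __name__ == "__main__":
    # Hypothetical side information: which packets each receiver still wants / already has.
    wants = {"u1": {1}, "u2": {2}, "u3": {1, 3}}
    has = {"u1": {2, 3}, "u2": {1, 3}, "u3": {2}}
    g = build_idnc_graph(wants, has)
    clique = max(nx.find_cliques(g), key=len)       # largest clique -> packets to XOR together
    print("Serve with one XOR of packets:", sorted({p for _, p in clique}))
```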

  8. A reflexive exploration of two qualitative data coding techniques

    Directory of Open Access Journals (Sweden)

    Erik Blair

    2016-01-01

    Full Text Available In an attempt to help find meaning within qualitative data, researchers commonly start by coding their data. There are a number of coding systems available to researchers and this reflexive account explores my reflections on the use of two such techniques. As part of a larger investigation, two pilot studies were undertaken as a means to examine the relative merits of open coding and template coding for examining transcripts. This article does not describe the research project per se but attempts to step back and offer a reflexive account of the development of data coding tools. Here I reflect upon and evaluate the two data coding techniques that were piloted, and discuss how using appropriate aspects of both led to the development of my final data coding approach. My exploration found there was no clear-cut ‘best’ option but that the data coding techniques needed to be reflexively-aligned to meet the specific needs of my project. This reflection suggests that, when coding qualitative data, researchers should be methodologically thoughtful when they attempt to apply any data coding technique; that they do not assume pre-established tools are aligned to their particular paradigm; and that they consider combining and refining established techniques as a means to define their own specific codes. DOI: 10.2458/azu_jmmss.v6i1.18772

  9. CRUCIB: an axisymmetric convection code

    International Nuclear Information System (INIS)

    Bertram, L.A.

    1975-03-01

    The CRUCIB code was written in support of an experimental program aimed at measurement of the thermal diffusivities of refractory liquids. Precise values of diffusivity are necessary for realistic analysis of reactor safety problems, nuclear waste disposal procedures, and fundamental metal forming processes. The code calculates the axisymmetric transient convective motions produced in a right circular cylindrical crucible which is surface heated by an annular heat pulse. Emphasis of this report is placed on the input-output options of the CRUCIB code, which are tailored to assess the importance of convective heat transfer in determining the surface temperature distribution. Use is limited to Prandtl numbers less than unity; larger values can be accommodated by replacement of a single block of the code, if desired. (U.S.)

  10. The complete mitochondrial genome of the land snail Cornu aspersum (Helicidae: Mollusca: intra-specific divergence of protein-coding genes and phylogenetic considerations within Euthyneura.

    Directory of Open Access Journals (Sweden)

    Juan Diego Gaitán-Espitia

    Full Text Available The complete sequences of three mitochondrial genomes from the land snail Cornu aspersum were determined. The mitogenome has a length of 14050 bp, and it encodes 13 protein-coding genes, 22 transfer RNA genes and two ribosomal RNA genes. It also includes nine small intergene spacers and a large AT-rich intergenic spacer. The intra-specific divergence analysis revealed that COX1 has the lowest genetic differentiation, while the most divergent genes were NADH1, NADH3 and NADH4. With the exception of Euhadra herklotsi, the structural comparisons showed the same gene order within the family Helicidae, and a nearly identical gene organization to that found in the order Pulmonata. Phylogenetic reconstruction recovered Basommatophora as a polyphyletic group, and Eupulmonata and Pulmonata as paraphyletic groups. Bayesian and Maximum Likelihood analyses showed that C. aspersum is a close relative of Cepaea nemoralis and, together with the other Helicidae species, forms a sister group of Albinaria caerulea, supporting the monophyly of the Stylommatophora clade.

  11. HELIAS module development for systems codes

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, F., E-mail: Felix.Warmer@ipp.mpg.de; Beidler, C.D.; Dinklage, A.; Egorov, K.; Feng, Y.; Geiger, J.; Schauer, F.; Turkin, Y.; Wolf, R.; Xanthopoulos, P.

    2015-02-15

    In order to study and design next-step fusion devices such as DEMO, comprehensive systems codes are commonly employed. In this work HELIAS-specific models are proposed which are designed to be compatible with systems codes. The subsequently developed models include: a geometry model based on Fourier coefficients which can represent the complex 3-D plasma shape, a basic island divertor model which assumes diffusive cross-field transport and high radiation at the X-point, and a coil model which combines scaling aspects based on the HELIAS 5-B reactor design in combination with analytic inductance and field calculations. In addition, stellarator-specific plasma transport is discussed. A strategy is proposed which employs a predictive confinement time scaling derived from 1-D neoclassical and 3-D turbulence simulations. This paper reports on the progress of the development of the stellarator-specific models while an implementation and verification study within an existing systems code will be presented in a separate work. This approach is investigated to ultimately allow one to conduct stellarator system studies, develop design points of HELIAS burning plasma devices, and to facilitate a direct comparison between tokamak and stellarator DEMO and power plant designs.
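
    For orientation, a geometry model "based on Fourier coefficients" typically expresses the 3-D plasma boundary as truncated Fourier series in the poloidal and toroidal angles. The sketch below evaluates such a boundary from a small, hypothetical coefficient table; it is a generic stellarator-style parameterisation for illustration, not the actual HELIAS geometry module.

```python
import numpy as np

def boundary_point(theta: float, phi: float,
                   r_mn: dict[tuple[int, int], float],
                   z_mn: dict[tuple[int, int], float]) -> tuple[float, float]:
    """Evaluate R(theta, phi) and Z(theta, phi) of a stellarator-symmetric boundary,
    given cosine coefficients r_mn and sine coefficients z_mn for modes (m, n)."""
    r = sum(c * np.cos(m * theta - n * phi) for (m, n), c in r_mn.items())
    z = sum(c * np.sin(m * theta - n * phi) for (m, n), c in z_mn.items())
    return float(r), float(z)

if __name__ == "__main__":
    # Hypothetical coefficients: major radius 22 m, minor radius ~1.8 m, small helical term.
    r_mn = {(0, 0): 22.0, (1, 0): 1.8, (1, 1): 0.4}
    z_mn = {(1, 0): 1.8, (1, 1): -0.4}
    print(boundary_point(theta=0.3, phi=0.1, r_mn=r_mn, z_mn=z_mn))
```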

  12. The VULKIN code used for evaluation of the cladding tube's performance

    International Nuclear Information System (INIS)

    Marbach, G.

    1979-01-01

    Full text: 1 - Introduction. The French approach for the fast subassembly project is to analyse each component part of the subassembly and each basic phenomenon in order to estimate the total behaviour. The VULKIN code describes the mechanical behaviour of the clad alone. A cladding damage parameter is calculated from the observed deformations. When it is greater than a fixed value, we consider that the rupture probability is not negligible. This function, however, is not the only limit for the irradiation project; other limits are related to other problems: no fuel melting, bundle interaction behaviour. 2 - VULKIN code - Presentation. The VULKIN code gives the evolution of the stress and strain distributions in the thickness of the clad under the hypothesis of revolution symmetry. This program takes into account thermal expansion and the radial thermal gradient, fission gas pressure and steel swelling due to the neutron flux. The fuel-clad mechanical interaction is not described by this model. Experimental results show that its influence is negligible for the most usual subassemblies but, if necessary, a special calculation is performed using a specific code like TUREN, described in another paper. This model does not consider the stresses and strains resulting from interaction between bundle and wrapper. Another model describes the bundle behaviour and determines the diametral deformation limit from the subassembly geometrical characteristics. The clad is considered as an elasto-plastic element. Instantaneous plastic flow, thermal creep and irradiation creep are determined at each time step. The data of this code are the geometry, the irradiation parameters (temperature, dose), the fission gas pressure evolution, the swelling law and the experimental relations for thermal and irradiation creep. The mechanical resolution is classical: the clad is divided into concentric rings, and at each time step the equations resulting from the equilibrium of forces and the compatibility of displacements are solved

  13. Ethical Code Effectiveness in Football Clubs: A Longitudinal Analysis

    OpenAIRE

    Constandt, Bram; De Waegeneer, Els; Willem, Annick

    2017-01-01

    As football (soccer) clubs are facing different ethical challenges, many clubs are turning to ethical codes to counteract unethical behaviour. However, both in- and outside the sport field, uncertainty remains about the effectiveness of these ethical codes. For the first time, a longitudinal study design was adopted to evaluate code effectiveness. Specifically, a sample of non-professional football clubs formed the subject of our inquiry. Ethical code effectiveness was...

  14. Capital Structure around the World: The Roles of Firm- and Country-Specific Determinants

    NARCIS (Netherlands)

    A. de Jong (Abe); R. Kabir (Rezaul); T.T. Nguyen (Thuy Thu)

    2007-01-01

    textabstractWe analyze the importance of firm-specific and country-specific factors in the leverage choice of firms from 42 countries around the world. Our analysis yields two new results. First, we find that firm-specific determinants of leverage differ across countries, while prior studies

  15. Determination and Application of Comprehensive Specific Frictional Resistance in Heating Engineering

    Directory of Open Access Journals (Sweden)

    Yanan Tian

    2018-01-01

    Full Text Available In this study, we analyze the deficiencies of specific frictional resistance in heating engineering. Based on economic specific frictional resistance, we put forward the concept of comprehensive specific frictional resistance, which considers the multiple factors of technology, economy, regulation modes, pipe segment differences, and medium pressure. Then, we establish a mathematical model of a heating network across its lifespan in order to develop a method for determining the comprehensive specific frictional resistance. Relevant conclusions can be drawn from the results. As an application, we have planned the heating engineering for Yangyuan County in China, which demonstrates the feasibility and superiority of the method.

  16. Tokamak plasma power balance calculation code (TPC code) outline and operation manual

    International Nuclear Information System (INIS)

    Fujieda, Hirobumi; Murakami, Yoshiki; Sugihara, Masayoshi.

    1992-11-01

    This report is a detailed description of the TPC code, which calculates the power balance of a tokamak plasma according to the ITER guidelines. The TPC code runs on a personal computer (Macintosh or J-3100/IBM-PC). Using input data such as the plasma shape, toroidal magnetic field, plasma current, electron temperature, electron density, impurities and heating power, the TPC code can determine the operation point of the fusion reactor (the ion temperature is assumed to be equal to the electron temperature). The supplied flux (volt·seconds) and burn time are also estimated from the coil design parameters. The calculated energy confinement time is compared with various L-mode scaling laws and the confinement enhancement factor (H-factor) is evaluated. The divertor heat load is predicted by using simple scaling models (constant-χ, Bohm-type-χ and JT-60U empirical scaling models). Frequently used data can be stored in a 'device file' and used as default values. The TPC code can generate 2-D mesh data, and the POPCON plot is drawn by a contour line plotting program (CONPLT). The operation manual for the CONPLT code is also included. (author)
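
    The H-factor evaluation mentioned above amounts to dividing the confinement time implied by the power balance by the value predicted from a scaling law. The sketch below shows that step in its simplest steady-state form, with the scaling law supplied by the caller; the numbers and the placeholder scaling in the example are illustrative assumptions, not the ITER guideline expressions used by TPC.

```python
from typing import Callable

def h_factor(stored_energy_mj: float, loss_power_mw: float,
             scaling_tau_s: Callable[[], float]) -> tuple[float, float]:
    """Energy confinement time from the steady-state power balance and its enhancement factor.

    tau_E = W / P_loss; H = tau_E / tau_scaling.
    """
    tau_e = stored_energy_mj / loss_power_mw       # MJ / MW = seconds
    return tau_e, tau_e / scaling_tau_s()

if __name__ == "__main__":
    # Placeholder L-mode scaling value (seconds), standing in for an actual scaling-law evaluation.
    placeholder_scaling = lambda: 1.9
    tau_e, h = h_factor(stored_energy_mj=350.0, loss_power_mw=120.0,
                        scaling_tau_s=placeholder_scaling)
    print(f"tau_E = {tau_e:.2f} s, H-factor = {h:.2f}")
```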

  17. Implantation of MC2 computer code

    International Nuclear Information System (INIS)

    Seehusen, J.; Nair, R.P.K.; Becceneri, J.C.

    1981-01-01

    The implantation of the MC2 computer code in the CDC system is presented. The MC2 computer code calculates multigroup cross sections for typical compositions of fast reactors. The multigroup constants are calculated using solutions of the P1 or B1 approximations for a given buckling value as the weighting function. (M.C.K.) [pt

  18. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of LT codes, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences to the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
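
    To illustrate the idea of driving LT encoding with a chaotic sequence, the sketch below uses a single Kent (skew tent) map as the pseudo-random source: one draw selects the degree from a degree distribution and further draws select the neighbours XOR-ed into the encoding symbol. This is a simplification of the paper's scheme, which uses two separate maps, and the degree distribution shown is a placeholder rather than a tuned robust soliton distribution.

```python
import functools
import operator

def kent_map(x: float, a: float = 0.7) -> float:
    """One iteration of the Kent (skew tent) chaotic map on (0, 1)."""
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def lt_encode_symbol(source: list[int], degree_cdf: list[tuple[int, float]],
                     x: float) -> tuple[int, list[int], float]:
    """Produce one LT encoding symbol, using the chaotic state x for all random choices.

    degree_cdf: list of (degree, cumulative probability) pairs.
    Returns (encoded value, chosen neighbour indices, updated chaotic state).
    """
    x = kent_map(x)
    degree = next(d for d, c in degree_cdf if x <= c)   # sample the degree
    neighbours = []
    while len(neighbours) < degree:                     # sample distinct neighbour indices
        x = kent_map(x)
        idx = int(x * len(source))
        if idx not in neighbours:
            neighbours.append(idx)
    value = functools.reduce(operator.xor, (source[i] for i in neighbours))
    return value, neighbours, x

if __name__ == "__main__":
    data = [0x3A, 0x5C, 0x11, 0xF0]                 # toy source symbols (one byte each)
    cdf = [(1, 0.3), (2, 0.7), (3, 0.9), (4, 1.0)]  # placeholder degree distribution
    state = 0.123456
    for _ in range(3):
        value, nbrs, state = lt_encode_symbol(data, cdf, state)
        print(f"encoded=0x{value:02X} from neighbours {nbrs}")
```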

  19. Methodology, status and plans for development and assessment of TUF and CATHENA codes

    Energy Technology Data Exchange (ETDEWEB)

    Luxat, J.C.; Liu, W.S.; Leung, R.K. [Ontario Hydro, Toronto (Canada)] [and others]

    1997-07-01

    An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally a process for systematic assessment of these codes is described which is part of a broader, industry based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.

  20. Methodology, status and plans for development and assessment of TUF and CATHENA codes

    International Nuclear Information System (INIS)

    Luxat, J.C.; Liu, W.S.; Leung, R.K.

    1997-01-01

    An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally a process for systematic assessment of these codes is described which is part of a broader, industry based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area

  1. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  2. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  3. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

    The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE Survey of Software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Agency/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications

  4. Capital structure around the world: The roles of firm- and country-specific determinants

    NARCIS (Netherlands)

    de Jong, Abe; Kabir, Mohammed Rezaul; Nguyen, Thuy Thu

    2008-01-01

    We analyze the importance of firm-specific and country-specific factors in the leverage choice of firms from 42 countries around the world. Our analysis yields two new results. First, we find that firm-specific determinants of leverage differ across countries, while prior studies implicitly assume

  5. Promoter Analysis Reveals Globally Differential Regulation of Human Long Non-Coding RNA and Protein-Coding Genes

    KAUST Repository

    Alam, Tanvir

    2014-10-02

    Transcriptional regulation of protein-coding genes is increasingly well-understood on a global scale, yet no comparable information exists for long non-coding RNA (lncRNA) genes, which were recently recognized to be as numerous as protein-coding genes in mammalian genomes. We performed a genome-wide comparative analysis of the promoters of human lncRNA and protein-coding genes, finding global differences in specific genetic and epigenetic features relevant to transcriptional regulation. These two groups of genes are hence subject to separate transcriptional regulatory programs, including distinct transcription factor (TF) proteins that significantly favor lncRNA, rather than coding-gene, promoters. We report a specific signature of promoter-proximal transcriptional regulation of lncRNA genes, including several distinct transcription factor binding sites (TFBS). Experimental DNase I hypersensitive site profiles are consistent with active configurations of these lncRNA TFBS sets in diverse human cell types. TFBS ChIP-seq datasets confirm the binding events that we predicted using computational approaches for a subset of factors. For several TFs known to be directly regulated by lncRNAs, we find that their putative TFBSs are enriched at lncRNA promoters, suggesting that the TFs and the lncRNAs may participate in a bidirectional feedback loop regulatory network. Accordingly, cells may be able to modulate lncRNA expression levels independently of mRNA levels via distinct regulatory pathways. Our results also raise the possibility that, given the historical reliance on protein-coding gene catalogs to define the chromatin states of active promoters, a revision of these chromatin signature profiles to incorporate expressed lncRNA genes is warranted in the future.

  6. The use of best estimate codes to improve the simulation in real time

    International Nuclear Information System (INIS)

    Rivero, N.; Esteban, J. A.; Lenhardt, G.

    2007-01-01

    Best estimate codes are assumed to be the technology solution providing the most realistic and accurate response. Best estimate technology provides a complementary solution to the conservative simulation technology usually applied to determine plant safety margins and perform safety-related studies. Tecnatom, in the early 90's within the MAS project, pioneered the initiative to implement best estimate codes in its training simulators. The result of this project was the implementation of the first six-equation thermal-hydraulic code worldwide (TRAC-RT) running in a training environment. To meet real-time and other specific training requirements, it was necessary to overcome important difficulties. Tecnatom has just adapted the Global Nuclear Fuel core design code PANAC11, and is about to complete the adaptation of the General Electric TRACG04 thermal-hydraulic code. This technology features a unique solution for nuclear plants aiming at providing the highest fidelity in simulation, enabling the simulator to be considered a multipurpose simulation platform for both engineering and training. In addition, a visual environment designed to optimize the models' life cycle, covering both pre- and post-processing activities, is in its late development phase. (Author)

  7. Towards a universal code formatter through machine learning

    NARCIS (Netherlands)

    Parr, T. (Terence); J.J. Vinju (Jurgen)

    2016-01-01

    textabstractThere are many declarative frameworks that allow us to implement code formatters relatively easily for any specific language, but constructing them is cumbersome. The first problem is that "everybody" wants to format their code differently, leading to either many formatter variants or a

  8. OM Code Requirements For MOVs -- OMN-1 and Appendix III

    Energy Technology Data Exchange (ETDEWEB)

    Kevin G. DeWall

    2011-08-01

    The purpose or scope of the ASME OM Code is to establish the requirements for pre-service and in-service testing of nuclear power plant components to assess their operational readiness. For MOVs this includes those that perform a specific function in shutting down a reactor to the safe shutdown condition, maintaining the safe shutdown condition, and mitigating the consequences of an accident. This paper will present a brief history of industry and regulatory activities related to MOVs and the development of Code requirements to address weaknesses in earlier versions of the OM Code. The paper will discuss the MOV requirements contained in the 2009 version of ASME OM Code, specifically Mandatory Appendix III and OMN-1, Revision 1.

  9. OM Code Requirements For MOVs -- OMN-1 and Appendix III

    International Nuclear Information System (INIS)

    DeWall, Kevin G.

    2011-01-01

    The purpose or scope of the ASME OM Code is to establish the requirements for pre-service and in-service testing of nuclear power plant components to assess their operational readiness. For MOVs this includes those that perform a specific function in shutting down a reactor to the safe shutdown condition, maintaining the safe shutdown condition, and mitigating the consequences of an accident. This paper will present a brief history of industry and regulatory activities related to MOVs and the development of Code requirements to address weaknesses in earlier versions of the OM Code. The paper will discuss the MOV requirements contained in the 2009 version of ASME OM Code, specifically Mandatory Appendix III and OMN-1, Revision 1.

  10. Implications of Sepedi/English code switching for ASR systems

    CSIR Research Space (South Africa)

    Modipa, TI

    2013-12-01

    Full Text Available . We also perform an initial acoustic analysis to determine the impact of such code switching on speech recognition performance. We nd that the frequency of code switching is unexpectedly high, and that the continuum of code switching (from unmodi ed...

  11. A novel neutron energy spectrum unfolding code using particle swarm optimization

    International Nuclear Information System (INIS)

    Shahabinejad, H.; Sohrabpour, M.

    2017-01-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code had previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO code has been shown to be nearly two times faster than the TGASU code. - Highlights: • Introducing a novel method for neutron spectrum unfolding. • Implementation of a particle swarm optimization code for neutron unfolding. • Comparing results of the PSO code with those of the recently published TGASU code. • Results of the PSO code match those of the TGASU code. • Greater convergence rate of the implemented PSO code than the TGASU code.
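
    The abstract gives no algorithmic detail, but the underlying optimisation problem is easy to state: find a non-negative spectrum phi that minimises the mismatch between the measured pulse-height distribution and R·phi. The sketch below is a bare-bones particle swarm doing exactly that on a random toy problem; it is an illustration of the idea only, not the SDPSO code itself.

```python
import numpy as np

def pso_unfold(response: np.ndarray, measured: np.ndarray,
               n_particles: int = 40, n_iter: int = 500, seed: int = 0) -> np.ndarray:
    """Unfold a spectrum phi >= 0 minimising ||response @ phi - measured|| with a basic PSO."""
    rng = np.random.default_rng(seed)
    n_bins = response.shape[1]
    pos = rng.uniform(0.0, measured.max(), size=(n_particles, n_bins))  # candidate spectra
    vel = np.zeros_like(pos)
    cost = lambda p: np.linalg.norm(response @ p - measured)
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                                           # inertia and acceleration weights
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, None)                             # keep the spectrum non-negative
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_phi = np.array([1.0, 3.0, 2.0, 0.5])
    R = rng.uniform(0.0, 1.0, size=(8, 4))     # toy response matrix (8 channels, 4 energy bins)
    measured = R @ true_phi
    print("true    :", true_phi)
    print("unfolded:", np.round(pso_unfold(R, measured), 2))
```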

  12. Specificity determinants for the abscisic acid response element.

    Science.gov (United States)

    Sarkar, Aditya Kumar; Lahiri, Ansuman

    2013-01-01

    Abscisic acid (ABA) response elements (ABREs) are a group of cis-acting DNA elements that have been identified from promoter analysis of many ABA-regulated genes in plants. We are interested in understanding the mechanism of binding specificity between ABREs and a class of bZIP transcription factors known as ABRE binding factors (ABFs). In this work, we have modeled the homodimeric structure of the bZIP domain of ABRE binding factor 1 from Arabidopsis thaliana (AtABF1) and studied its interaction with ACGT core motif-containing ABRE sequences. We have also examined the variation in the stability of the protein-DNA complex upon mutating ABRE sequences using the protein design algorithm FoldX. The high throughput free energy calculations successfully predicted the ability of ABF1 to bind to alternative core motifs like GCGT or AAGT and also rationalized the role of the flanking sequences in determining the specificity of the protein-DNA interaction.

  13. SCAMPI: A code package for cross-section processing

    International Nuclear Information System (INIS)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-01-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis

  14. SCAMPI: A code package for cross-section processing

    Energy Technology Data Exchange (ETDEWEB)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-04-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.

  15. Variable code gamma ray imaging system

    International Nuclear Information System (INIS)

    Macovski, A.; Rosenfeld, D.

    1979-01-01

    A gamma-ray source distribution in the body is imaged onto a detector using an array of apertures. The transmission of each aperture is modulated using a code such that the individual views of the source through each aperture can be decoded and separated. The codes are chosen to maximize the signal to noise ratio for each source distribution. These codes determine the photon collection efficiency of the aperture array. Planar arrays are used for volumetric reconstructions and circular arrays for cross-sectional reconstructions. 14 claims

  16. The commerce of professional psychology and the new ethics code.

    Science.gov (United States)

    Koocher, G P

    1994-11-01

    The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.

  17. SEVERAL OBSERVATIONS REGARDING THE REGULATION OF THE CONTRACT OF PARTNERSHIP IN THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    IOLANDA-ELENA CADARIU-LUNGU

    2012-05-01

    Full Text Available Following the model of the Italian Civil Code, of the Civil Code from Quebec, the Swiss and the Dutch ones, the new Romanian Civil Code has adopted the monist conception of regulating the private law relationships, gathering in the same normative act traditional civil law dispositions as well as dispositions that are specific to the commercial relationships among professionals. In this regulating context, one of the fundamental changes the new Civil Code brings is the unification of the legal regime applicable to civil and commercial contracts, with all the consequences that derive from this new legislative approach. This fundamental modification is first determined by the profound change of the character of social, economic and juridical relationships, by the change of the cultural level of the Romanian society, by the closeness of the two branches of civil and commercial law and, last but not least, by the evolution of the business environment. In this line of thought, we can identify important changes in the matter of the contract of partnership which, as regulated by the new Civil Code, constitutes the common law both for the simple partnerships (former civil societies as well as for the commercial companies, to which the special legislation still in force in the matter still applies. In this study we aimed at analyzing the general common features of all associative forms listed by art. 1.888 Civil Code and the new elements in the matter, with critical observations where needed, which take the form of a comparison with the specific legislation in the field from the Civil Codes that served as a source of inspiration for the Romanian legislator.

  18. 29 CFR 4.5 - Contract specification of determined minimum wages and fringe benefits.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Contract specification of determined minimum wages and... of determined minimum wages and fringe benefits. (a) Any contract in excess of $2,500 shall contain, as an attachment, the applicable, currently effective wage determination specifying the minimum wages...

  19. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  20. Coded communications with nonideal interleaving

    Science.gov (United States)

    Laufer, Shaul

    1991-02-01

    Burst error channels - a type of block interference channel - feature increasing capacity but decreasing cutoff rate as the memory length increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial-band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of asymmetrical background noise and a reduced collision error rate were studied as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft-decision diversity combining technique is introduced.

  1. Broadcasting a Common Message with Variable-Length Stop-Feedback codes

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Yang, Wei; Durisi, Giuseppe

    2015-01-01

    We investigate the maximum coding rate achievable over a two-user broadcast channel for the scenario where a common message is transmitted using variable-length stop-feedback codes. Specifically, upon decoding the common message, each decoder sends a stop signal to the encoder, which transmits continuously until it receives both stop signals. For the point-to-point case, Polyanskiy, Poor, and Verdú (2011) recently demonstrated that variable-length coding combined with stop feedback significantly increases the speed at which the maximum coding rate converges to capacity. This speed-up manifests itself in the absence of a square-root penalty in the asymptotic expansion of the maximum coding rate for large blocklengths, a result also known as zero dispersion. In this paper, we show that this speed-up does not necessarily occur for the broadcast channel with common message. Specifically

  2. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  3. Infinity-Norm Permutation Covering Codes from Cyclic Groups

    OpenAIRE

    Karni, Ronen; Schwartz, Moshe

    2017-01-01

    We study covering codes of permutations with the $\ell_\infty$-metric. We provide a general code construction, which uses smaller building-block codes. We study cyclic transitive groups as building blocks, determining their exact covering radius, and showing linear-time algorithms for finding a covering codeword. We also bound the covering radius of relabeled cyclic transitive groups under conjugation.
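
    The covering radius in question can be computed exhaustively for small n: for every permutation of S_n, take the distance to the nearest codeword under the $\ell_\infty$ metric, then take the worst case. The brute-force sketch below does this for the cyclic group generated by the full cycle, purely to illustrate the definition; it is not the linear-time algorithm of the paper and is feasible only for very small n.

```python
from itertools import permutations

def linf_distance(p: tuple[int, ...], q: tuple[int, ...]) -> int:
    """ell-infinity distance between two permutations in one-line notation."""
    return max(abs(a - b) for a, b in zip(p, q))

def cyclic_group(n: int) -> list[tuple[int, ...]]:
    """The n cyclic shifts of the identity permutation (0, 1, ..., n-1)."""
    identity = tuple(range(n))
    return [identity[k:] + identity[:k] for k in range(n)]

def covering_radius(code: list[tuple[int, ...]], n: int) -> int:
    """Max over all permutations of the distance to the nearest codeword."""
    return max(min(linf_distance(p, c) for c in code) for p in permutations(range(n)))

if __name__ == "__main__":
    n = 5
    print(covering_radius(cyclic_group(n), n))   # exhaustive check for a small example
```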

  4. Balanced and sparse Tamo-Barg codes

    KAUST Repository

    Halbawi, Wael; Duursma, Iwan; Dau, Hoang; Hassibi, Babak

    2017-01-01

    We construct balanced and sparse generator matrices for Tamo and Barg's Locally Recoverable Codes (LRCs). More specifically, for a cyclic Tamo-Barg code of length n, dimension k and locality r, we show how to deterministically construct a generator matrix where the number of nonzeros in any two columns differs by at most one, and where the weight of every row is d + r - 1, where d is the minimum distance of the code. Since LRCs are designed mainly for distributed storage systems, the results presented in this work provide a computationally balanced and efficient encoding scheme for these codes. The balanced property ensures that the computational effort exerted by any storage node is essentially the same, whilst the sparse property ensures that this effort is minimal. The work presented in this paper extends a similar result previously established for Reed-Solomon (RS) codes, where it is now known that any cyclic RS code possesses a generator matrix that is balanced as described, but is sparsest, meaning that each row has d nonzeros.

  5. Balanced and sparse Tamo-Barg codes

    KAUST Repository

    Halbawi, Wael

    2017-08-29

    We construct balanced and sparse generator matrices for Tamo and Barg's Locally Recoverable Codes (LRCs). More specifically, for a cyclic Tamo-Barg code of length n, dimension k and locality r, we show how to deterministically construct a generator matrix where the number of nonzeros in any two columns differs by at most one, and where the weight of every row is d + r - 1, where d is the minimum distance of the code. Since LRCs are designed mainly for distributed storage systems, the results presented in this work provide a computationally balanced and efficient encoding scheme for these codes. The balanced property ensures that the computational effort exerted by any storage node is essentially the same, whilst the sparse property ensures that this effort is minimal. The work presented in this paper extends a similar result previously established for Reed-Solomon (RS) codes, where it is now known that any cyclic RS code possesses a generator matrix that is balanced as described, but is sparsest, meaning that each row has d nonzeros.

  6. Auditing Consistency and Usefulness of LOINC Use among Three Large Institutions - Using Version Spaces for Grouping LOINC Codes

    Science.gov (United States)

    Lin, M.C.; Vreeman, D.J.; Huff, S.M.

    2012-01-01

    Objectives We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) Determine if any contradictory knowledge exists in LOINC. 2) Determine how many LOINC codes were used in a truly interoperable fashion between systems. 3) Provide suggestions for improving the semantic interoperability of LOINC. Methods We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693 as obtained from ARUP, Intermountain and Regenstrief respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – No loss of meaning, complete information was exchanged by identical codes. 2) Level II – No loss of meaning, but processing of data was needed to make the data completely comparable. 3) Level III – Some loss of meaning. For example, tests with a specific ‘method’ could be rolled up with tests that were ‘methodless’. Conclusions There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add

  7. Bar-code automated waste tracking system

    International Nuclear Information System (INIS)

    Hull, T.E.

    1994-10-01

    The Bar-Code Automated Waste Tracking System was designed to be a site-specific program with a general-purpose application for transportability to other facilities. The system is user-friendly, totally automated, and incorporates the use of a drive-up window that is close to the areas dealing in container preparation, delivery, pickup, and disposal. The system features 'stop-and-go' operation rather than long, tedious, error-prone manual entry. The system is designed for automation but allows operators to concentrate on proper handling of waste while maintaining manual entry of data as a backup. A large wall plaque filled with bar-code labels is used to input specific details about any movement of waste.

  8. IAEA code and safety guides on quality assurance

    International Nuclear Information System (INIS)

    Raisic, N.

    1980-01-01

    In the framework of its programme in safety standards development, the IAEA has recently published a Code of Practice on Quality Assurance for Safety in Nuclear Power Plants. The Code establishes minimum requirements for quality assurance which Member States should use in the context of their own nuclear safety requirements. A series of 10 Safety Guides which describe acceptable methods of implementing the requirements of specific sections of the Code are in preparation. (orig.)

  9. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself from among the upper- and lower-level codes of the selected entry that were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data processing program was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.

  10. Local gene regulation details a recognition code within the LacI transcriptional factor family.

    Directory of Open Access Journals (Sweden)

    Francisco M Camas

    2010-11-01

    The specific binding of regulatory proteins to DNA sequences exhibits no clear patterns of association between amino acids (AAs) and nucleotides (NTs). This complexity of protein-DNA interactions raises the question of whether a simple set of wide-coverage recognition rules can ever be identified. Here, we analyzed this issue using the extensive LacI family of transcriptional factors (TFs). We searched for recognition patterns by introducing a new approach to phylogenetic footprinting, based on the pervasive presence of local regulation in prokaryotic transcriptional networks. We identified a set of specificity correlations--determined by two AAs of the TFs and two NTs in the binding sites--that is conserved throughout a dominant subgroup within the family regardless of the evolutionary distance, and that acts as a relatively consistent recognition code. The proposed rules are confirmed with data from previous experimental studies and by events of convergent evolution in the phylogenetic tree. The presence of a code emphasizes the stable structural context of the LacI family, while defining a precise blueprint to reprogram TF specificity with many practical applications.

  11. Development and application of the BOA code in Spain

    International Nuclear Information System (INIS)

    Tortuero Lopez, C.; Doncel Gutierrez, N.; Culebras, F.

    2012-01-01

    The BOA code makes it possible to quantitatively establish the level of risk of Axial Offset Anomaly and increased crud deposition on the basis of the specific conditions of each case. For this reason, the code is parameterized according to the individual characteristics of each plant. This paper summarizes the results obtained in the implementation of the code, as well as its future prospects.

  12. Using ion-selective electrode for determining iodine-131 preparation specific activity

    International Nuclear Information System (INIS)

    Melnik, M.I.; Nazirova, T.E.

    2002-01-01

    A pilot facility was developed in 2000 for the production of iodine-131. The parameters of the preparation are as follows: chemical form: sodium iodide solution (NaI-131) in a carbonate-bicarbonate buffer (or in 0.001M NaOH); specific activity: carrier free (> 5 Ci/mg); solution pH: 7-10; radionuclide purity: > 99.9%; radiochemical purity: > 97%; bulk activity: 0.15 Ci/ml. The experimental results of an investigation aimed at determining the specific activity of the I-131 preparation using an iodine-selective electrode are described. The method enables the analytical concentration of iodide ions in the carbonate-bicarbonate buffer (pH = 8-11) and NaOH solution (0.01 mol/l, pH = 8-11) to be determined. A micro-cell has been developed for the analysis of the I-131 solution, allowing the sample volume to be reduced to below 0.3 ml. The relative error of determination of the analytical concentration of iodide (10^-6 to 10^-1 mol/l) does not exceed 1%.
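
    A minimal Python sketch of the arithmetic behind such a measurement, with entirely hypothetical numbers: the electrode yields the analytical iodide concentration, and dividing the measured activity of the same sample by the corresponding iodine mass gives the specific activity.

    # Hypothetical worked example of a specific-activity calculation from an
    # electrochemically determined iodide concentration. The values below are
    # illustrative only and are not taken from the facility described above.
    MOLAR_MASS_I131 = 131.0  # g/mol (approximate)

    def specific_activity_ci_per_mg(activity_ci, iodide_mol_per_l, sample_volume_l):
        mass_mg = iodide_mol_per_l * sample_volume_l * MOLAR_MASS_I131 * 1000.0  # g -> mg
        return activity_ci / mass_mg

    # 0.045 Ci measured in a 0.3 ml sample whose iodide concentration is 1e-7 mol/l:
    print(specific_activity_ci_per_mg(0.045, 1e-7, 0.3e-3))  # ~1.1e4 Ci/mg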

  13. Automated searching for quantum subsystem codes

    International Nuclear Information System (INIS)

    Crosswhite, Gregory M.; Bacon, Dave

    2011-01-01

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.
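
    The algorithm above takes as input measurement operators that are tensor products of Pauli operators; a basic ingredient in working with such operators is deciding when two Pauli strings commute. The Python sketch below shows only that ingredient, not the optimization described in the abstract.

    # Two tensor products of Pauli operators commute exactly when they anticommute
    # on an even number of qubit positions (distinct non-identity single-qubit
    # Paulis anticommute). Strings are over the alphabet 'IXYZ'.
    def paulis_commute(a: str, b: str) -> bool:
        anticommuting_sites = sum(
            1 for pa, pb in zip(a, b)
            if pa != 'I' and pb != 'I' and pa != pb
        )
        return anticommuting_sites % 2 == 0

    print(paulis_commute("XZI", "ZXI"))  # True: they anticommute on two sites
    print(paulis_commute("XII", "ZII"))  # False: they anticommute on one site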

  14. Code of ethics for dental researchers.

    Science.gov (United States)

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined, and number 12, instead of the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful.

  15. Tissue-specific regulation of mouse MicroRNA genes in endoderm-derived tissues

    OpenAIRE

    Gao, Yan; Schug, Jonathan; McKenna, Lindsay B.; Le Lay, John; Kaestner, Klaus H.; Greenbaum, Linda E.

    2010-01-01

    MicroRNAs fine-tune the activity of hundreds of protein-coding genes. The identification of tissue-specific microRNAs and their promoters has been constrained by the limited sensitivity of prior microRNA quantification methods. Here, we determine the entire microRNAome of three endoderm-derived tissues, liver, jejunum and pancreas, using ultra-high throughput sequencing. Although many microRNA genes are expressed at comparable levels, 162 microRNAs exhibited striking tissue-specificity. After...

  16. The use of diagnostic coding in chiropractic practice

    DEFF Research Database (Denmark)

    Testern, Cecilie D; Hestbæk, Lise; French, Simon D

    2015-01-01

    BACKGROUND: Diagnostic coding has several potential benefits, including improving the feasibility of data collection for research and clinical audits and providing a common language to improve interdisciplinary collaboration. The primary aim of this study was to determine the views and perspectives......-2 PLUS) provided the 14 chiropractors with some experience in diagnostic coding, followed by an interview on the topic. The interviews were analysed thematically. The participating chiropractors and an independent coder applied ICPC-2 PLUS terms to the diagnoses of 10 patients. Then the level...... of agreement between the chiropractors and the coder was determined and Cohen's Kappa was used to determine the agreement beyond that expected by chance. RESULTS: From the interviews the three emerging themes were: 1) Advantages and disadvantages of using a clinical coding system in chiropractic practice, 2...

  17. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    Science.gov (United States)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  18. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in a temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  19. The intercomparison of aerosol codes

    International Nuclear Information System (INIS)

    Dunbar, I.H.; Fermandjian, J.; Gauvain, J.

    1988-01-01

    The behavior of aerosols in a reactor containment vessel following a severe accident could be an important determinant of the accident source term to the environment. Various processes result in the deposition of the aerosol onto surfaces within the containment, from where they are much less likely to be released. Some of these processes are very sensitive to particle size, so it is important to model the aerosol growth processes: agglomeration and condensation. A number of computer codes have been written to model growth and deposition processes. They have been tested against each other in a series of code comparison exercises. These exercises have investigated sensitivities to physical and numerical assumptions and have also proved a useful means of quality control for the codes. Various exercises in which code predictions are compared with experimental results are now under way

  20. Absorbed dose determination in external beam radiotherapy. An international code of practice for dosimetry based on standards of absorbed dose to water

    International Nuclear Information System (INIS)

    2000-01-01

    The International Atomic Energy Agency published in 1987 an International Code of Practice entitled 'Absorbed Dose Determination in Photon and Electron Beams' (IAEA Technical Reports Series No. 277 (TRS-277)), recommending procedures to obtain the absorbed dose in water from measurements made with an ionization chamber in external beam radiotherapy. A second edition of TRS-277 was published in 1997 updating the dosimetry of photon beams, mainly kilovoltage X rays. Another International Code of Practice for radiotherapy dosimetry entitled 'The Use of Plane-Parallel Ionization Chambers in High Energy Electron and Photon Beams' (IAEA Technical Reports Series No. 381 (TRS-381)) was published in 1997 to further update TRS-277 and complement it with respect to the area of parallel-plate ionization chambers. Both codes have proven extremely valuable for users involved in the dosimetry of the radiation beams used in radiotherapy. In TRS-277 the calibration of the ionization chambers was based on primary standards of air kerma; this procedure was also used in TRS-381, but the new trend of calibrating ionization chambers directly in a water phantom in terms of absorbed dose to water was introduced. The development of primary standards of absorbed dose to water for high energy photon and electron beams, and improvements in radiation dosimetry concepts, offer the possibility of reducing the uncertainty in the dosimetry of radiotherapy beams. The dosimetry of kilovoltage X rays, as well as that of proton and heavy ion beams, interest in which has grown considerably in recent years, can also be based on these standards. Thus a coherent dosimetry system based on standards of absorbed dose to water is possible for practically all radiotherapy beams. Many Primary Standard Dosimetry Laboratories (PSDLs) already provide calibrations in terms of absorbed dose to water at the radiation quality of 60 Co gamma rays. Some laboratories have extended calibrations to high energy photon and
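
    As a rough illustration of the dose-to-water formalism these standards enable, the Python sketch below multiplies a corrected chamber reading by a calibration coefficient in terms of absorbed dose to water and a beam quality correction factor. The symbols follow the usual N_D,w notation and the numbers are hypothetical, not values from any code of practice.

    # Hedged sketch of the absorbed-dose-to-water formalism: dose at the reference
    # point = corrected chamber reading (C) x N_D,w calibration coefficient (Gy/C)
    # x beam quality correction factor k_Q. Values are illustrative only.
    def absorbed_dose_to_water_gy(reading_c, n_dw_gy_per_c, k_q):
        return reading_c * n_dw_gy_per_c * k_q

    # e.g. a 20 nC corrected reading with a Farmer-chamber-sized N_D,w:
    print(absorbed_dose_to_water_gy(20.0e-9, 5.4e7, 0.99))  # ~1.07 Gy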

  1. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M.; Nguyen, L.T.; Saunier, J. [Commissariat a l'Energie Atomique, Centre d'Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)]

    1966-09-01

    This code handles the following problems: 1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; 2) analysis of the thermal and hydraulic behavior of water-cooled and moderated reactors with flat-plate fuel elements, at either high or low pressure, with boiling permitted. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions that may or may not vary with time; the power can either be coupled to a reactor kinetics calculation or supplied by the code user. The code includes a schematic representation of safety rod behavior and is a one-dimensional, multi-channel code; its complement, FLID, treats a single channel in two dimensions. (authors)

  2. Relationship between various pressure vessel and piping codes

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1976-01-01

    Section VIII of the ASME Code provides stress allowable values for material specifications that are provided in Section II Parts A and B. Since the adoption of the ASME Code over 60 years ago the incidence of failure has been greatly reduced. The Codes are currently based on strength criteria and advancements in the technology of fracture toughness and fracture mechanics should permit an even greater degree of reliability and safety. This lecture discusses the various Sections of the Code. It describes the basis for the establishment of design stress allowables and promotes the idea of the use of fracture mechanics

  3. Potential of the MCNP computer code

    International Nuclear Information System (INIS)

    Kyncl, J.

    1995-01-01

    The MCNP code is designed for numerical solution of neutron, photon, and electron transport problems by the Monte Carlo method. The code is based on the linear transport theory of the behavior of the differential flux of the particles. The code directly uses data from the cross-section point data library for input. Experience gained in the application of the code to the calculation of the effective parameters of fuel assemblies and of the entire reactor core, to the determination of the effective parameters of the elementary fuel cell, and to the numerical solution of neutron diffusion and/or transport problems of the fuel assembly is outlined. The agreement between the calculated and observed data gives evidence that the MCNP code can be used with advantage for calculations involving WWER-type fuel assemblies. (J.B.). 4 figs., 6 refs

  4. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the
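
    A minimal Python sketch of the last step in the pipeline outlined above, matching an individual's consumption to 131I concentrations in environmental media; the function, media names and all numerical values are hypothetical placeholders, not the CIDER/CiderF equations or Hanford-specific parameters.

    # Hypothetical sketch of an ingestion-dose roll-up over environmental media.
    # Real codes such as CIDER/CiderF use time-dependent concentrations and
    # age/lifestyle-specific intake and dose factors.
    def annual_ingestion_dose_sv(conc_bq_per_kg, intake_kg_per_year, dose_coeff_sv_per_bq):
        dose = 0.0
        for medium, conc in conc_bq_per_kg.items():
            dose += conc * intake_kg_per_year.get(medium, 0.0) * dose_coeff_sv_per_bq
        return dose

    example = annual_ingestion_dose_sv(
        conc_bq_per_kg={"milk": 12.0, "leafy_vegetables": 3.0},        # made-up values
        intake_kg_per_year={"milk": 200.0, "leafy_vegetables": 30.0},  # made-up values
        dose_coeff_sv_per_bq=2.2e-8,  # order-of-magnitude ingestion coefficient for 131I
    )
    print(f"{example:.2e} Sv")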

  5. Time-varying block codes for synchronisation errors: maximum a posteriori decoder and practical issues

    Directory of Open Access Journals (Sweden)

    Johann A. Briffa

    2014-06-01

    Full Text Available In this study, the authors consider time-varying block (TVB codes, which generalise a number of previous synchronisation error-correcting codes. They also consider various practical issues related to maximum a posteriori (MAP decoding of these codes. Specifically, they give an expression for the expected distribution of drift between transmitter and receiver because of synchronisation errors. They determine an appropriate choice for state space limits based on the drift probability distribution. In turn, they obtain an expression for the decoder complexity under given channel conditions in terms of the state space limits used. For a given state space, they also give a number of optimisations that reduce the algorithm complexity with no further loss of decoder performance. They also show how the MAP decoder can be used in the absence of known frame boundaries, and demonstrate that an appropriate choice of decoder parameters allows the decoder to approach the performance when frame boundaries are known, at the expense of some increase in complexity. Finally, they express some existing constructions as TVB codes, comparing performance with published results and showing that improved performance is possible by taking advantage of the flexibility of TVB codes.
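
    A simplified Python sketch of the kind of drift statistic described above, under the assumption that each transmitted symbol independently suffers an insertion with probability p_ins and a deletion with probability p_del; the state-space limits can then be set to a few standard deviations around the mean drift. The paper's exact channel model and limit rule may differ.

    import math

    # Simplified drift model: net drift after n symbols is a sum of i.i.d. steps
    # (insertion -> +1, deletion -> -1), so its mean and variance grow linearly in n.
    def drift_limits(n_symbols, p_ins, p_del, n_sigma=5):
        mean = n_symbols * (p_ins - p_del)
        var = n_symbols * (p_ins * (1 - p_ins) + p_del * (1 - p_del))
        half_width = n_sigma * math.sqrt(var)
        return mean - half_width, mean + half_width

    # Hypothetical channel: 1000 symbols, 1% insertion and deletion probabilities.
    print(drift_limits(1000, 0.01, 0.01))  # roughly (-22.2, 22.2)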

  6. The Role of Code-Switching in Bilingual Creativity

    Science.gov (United States)

    Kharkhurin, Anatoliy V.; Wei, Li

    2015-01-01

    This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…

  7. NADAC and MERGE: computer codes for processing neutron activation analysis data

    International Nuclear Information System (INIS)

    Heft, R.E.; Martin, W.E.

    1977-01-01

    Absolute disintegration rates of specific radioactive products induced by neutron irradiation of a sample are determined by spectrometric analysis of gamma-ray emissions. Nuclide identification and quantification are carried out by a complex computer code, GAMANAL (described elsewhere). The output of GAMANAL is processed by NADAC, a computer code that converts the data on observed disintegration rates to data on the elemental composition of the original sample. Computations by NADAC are on an absolute basis in that stored nuclear parameters are used rather than the difference between the observed disintegration rate and the rate obtained by concurrent irradiation of elemental standards. The NADAC code provides for the computation of complex cases including those involving interrupted irradiations, parent and daughter decay situations where the daughter may also be produced independently, nuclides with very short half-lives compared to the counting interval, and those involving interference by competing neutron-induced reactions. The NADAC output consists of a printed report, which summarizes analytical results, and a card-image file, which can be used as input to another computer code, MERGE. The purpose of MERGE is to combine the results of multiple analyses and produce a single final answer, based on all available information, for each element found.
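
    The Python sketch below shows the textbook single-reaction activation relation that an absolute method like NADAC's inverts: the observed disintegration rate of an activation product is tied to the mass of the parent element through stored nuclear parameters. Function names and the example values are illustrative; NADAC itself also treats interrupted irradiations, parent-daughter chains and interfering reactions.

    import math

    N_A = 6.022e23  # Avogadro's number

    # Invert A = N * phi * sigma * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay)
    # for the element mass, where N is the number of target atoms in the sample.
    def element_mass_g(activity_bq, flux_n_per_cm2_s, sigma_cm2, half_life_s,
                       t_irr_s, t_decay_s, atomic_mass_g_per_mol, isotopic_abundance):
        lam = math.log(2) / half_life_s
        saturation = 1.0 - math.exp(-lam * t_irr_s)
        decay = math.exp(-lam * t_decay_s)
        atoms_per_gram = N_A * isotopic_abundance / atomic_mass_g_per_mol
        return activity_bq / (atoms_per_gram * flux_n_per_cm2_s * sigma_cm2 * saturation * decay)

    # Illustrative case, roughly 23Na(n,gamma)24Na in a thermal flux of 1e13 n/cm^2/s:
    print(element_mass_g(activity_bq=5.0e4, flux_n_per_cm2_s=1e13, sigma_cm2=0.53e-24,
                         half_life_s=15.0 * 3600, t_irr_s=3600, t_decay_s=7200,
                         atomic_mass_g_per_mol=22.99, isotopic_abundance=1.0))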

  8. Hierarchy, determinism, and specificity in theories of development and evolution.

    Science.gov (United States)

    Deichmann, Ute

    2017-10-16

    The concepts of hierarchical organization, genetic determinism and biological specificity (for example of species, biologically relevant macromolecules, or genes) have played a crucial role in biology as a modern experimental science since its beginnings in the nineteenth century. The idea of genetic information (specificity) and genetic determination was at the basis of molecular biology that developed in the 1940s with macromolecules, viruses and prokaryotes as major objects of research often labelled "reductionist". However, the concepts have been marginalized or rejected in some of the research that in the late 1960s began to focus additionally on the molecularization of complex biological structures and functions using systems approaches. This paper challenges the view that 'molecular reductionism' has been successfully replaced by holism and a focus on the collective behaviour of cellular entities. It argues instead that there are more fertile replacements for molecular 'reductionism', in which genomics, embryology, biochemistry, and computer science intertwine and result in research that is as exact and causally predictive as earlier molecular biology.

  9. The ASME Code today -- Challenges, threats, opportunities

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1995-01-01

    Since its modest beginning as a single volume in 1914, the ASME Code, or some of its parts, is recognized today in 48 of the United States and all provinces of Canada. The ASME Code today is composed of 25 books including two Code Case books. These books cover the new construction of boilers and pressure vessels and the new construction and In-Service-Inspection of Nuclear Power Plant components. The ASME accredits all manufacturers of boilers and pressure vessels built to the ASME Code. There are approximately 7650 symbol stamps issued throughout the world. Over 23% of the symbol stamps have been issued outside the USA and Canada. The challenge to the ASME Code is to be accepted as the world standard for pressure boundary components. There are activities underway to achieve that goal. The ASME Code is being revised to make it a friendlier document for entities outside of North America. To achieve that end there are specific tasks underway which are described here.

  10. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    Science.gov (United States)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
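
    A small Python sketch of the link-persistence part of such a study: issue a HEAD request for each extracted URL and record the HTTP status (or the absence of a response). The URLs are placeholders; a real test would be repeated over time and would handle redirects and rate limits more carefully.

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    # Record the HTTP status for each URL, or None if the host is unreachable.
    def check_links(urls, timeout=10):
        results = {}
        for url in urls:
            try:
                req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
                with urlopen(req, timeout=timeout) as resp:
                    results[url] = resp.status
            except HTTPError as err:
                results[url] = err.code
            except URLError:
                results[url] = None
        return results

    # Hypothetical usage:
    # print(check_links(["https://example.org/some-code", "https://example.org/data"]))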

  11. Evaluation of Code Blue Implementation Outcomes

    Directory of Open Access Journals (Sweden)

    Bengü Özütürk

    2015-09-01

    Aim: In this study, we aimed to emphasize the importance of Code Blue implementation and to determine deficiencies in this regard. Methods: After obtaining ethics committee approval, Code Blue call data for 225 patients between 2012 and January 2014 were retrospectively analyzed. Age and gender of the patients, date and time of the call, the clinics giving Code Blue, the time needed for the Code Blue team to arrive, the rates of false Code Blue calls, reasons for Code Blue calls and patient outcomes were investigated. Results: A total of 225 patients (149 male, 76 female) were evaluated in the study. The mean age of the patients was 54.1 years. 142 (67.2%) Code Blue calls occurred after hours and from the emergency unit. The mean time for the Code Blue team to arrive was 1.10 minutes. Spontaneous circulation was restored in 137 patients (60.8%); 88 (39.1%) died. The most commonly identified possible causes were of cardiac origin. Conclusion: This study showed that Code Blue implementation by a professional team within an efficient and targeted time increases the survival rate. Therefore, we conclude that the application of Code Blue carried out by a trained team is an essential standard in hospitals. (The Medical Bulletin of Haseki 2015; 53:204-8)

  12. The MELTSPREAD Code for Modeling of Ex-Vessel Core Debris Spreading Behavior, Code Manual – Version3-beta

    Energy Technology Data Exchange (ETDEWEB)

    Farmer, M. T. [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2017-09-01

    MELTSPREAD3 is a transient one-dimensional computer code that has been developed to predict the gravity-driven flow and freezing behavior of molten reactor core materials (corium) in containment geometries. Predictions can be made for corium flowing across surfaces under either dry or wet cavity conditions. The spreading surfaces that can be selected are steel, concrete, a user-specified material (e.g., a ceramic), or an arbitrary combination thereof. The corium can have a wide range of compositions of reactor core materials that includes distinct oxide phases (predominantly Zr, and steel oxides) plus metallic phases (predominantly Zr and steel). The code requires input that describes the containment geometry, melt “pour” conditions, and cavity atmospheric conditions (i.e., pressure, temperature, and cavity flooding information). For cases in which the cavity contains a preexisting water layer at the time of RPV failure, melt jet breakup and particle bed formation can be calculated mechanistically given the time-dependent melt pour conditions (input data) as well as the heatup and boiloff of water in the melt impingement zone (calculated). For core debris impacting either the containment floor or previously spread material, the code calculates the transient hydrodynamics and heat transfer which determine the spreading and freezing behavior of the melt. The code predicts conditions at the end of the spreading stage, including melt relocation distance, depth and material composition profiles, substrate ablation profile, and wall heatup. Code output can be used as input to other models such as CORQUENCH that evaluate long term core-concrete interaction behavior following the transient spreading stage. MELTSPREAD3 was originally developed to investigate BWR Mark I liner vulnerability, but has been substantially upgraded and applied to other reactor designs (e.g., the EPR), and more recently to the plant accidents at Fukushima Daiichi. The most recent round of

  13. Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior.

    Science.gov (United States)

    Panzeri, Stefano; Harvey, Christopher D; Piasini, Eugenio; Latham, Peter E; Fellin, Tommaso

    2017-02-08

    The two basic processes underlying perceptual decisions-how neural responses encode stimuli, and how they inform behavioral choices-have mainly been studied separately. Thus, although many spatiotemporal features of neural population activity, or "neural codes," have been shown to carry sensory information, it is often unknown whether the brain uses these features for perception. To address this issue, we propose a new framework centered on redefining the neural code as the neural features that carry sensory information used by the animal to drive appropriate behavior; that is, the features that have an intersection between sensory and choice information. We show how this framework leads to a new statistical analysis of neural activity recorded during behavior that can identify such neural codes, and we discuss how to combine intersection-based analysis of neural recordings with intervention on neural activity to determine definitively whether specific neural activity features are involved in a task. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Consideration of the Construction Code for TBM-body in ASME BPVC

    International Nuclear Information System (INIS)

    Kim, Dongjun; Kim, Yunjae; Kim, Suk Kwon; Park, Sung Dae; Lee, Dong Won

    2016-01-01

    In this paper, the ASME code is briefly introduced, and the TBM-body is classified for selecting the appropriate ASME section. With the classification of the TBM-body, the appropriate section is determined. The Helium Cooled Ceramic Reflector (HCCR) Test Blanket System (TBS) has been designed by the KO TBM team to investigate the functions of a breeding blanket. These functions cover three subjects: 1) tritium breeding, 2) heat conversion and extraction, and 3) neutron and gamma-ray shielding. For the design process, it is necessary to select an appropriate construction code as the design criteria. The ITER Organization (IO) has proposed that RCC-MR, Edition 2007, shall be used for the TBM-shield, because the TBM-shield is connected to the vacuum boundary. For the other part of the TBM-set, the TBM-body, there is no constraint on the selected code, and the manufacturer can select the construction code to apply to design and fabrication. The KO TBM team has considered whether it is appropriate to choose the ASME code for the TBM-body. An advantage of choosing ASME is its suitability to the domestic situation: in domestic nuclear plants, the ASME or KEPIC code is used for regulatory requirements. Based on this, it is possible to prepare a domestic fusion plant regulatory framework. In this paper, the construction code for the TBM-body was determined within the ASME BPVC. For the determination of the code, the structure of the ASME BPVC was introduced and the classification of the TBM-body was conducted according to the ITER criteria. The operating conditions of the TBM-body, which include creep and irradiation effects, were also considered in determining the construction code.

  15. Consideration of the Construction Code for TBM-body in ASME BPVC

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dongjun; Kim, Yunjae [Korea Univ., Seoul (Korea, Republic of)]; Kim, Suk Kwon; Park, Sung Dae; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    In this paper, the ASME code is briefly introduced, and the TBM-body is classified for selecting the appropriate ASME section. With the classification of the TBM-body, the appropriate section is determined. The Helium Cooled Ceramic Reflector (HCCR) Test Blanket System (TBS) has been designed by the KO TBM team to investigate the functions of a breeding blanket. These functions cover three subjects: 1) tritium breeding, 2) heat conversion and extraction, and 3) neutron and gamma-ray shielding. For the design process, it is necessary to select an appropriate construction code as the design criteria. The ITER Organization (IO) has proposed that RCC-MR, Edition 2007, shall be used for the TBM-shield, because the TBM-shield is connected to the vacuum boundary. For the other part of the TBM-set, the TBM-body, there is no constraint on the selected code, and the manufacturer can select the construction code to apply to design and fabrication. The KO TBM team has considered whether it is appropriate to choose the ASME code for the TBM-body. An advantage of choosing ASME is its suitability to the domestic situation: in domestic nuclear plants, the ASME or KEPIC code is used for regulatory requirements. Based on this, it is possible to prepare a domestic fusion plant regulatory framework. In this paper, the construction code for the TBM-body was determined within the ASME BPVC. For the determination of the code, the structure of the ASME BPVC was introduced and the classification of the TBM-body was conducted according to the ITER criteria. The operating conditions of the TBM-body, which include creep and irradiation effects, were also considered in determining the construction code.

  16. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.

  17. Calculation code evaluating the confinement of a nuclear facility in case of fires

    International Nuclear Information System (INIS)

    Laborde, J.C.; Prevost, C.; Vendel, J.

    1995-01-01

    Accident events involving fire are quite frequent and could have a severe effect on the safety of nuclear facilities. As confinement must be maintained, the ventilation and filtration systems have to be designed to limit radioactive releases to the environment. To determine and analyse the consequences of a fire on contamination confinement, IPSN, COGEMA and SGN are participating in the development of a calculation code based on the introduction, into the SIMEVENT ventilation code, of various models associated with fire risk and mass transfer in ventilation networks. This calculation code results from the coupling of the SIMEVENT code with several models describing the temperature in a room resulting from a fire, the temperatures along the ventilation ducts, the contamination transfers throughout the ventilation equipment (ducts, dampers, valves, air cleaning systems) and the clogging of High Efficiency Particulate Air (HEPA) filters. This paper presents the current level of progress in the development of this calculation code. It describes, in particular, the empirical model used for the clogging of HEPA filters by aerosols derived from the combustion of standard materials used in the nuclear industry. It also describes the specific models used to take into account the mass transfers resulting from the basic mechanisms of aerosol physics. In addition, an assessment of this code is given using the example of a simple laboratory installation.

  18. Calculation code evaluating the confinement of a nuclear facility in case of fires

    Energy Technology Data Exchange (ETDEWEB)

    Laborde, J.C.; Prevost, C.; Vendel, J. [and others]

    1995-02-01

    Accident events involving fire are quite frequent and could have a severe effect on the safety of nuclear facilities. As confinement must be maintained, the ventilation and filtration systems have to be designed to limit radioactive releases to the environment. To determine and analyse the consequences of a fire on contamination confinement, IPSN, COGEMA and SGN are participating in the development of a calculation code based on the introduction, into the SIMEVENT ventilation code, of various models associated with fire risk and mass transfer in ventilation networks. This calculation code results from the coupling of the SIMEVENT code with several models describing the temperature in a room resulting from a fire, the temperatures along the ventilation ducts, the contamination transfers throughout the ventilation equipment (ducts, dampers, valves, air cleaning systems) and the clogging of High Efficiency Particulate Air (HEPA) filters. This paper presents the current level of progress in the development of this calculation code. It describes, in particular, the empirical model used for the clogging of HEPA filters by aerosols derived from the combustion of standard materials used in the nuclear industry. It also describes the specific models used to take into account the mass transfers resulting from the basic mechanisms of aerosol physics. In addition, an assessment of this code is given using the example of a simple laboratory installation.

  19. Essential Specification Elements for Heat Exchanger Replacement

    Energy Technology Data Exchange (ETDEWEB)

    Bower, L.

    2015-07-01

    Performance upgrade and equipment degradation are the primary impetuses for a nuclear power plant to engage in the large capital cost project of heat exchanger replacement. Along with attention to these issues, consideration of heat exchanger Codes and Standards, material improvements, thermal redesign, and configuration is essential for developing User's Design Specifications for successful replacement projects. The User's Design Specification is the central document in procuring ASME heat exchangers. Properly stated objectives for the heat exchanger replacement are essential for obtaining the materials, configurations and thermal designs best suited for the nuclear power plant. Additionally, the code of construction required and the applied manufacturing standard (TEMA or HEI) affect how the heat exchanger may be designed or configured to meet the replacement goals. Knowledge of how Codes and Standards affect design and configuration details will aid in writing the User's Design Specification. Joseph Oat Corporation has designed and fabricated many replacement heat exchangers for the nuclear power industry. These heat exchangers have been constructed per ASME Section III to various Code-Years or ASME Section VIII-1 to the current Code-Year, also in accordance with TEMA and HEI. They have ranged from like-for-like replacements to complete thermal, material and configuration redesigns. Several examples of these heat exchangers with their Code, Standard and specification implications are presented. (Author)

  20. ORNL probabilistic fracture-mechanics code OCA-P

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    1984-01-01

    The computer code OCA-P was developed at the request of the USNRC for the purpose of helping to evaluate the integrity of PWR pressure vessels during overcooling accidents (OCAs). The code can be used for both deterministic and probabilistic fracture-mechanics calculations, and consists essentially of OCA-II and a Monte Carlo routine similar to that developed by Strosnider et al. In the probabilistic mode, OCA-P generates a large number of vessels (10^6, more or less), each with a different combination of the various values of the different parameters involved in the analysis of flaw behavior. For each of these vessels a deterministic fracture-mechanics analysis is performed (calculation of K_I, K_Ic, K_Ia) to determine whether vessel failure takes place. The conditional probability of failure is simply the number of vessels that fail divided by the number of vessels generated. OCA-II is used for the deterministic analysis. Basic input to OCA-II includes, among other things, the primary-system pressure transient and the temperature transient for the coolant in the reactor-vessel downcomer. With this and additional information available, OCA-II performs a one-dimensional thermal analysis to obtain the temperature distribution in the wall as a function of time and then a one-dimensional linear-elastic stress analysis. OCA-P has been checked against similar codes and is presently being used in the Integrated Pressurized Thermal Shock Program for specific PWR plants.
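
    A minimal Python sketch of the Monte Carlo layer described above: generate many vessels with sampled fracture toughness, apply a deterministic failure check, and report failures divided by vessels generated. The toughness model and the K_I calculation in OCA-P are far more detailed; everything here is a placeholder.

    import random

    # Conditional probability of failure = (vessels that fail) / (vessels generated),
    # with a normally distributed initiation toughness as a stand-in sampling model.
    def conditional_failure_probability(n_vessels, applied_k_i, mean_k_ic, sd_k_ic, seed=0):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_vessels):
            sampled_k_ic = rng.gauss(mean_k_ic, sd_k_ic)
            if applied_k_i > sampled_k_ic:  # deterministic failure criterion
                failures += 1
        return failures / n_vessels

    # Hypothetical numbers, in MPa*sqrt(m):
    print(conditional_failure_probability(10**6, applied_k_i=60.0, mean_k_ic=80.0, sd_k_ic=12.0))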

  1. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of that 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and had inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries, have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is, thus, posed: Has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes.' And if so, how should this be accomplished?

  2. Determination of the physical parameters of the nuclear subcritical assembly Chicago 9000 of the IPN using the Serpent code

    International Nuclear Information System (INIS)

    Arriaga R, L.; Del Valle G, E.; Gomez T, A. M.

    2013-10-01

    A three-dimensional model of the nuclear subcritical assembly (SA) Chicago 9000 of the Escuela Superior de Fisica y Matematicas del Instituto Politecnico Nacional (ESFM-IPN) was developed for the Serpent code. The model includes: a) the core, formed by 312 aluminum pipes, each containing 5 nuclear fuel rods (natural uranium in metallic form); b) the multi-perforated plates into which the lower part of each pipe is inserted so that it remains vertical; c) the water, acting as moderator and reflector; and d) the vessel housing the core. The pipe arrangement is hexagonal, although the cross-section of the vessel housing the core is circular. The input file for the Serpent code was generated from the data on the composition and density of the fuel rods provided by the SA user manual, and from data obtained directly from the rods, such as the inner and outer diameter, mass and height. Of the physical parameters obtained, those closest to the values reported in the manual of the subcritical assembly are the effective multiplication factor and the reproduction factor η. The differences may arise because the description of the fuel rods provided in the SA user manual does not correspond to the rods that are physically in the SA core. This difference consists in the presence of a central circular channel of 1.245 cm diameter in each fuel rod; the fuel rods reported in the manual do not have that channel. Although the obtained results are encouraging, we want to continue improving the model by incorporating detectors, as defined by the Serpent code, which could determine the neutron flux at various points of interest, such as axially or radially aligned points, and compare these with the values obtained experimentally when a neutron source (Pu-Be) is introduced. In addition, the cross sections for each unit cell will be determined, so that

  3. Review of codes, standards, and regulations for natural gas locomotives.

    Science.gov (United States)

    2014-06-01

    This report identified, collected, and summarized relevant international codes, standards, and regulations with potential applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  4. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study...... includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes. Contents: 4.1 Introduction; 4.9 Codes from order domains; 4.10 One-point geometric Goppa codes; 4.11 Bibliographical Notes; References...

  5. Accuracy of Administrative Codes for Distinguishing Positive Pressure Ventilation from High-Flow Nasal Cannula.

    Science.gov (United States)

    Good, Ryan J; Leroue, Matthew K; Czaja, Angela S

    2018-06-07

    Noninvasive positive pressure ventilation (NIPPV) is increasingly used in critically ill pediatric patients, despite limited data on safety and efficacy. Administrative data may be a good resource for observational studies. Therefore, we sought to assess the performance of the International Classification of Diseases, Ninth Revision procedure code for NIPPV. Patients admitted to the PICU requiring NIPPV or heated high-flow nasal cannula (HHFNC) over the 11-month study period were identified from the Virtual PICU System database. The gold standard was manual review of the electronic health record to verify the use of NIPPV or HHFNC among the cohort. The presence or absence of a NIPPV procedure code was determined by using administrative data. Test characteristics with 95% confidence intervals (CIs) were generated, comparing administrative data with the gold standard. Among the cohort ( n = 562), the majority were younger than 5 years, and the most common primary diagnosis was bronchiolitis. Most (82%) required NIPPV, whereas 18% required only HHFNC. The NIPPV code had a sensitivity of 91.1% (95% CI: 88.2%-93.6%) and a specificity of 57.6% (95% CI: 47.2%-67.5%), with a positive likelihood ratio of 2.15 (95% CI: 1.70-2.71) and negative likelihood ratio of 0.15 (95% CI: 0.11-0.22). Among our critically ill pediatric cohort, NIPPV procedure codes had high sensitivity but only moderate specificity. On the basis of our study results, there is a risk of misclassification, specifically failure to identify children who require NIPPV, when using administrative data to study the use of NIPPV in this population. Copyright © 2018 by the American Academy of Pediatrics.
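
    For reference, the test characteristics quoted above follow from a standard 2x2 comparison against the chart-review gold standard; the Python sketch below computes them from counts chosen only to be roughly consistent with the reported percentages (they are not the study's raw data).

    # Sensitivity, specificity and likelihood ratios from a 2x2 table of
    # administrative-code results versus the gold standard.
    def test_characteristics(tp, fn, fp, tn):
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return {
            "sensitivity": sens,
            "specificity": spec,
            "LR+": sens / (1 - spec),
            "LR-": (1 - sens) / spec,
        }

    # Counts chosen only to approximate the percentages reported above.
    print(test_characteristics(tp=420, fn=41, fp=42, tn=57))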

  6. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

    Universitas Airlangga students often find it difficult to verify the marks that appear in the Kartu Hasil Studi (KHS, the Study Result Card) or the courses taken in the Kartu Rencana Studi (KRS, the Study Plan Card) if there are changes to the data in the system used by Universitas Airlangga. This complicated KRS and KHS verification process arises because the KRS and KHS documents owned by a student are easier to counterfeit than the data in the system. Implementing digital signature and QR Code technology is a solution that can prove the validity of a KRS or KHS. The KRS and KHS validation system was developed using digital signatures and QR Codes. A QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while a digital signature serves as a marker on the data to ensure that the data are the original data. The verification process is divided into two types: reading the digital signature, and verifying a printed document by scanning the data from the QR Code. The application of the system consists of adding the QR Code to the KRS and KHS, and it requires a readiness of human resources.
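
    A simplified Python sketch of the validation idea: attach a tag derived from the KRS/KHS record to the QR payload, and re-check it when the printed document is scanned. A real deployment would use an asymmetric digital signature (so only the university can sign) and a QR encoder/decoder library; HMAC is used below only to keep the sketch inside the standard library, and all field names are hypothetical.

    import hashlib
    import hmac
    import json

    SECRET_KEY = b"university-signing-key"  # placeholder; a real system would use a private key

    def make_qr_payload(record: dict) -> str:
        """Serialize the record and append an integrity tag; this string goes into the QR code."""
        body = json.dumps(record, sort_keys=True)
        tag = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
        return json.dumps({"record": record, "sig": tag})

    def verify_qr_payload(payload: str) -> bool:
        """Re-derive the tag from the scanned record and compare with the embedded one."""
        data = json.loads(payload)
        body = json.dumps(data["record"], sort_keys=True)
        expected = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, data["sig"])

    payload = make_qr_payload({"student_id": "0123456789", "course": "IF101", "grade": "A"})
    print(verify_qr_payload(payload))  # True for an unmodified record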

  7. Use of the algebraic coding theory in nuclear electronics

    International Nuclear Information System (INIS)

    Nikityuk, N.M.

    1990-01-01

    New results of studies of the development and use of the syndrome coding method in nuclear electronics are described. Two aspects of using the syndrome coding method are considered: for sequential coding devices and for the creation of fast parallel data compression devices. Specific examples of the creation of time-to-digital converters based on circular counters are described. Several time intervals can be coded very fast and with a high resolution by means of these converters. An effective coding matrix which can be used for light signal coding is presented, together with the rule for constructing such coding matrices for an arbitrary number of channels and multiplicity n. The methods for solving ambiguities in silicon detectors and for creating special-purpose processors for high-energy spectrometers are also given. 21 refs.; 9 figs.; 3 tabs
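
    A tiny Python illustration of the syndrome idea the abstract builds on, using the textbook (7,4) Hamming parity-check matrix rather than the converters described in the paper: the syndrome of a received word is H·x mod 2, and a nonzero syndrome flags (and, under a single-error assumption, locates) the error.

    # Parity-check matrix of the (7,4) Hamming code: column j is the binary
    # representation of j+1, so the syndrome of a single-bit error at position p
    # reads off p directly.
    H = [
        [1, 0, 1, 0, 1, 0, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1, 1, 1],
    ]

    def syndrome(word):
        return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

    received = [1, 0, 1, 1, 0, 1, 1]  # hypothetical received word
    print(syndrome(received))          # [1, 1, 1] -> position 7 under a single-error assumption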

  8. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.

    Science.gov (United States)

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2017-03-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of 0.79, and 0.70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that perform better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.

  9. Identification of ICD Codes Suggestive of Child Maltreatment

    Science.gov (United States)

    Schnitzer, Patricia G.; Slusher, Paula L.; Kruse, Robin L.; Tarleton, Molly M.

    2011-01-01

    Objective: In order to be reimbursed for the care they provide, hospitals in the United States are required to use a standard system to code all discharge diagnoses: the International Classification of Disease, 9th Revision, Clinical Modification (ICD-9). Although ICD-9 codes specific for child maltreatment exist, they do not identify all…

  10. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    Energy Technology Data Exchange (ETDEWEB)

    Cerjan, Charles J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shi, Xizeng [Read-Rite Corporation, Fremont, CA (United States)

    2017-11-09

    The specific goals of this project were to further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017) and to validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under that CRADA. TC-1561-98 was effective concurrently with the LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for the DADIMAG Version 2 executable code.

  11. Computational design, construction, and characterization of a set of specificity determining residues in protein-protein interactions.

    Science.gov (United States)

    Nagao, Chioko; Izako, Nozomi; Soga, Shinji; Khan, Samia Haseeb; Kawabata, Shigeki; Shirai, Hiroki; Mizuguchi, Kenji

    2012-10-01

    Proteins interact with different partners to perform different functions and it is important to elucidate the determinants of partner specificity in protein complex formation. Although methods for detecting specificity determining positions have been developed previously, direct experimental evidence for these amino acid residues is scarce, and the lack of information has prevented further computational studies. In this article, we constructed a dataset that is likely to exhibit specificity in protein complex formation, based on available crystal structures and several intuitive ideas about interaction profiles and functional subclasses. We then defined a "structure-based specificity determining position (sbSDP)" as a set of equivalent residues in a protein family showing a large variation in their interaction energy with different partners. We investigated sequence and structural features of sbSDPs and demonstrated that their amino acid propensities significantly differed from those of other interacting residues and that the importance of many of these residues for determining specificity had been verified experimentally. Copyright © 2012 Wiley Periodicals, Inc.

  12. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  13. International Code Assessment and Applications Program: Summary of code assessment studies concerning RELAP5/MOD2, RELAP5/MOD3, and TRAC-B

    International Nuclear Information System (INIS)

    Schultz, R.R.

    1993-12-01

    Members of the International Code Assessment Program (ICAP) have assessed the US Nuclear Regulatory Commission (USNRC) advanced thermal-hydraulic codes over the past few years in a concerted effort to identify deficiencies, to define user guidelines, and to determine the state of each code. The results of sixty-two code assessment reviews, conducted at INEL, are summarized. Code deficiencies are discussed and user recommended nodalizations investigated during the course of conducting the assessment studies and reviews are listed. All the work that is summarized was done using the RELAP5/MOD2, RELAP5/MOD3, and TRAC-B codes

  14. RADTRAN: a computer code to analyze transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.

    1977-04-01

    A computer code is presented which predicts the environmental impact of any specific scheme of radioactive material transportation. Results are presented in terms of annual latent cancer fatalities and annual early fatality probability resulting from exposure during normal transportation or transport accidents. The code is developed in a generalized format to permit wide application, including normal transportation analysis, consideration of alternatives, and detailed consideration of specific sectors of industry

  15. Cyborg lectins: novel leguminous lectins with unique specificities.

    Science.gov (United States)

    Yamamoto, K; Maruyama, I N; Osawa, T

    2000-01-01

    Bauhinia purpurea lectin (BPA) is one of the beta-galactose-binding leguminous lectins. Leguminous lectins contain a long metal-binding loop, part of which determines their carbohydrate-binding specificities. Random mutations were introduced into a portion of the cDNA coding for BPA that corresponds to the carbohydrate-binding loop of the lectin. A library of the mutant lectins expressed on the surface of lambda foo phages was screened by the panning method. Several phage clones with an affinity for mannose or N-acetylglucosamine were isolated. These results indicate the possibility of making artificial lectins (so-called "cyborg lectins") with distinct and desired carbohydrate-binding specificities.

  16. Reliability of cause of death coding: an international comparison.

    Science.gov (United States)

    Antini, Carmen; Rajs, Danuta; Muñoz-Quezada, María Teresa; Mondaca, Boris Andrés Lucero; Heiss, Gerardo

    2015-07-01

    This study evaluates the agreement of nosologic coding of cardiovascular causes of death between a Chilean coder and one in the United States, in a stratified random sample of death certificates of persons aged ≥ 60, issued in 2008 in the Valparaíso and Metropolitan regions, Chile. All causes of death were converted to ICD-10 codes in parallel by both coders. Concordance was analyzed with inter-coder agreement and Cohen's kappa coefficient by level of specification of the ICD-10 code, for the underlying cause and for all listed causes of death. Inter-coder agreement was 76.4% for all causes of death and 80.6% for the underlying cause (agreement at the four-digit level), with differences by the level of specification of the ICD-10 code, by line of the death certificate, and by number of causes of death per certificate. Cohen's kappa coefficient was 0.76 (95%CI: 0.68-0.84) for the underlying cause and 0.75 (95%CI: 0.74-0.77) for the total causes of death. In conclusion, cause-of-death coding and inter-coder agreement for cardiovascular diseases in two regions of Chile are comparable to an external benchmark and to reports from other countries.
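
    Both statistics reported above, raw inter-coder agreement and Cohen's kappa, can be computed in a few lines; the sketch below uses scikit-learn's cohen_kappa_score on a handful of made-up ICD-10 assignments rather than the study's certificates.

```python
# Agreement statistics for two coders on the same certificates (toy data).
from sklearn.metrics import cohen_kappa_score

coder_cl = ["I21.0", "I50.9", "I10", "I25.1", "I63.9", "I21.0", "I50.9", "I10"]
coder_us = ["I21.0", "I50.9", "I10", "I21.4", "I63.9", "I21.0", "I50.0", "I10"]

agreement = sum(a == b for a, b in zip(coder_cl, coder_us)) / len(coder_cl)
kappa = cohen_kappa_score(coder_cl, coder_us)
print(f"inter-coder agreement: {agreement:.1%}, Cohen's kappa: {kappa:.2f}")
```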

  17. Binary codes with impulse autocorrelation functions for dynamic experiments

    International Nuclear Information System (INIS)

    Corran, E.R.; Cummins, J.D.

    1962-09-01

    A series of binary codes exist which have autocorrelation functions approximating to an impulse function. Signals whose behaviour in time can be expressed by such codes have spectra which are 'whiter' over a limited bandwidth and for a finite time than signals from a white noise generator. These codes are used to determine system dynamic responses using the correlation technique. Programmes have been written to compute codes of arbitrary length and to compute 'cyclic' autocorrelation and cross-correlation functions. Complete listings of these programmes are given, and a code of 1019 bits is presented. (author)
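
    As an illustration of the property described above (and not the 1019-bit code of the report), the sketch below generates a 127-bit maximal-length sequence from a 7-stage linear-feedback shift register and computes its cyclic autocorrelation, which is 127 at zero lag and -1 at every other lag, i.e. approximately an impulse.

```python
# Cyclic autocorrelation of a 127-bit m-sequence (illustrative example).
import numpy as np

def m_sequence(n_stages=7, taps=(7, 6), length=127):
    """Fibonacci LFSR output; taps (7, 6) give a maximal-length sequence."""
    state = [1] * n_stages
    out = []
    for _ in range(length):
        out.append(state[-1])
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [feedback] + state[:-1]
    return np.array(out)

seq = 1 - 2 * m_sequence()                        # map {0, 1} -> {+1, -1}
acf = np.array([np.dot(seq, np.roll(seq, k)) for k in range(len(seq))])
print(acf[:6])                                    # [127 -1 -1 -1 -1 -1]
```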

  18. Locally decodable codes and private information retrieval schemes

    CERN Document Server

    Yekhanin, Sergey

    2010-01-01

    Locally decodable codes (LDCs) are codes that simultaneously provide efficient random access retrieval and high noise resilience by allowing reliable reconstruction of an arbitrary bit of a message by looking at only a small number of randomly chosen codeword bits. Local decodability comes with a certain loss in terms of efficiency - specifically, locally decodable codes require longer codeword lengths than their classical counterparts. Private information retrieval (PIR) schemes are cryptographic protocols designed to safeguard the privacy of database users. They allow clients to retrieve rec

  19. Evaluation of clinical coding data to determine causes of critical bleeding in patients receiving massive transfusion: a bi-national, multicentre, cross-sectional study.

    Science.gov (United States)

    McQuilten, Z K; Zatta, A J; Andrianopoulos, N; Aoki, N; Stevenson, L; Badami, K G; Bird, R; Cole-Sinclair, M F; Hurn, C; Cameron, P A; Isbister, J P; Phillips, L E; Wood, E M

    2017-04-01

    To evaluate the use of routinely collected data to determine the cause(s) of critical bleeding in patients who receive massive transfusion (MT). Routinely collected data are increasingly being used to describe and evaluate transfusion practice. Chart reviews were undertaken on 10 randomly selected MT patients at 48 hospitals across Australia and New Zealand to determine the cause(s) of critical bleeding. Diagnosis-related group (DRG) and International Classification of Diseases (ICD) codes were extracted separately and used to assign each patient a cause of critical bleeding. These were compared against chart review using percentage agreement and kappa statistics. A total of 427 MT patients were included with complete ICD and DRG data for 427 (100%) and 396 (93%), respectively. Good overall agreement was found between chart review and ICD codes (78·3%; κ = 0·74, 95% CI 0·70-0·79) and only fair overall agreement with DRG (51%; κ = 0·45, 95% CI 0·40-0·50). Both ICD and DRG were sensitive and accurate for classifying obstetric haemorrhage patients (98% sensitivity and κ > 0·94). However, compared with the ICD algorithm, DRGs were less sensitive and accurate in classifying bleeding as a result of gastrointestinal haemorrhage (74% vs 8%; κ = 0·75 vs 0·1), trauma (92% vs 62%; κ = 0·78 vs 0·67), cardiac (80% vs 57%; κ = 0·79 vs 0·60) and vascular surgery (64% vs 56%; κ = 0·69 vs 0·65). Algorithms using ICD codes can determine the cause of critical bleeding in patients requiring MT with good to excellent agreement with clinical history. DRG are less suitable to determine critical bleeding causes. © 2016 British Blood Transfusion Society.

  20. Coupling of 3D neutronics models with the system code ATHLET

    International Nuclear Information System (INIS)

    Langenbuch, S.; Velkov, K.

    1999-01-01

    The system code ATHLET for plant transient and accident analysis has been coupled with 3D neutronics models, like QUABOX/CUBBOX, for the realistic evaluation of some specific safety problems under discussion. The considerations for the coupling approach and its realization are discussed. The specific features of the coupled code system established are explained and experience from first applications is presented. (author)

  1. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Science.gov (United States)

    2010-10-01

    42 CFR 405.512, Public Health, Determining Reasonable Charges, § 405.512 Carriers' procedural terminology and coding systems (2010-10-01 edition). (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  2. Remote-Handled Transuranic Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2006-12-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR
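
    The content-code structure spelled out above (two-letter site designation, a three-digit form code beginning with "3", and an optional alpha trailer) is simple enough to parse mechanically; the sketch below is illustrative only, with the site and waste-form tables reduced to the single examples quoted in the text.

```python
# Toy parser for the RH-TRU content-code format described above.
import re

SITES = {"ID": "Idaho National Laboratory (INL)"}         # Table 1 (excerpt only)
WASTE_FORMS = {"317": "TRU Metal Waste"}                  # Table 2 (excerpt only)

CODE_RE = re.compile(r"^([A-Z]{2})\s*(3\d{2})([A-Z]?)$")  # leading "3" = RH-TRU 72-B

def parse_content_code(code):
    match = CODE_RE.match(code.strip().upper())
    if not match:
        raise ValueError(f"not a recognisable RH-TRU content code: {code!r}")
    site, form, trailer = match.groups()
    return {
        "site": SITES.get(site, site),
        "waste_form": WASTE_FORMS.get(form, form),
        "subcode": trailer or None,       # e.g. distinguishes packaging variants
    }

print(parse_content_code("ID 322A"))
print(parse_content_code("ID317"))
```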

  3. Specific determination of clinical and toxicological important substances in biological samples by LC-MS

    International Nuclear Information System (INIS)

    Mitulovic, G.

    2001-02-01

    The topic of this dissertation is the specific determination of clinically and toxicologically important substances in biological samples by LC-MS. Nicotine was determined in serum after application of nicotine patches and nicotine nasal spray by HPLC-ESI-MS. Cotinine was determined directly in urine by HPLC-ESI-MS. Short-time anesthetics were determined in blood, and cytostatics were determined in liquor (cerebrospinal fluid), by HPLC-ESI-MS. (botek)

  4. The Determination of Neutron-Induced Reaction Cross Section Data on Even-Even, Magic- Number Nuclide Chromium 52 Using EXIFON Code

    International Nuclear Information System (INIS)

    Jonah, S.A.

    2013-01-01

    The EXIFON code, version 2.0, is a calculational tool based on both many-body theory and random matrix physics. In this work it has been used to calculate neutron-induced reaction cross-section data from 0 to 20 MeV for the even-even, magic-number nuclide ⁵²Cr (neutron number N = 28). Specifically, the (n,p), (n,α) and (n,2n) reaction cross sections were calculated as functions of the incident neutron energy. Experimental data from the IAEA EXFOR library and recommended data from the evaluated data libraries JENDL, ENDF and JEFF were used to validate the calculated data. The calculated cross sections without shell corrections are in good agreement with the experimental data as well as with the recommended data from the evaluated libraries. The results could provide useful insight into the choice of some input parameters near closed shells when using the EXIFON code.

  5. Radionuclide daughter inventory generator code: DIG

    International Nuclear Information System (INIS)

    Fields, D.E.; Sharp, R.D.

    1985-09-01

    The Daughter Inventory Generator (DIG) code accepts a tabulation of the radionuclides initially present in a waste stream, specified either by mass or by activity, and produces a tabulation of the radionuclides present after a user-specified elapsed time. This resultant radionuclide inventory characterizes wastes that have undergone daughter ingrowth during subsequent processes, such as leaching and transport, and includes daughter radionuclides that should be considered in these subsequent processes or for inclusion in a pollutant source term. Output of the DIG code also summarizes radionuclide decay constants. The DIG code was developed specifically to assist the user of the PRESTO-II methodology and code in preparing data sets and accounting for possible daughter ingrowth in wastes buried in shallow-land disposal areas. The DIG code is also useful in preparing data sets for the PRESTO-EPA code. Daughter ingrowth in buried radionuclides and in radionuclides that have been leached from the wastes and are undergoing hydrologic transport is considered, and the quantities of daughter radionuclides are calculated. Radionuclide decay constants generated by DIG and included in the DIG output are required in the PRESTO-II code input data set. The DIG code accesses some subroutines written for use with the CRRIS system and accesses files containing radionuclide data compiled by D.C. Kocher. 11 refs
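
    For a single parent-daughter pair, the ingrowth that DIG tabulates reduces to the Bateman solution; the sketch below shows that arithmetic for a hypothetical Sr-90 to Y-90 chain with approximate half-lives, and is not taken from the DIG code itself.

```python
# Two-member Bateman chain: parent -> daughter, no daughter initially present.
import math

def ingrowth(n_parent0, half_life_parent, half_life_daughter, t):
    """Return (parent, daughter) inventories after time t (units of the half-lives)."""
    lam_p = math.log(2) / half_life_parent
    lam_d = math.log(2) / half_life_daughter
    parent = n_parent0 * math.exp(-lam_p * t)
    daughter = (n_parent0 * lam_p / (lam_d - lam_p)
                * (math.exp(-lam_p * t) - math.exp(-lam_d * t)))
    return parent, daughter

# Example: Sr-90 (t1/2 ~ 28.8 y) feeding Y-90 (t1/2 ~ 64 h ~ 0.0073 y), one year.
print(ingrowth(1.0e20, 28.8, 0.0073, t=1.0))
```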

  6. New MDS or near MDS self-dual codes over finite fields

    OpenAIRE

    Tong, Hongxi; Wang, Xiaoqing

    2016-01-01

    The study of MDS self-dual codes has attracted much attention in recent years. There are many papers on determining the existence of $q$-ary MDS self-dual codes for various lengths, but $q$-ary MDS self-dual codes do not exist for some lengths, even lengths $< q$. We generalize MDS Euclidean self-dual codes to near-MDS Euclidean self-dual codes and near-MDS isodual codes. We obtain many new near-MDS isodual codes from extended negacyclic duadic codes and we obtain many new M...

  7. The determination of specific surface of sodium polyuranates

    International Nuclear Information System (INIS)

    Bilgin, B.; Atun, G.

    2002-01-01

    Three different sodium polyuranates were prepared by titration of uranyl nitrate with a sodium hydroxide solution labelled with ²²Na as the radiotracer. Polyuranates with the compositions *Na₂O·7.5UO₃·11H₂O (sample A), *Na₂O·4.3UO₃·4.7H₂O (sample B) and *Na₂O·2UO₃·4H₂O (sample C) were precipitated at pH 5.6, 8.5 and 11.2, respectively. The specific surface areas of these samples were determined by the BET method using methylene blue (MB) as the adsorbate. The sodium polyuranate surfaces were saturated by sequential adsorption of MB. The adsorption data gave an S-shaped isotherm and were fitted to the BET equation. The specific surface areas calculated from the BET isotherm decreased in the order A > B > C. The isotope and ion exchange reactions between the sodium polyuranates and Li⁺, Na⁺, K⁺, Rb⁺, Cs⁺, Ca²⁺, Sr²⁺ and Ba²⁺ ions were compared before and after MB coverage. The results showed that the isotope and ion exchange fractions decrease on the covered surfaces, indicating particle-diffusion-dominated exchange reactions
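
    The surface-area step rests on the linearised BET equation; the sketch below fits synthetic methylene-blue uptake data and converts the resulting monolayer capacity into a specific surface area. The uptake values and the assumed MB cross-sectional area (~1.3 nm² per molecule) are placeholders, not data from the study.

```python
# Linearised BET fit on synthetic MB adsorption data (illustrative values only).
import numpy as np

N_A = 6.022e23        # Avogadro's number, 1/mol
A_MB = 1.3e-18        # assumed area covered by one adsorbed MB molecule, m^2

x = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])      # relative concentration C/Cs
q = np.array([2.1e-5, 3.4e-5, 4.3e-5, 5.1e-5, 5.9e-5, 6.7e-5])  # uptake, mol/g

# BET in linear form: x / (q * (1 - x)) = (c - 1)/(qm * c) * x + 1/(qm * c)
y = x / (q * (1.0 - x))
slope, intercept = np.polyfit(x, y, 1)
qm = 1.0 / (slope + intercept)            # monolayer capacity, mol/g
surface_area = qm * N_A * A_MB            # specific surface, m^2/g
print(f"q_m = {qm:.2e} mol/g, S_BET = {surface_area:.1f} m^2/g")
```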

  8. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  9. Determination of specific activity of americium and plutonium in selected environmental samples

    International Nuclear Information System (INIS)

    Trebunova, T.

    1999-01-01

    The aim of this work was the development of a method for the determination of americium and plutonium in environmental samples. The developed method was evaluated on soil samples and then applied to selected fish samples (smoked mackerel, herring and fillet of Alaska hake). The separation of americium is based on liquid extraction with Aliquat-336, precipitation with oxalic acid and use of the chromatographic material TRU-Spec™. Radiochemical yields ranged from 13.0% to 80.9% for plutonium-236 and from 10.5% to 100% for americium-241. The determined specific activities of plutonium-239,240 ranged from (2.3 ± 1.4) mBq/kg to (82 ± 29) mBq/kg, those of plutonium-238 from (14.2 ± 3.7) mBq/kg to (708 ± 86) mBq/kg, and those of americium-241 from (1.4 ± 0.9) mBq/kg to (3360 ± 210) mBq/kg. Fish from the Baltic Sea and the North Sea showed higher specific activities than freshwater fish from Slovakia. Monitoring of alpha-emitting radionuclides in foods imported from territories affected by nuclear testing is therefore recommended

  10. The Polerovirus Minor Capsid Protein Determines Vector Specificity and Intestinal Tropism in the Aphid

    Science.gov (United States)

    Brault, Véronique; Périgon, Sophie; Reinbold, Catherine; Erdinger, Monique; Scheidecker, Danièle; Herrbach, Etienne; Richards, Ken; Ziegler-Graff, Véronique

    2005-01-01

    Aphid transmission of poleroviruses is highly specific, but the viral determinants governing this specificity are unknown. We used a gene exchange strategy between two poleroviruses with different vectors, Beet western yellows virus (BWYV) and Cucurbit aphid-borne yellows virus (CABYV), to analyze the role of the major and minor capsid proteins in vector specificity. Virus recombinants obtained by exchanging the sequence of the readthrough domain (RTD) between the two viruses replicated in plant protoplasts and in whole plants. The hybrid readthrough protein of chimeric viruses was incorporated into virions. Aphid transmission experiments using infected plants or purified virions revealed that vector specificity is driven by the nature of the RTD. BWYV and CABYV have specific intestinal sites in the vectors for endocytosis: the midgut for BWYV and both midgut and hindgut for CABYV. Localization of hybrid virions in aphids by transmission electron microscopy revealed that gut tropism is also determined by the viral origin of the RTD. PMID:16014930

  11. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes. For example, SETS and FTAP are used to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. There have been problems when these codes are run in sequence and their inputs and outputs are not linked, which could result in errors when preparing input for each code. The code MODULE was developed to carry out the above calculations simultaneously without linking inputs and outputs to other codes. MODULE can also prepare input for SETS for the case of a large fault tree that cannot be handled by MODULE. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples are selected and the results and computation times are compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater system (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over other codes and that it can be used on personal computers

  12. Accuracy Of The Place Of Crimes Determination As A Factor Of The Structure Of Special Part Of The Criminal Code Of The Russian Federation Preservation

    Directory of Open Access Journals (Sweden)

    Inga V. Pantyuxina

    2014-06-01

    Full Text Available This article raises the problem of non-observance of the logical and legal means, techniques and rules for placing individual crime definitions within the structure of criminal law, which breaks the systematic character of the Special Part of the Criminal Code of the Russian Federation. The crimes regulated in Articles 138.1, 214, 226.1, 240 and 242.2 of the Criminal Code of the Russian Federation are researched and analyzed within the structure of its Special Part. Analysing their objects and a number of features of the objective element, the authors focus attention on a shift of priorities in these crime definitions from the regulated public relations towards less significant ones, which results in misplacement of the corresponding criminal-law material within the Special Part. Because the objects of the crimes regulated by Articles 138.1 and 242.2 do not correspond to the generic and specific objects of the groups of crimes in which they are located, it is proposed to move them to other groups of crimes. The analysis of the crime definitions in Articles 214, 226.1 and 240 leads to the conclusion that it is reasonable to single out independent crime definitions and to assign them to legal niches of the Criminal Code other than those in which they are currently located.

  13. A Method for Determining Optimal Residential Energy Efficiency Packages

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gestwick, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bianchi, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Anderson, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Judkoff, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2011-04-01

    This report describes an analysis method for determining optimal residential energy efficiency retrofit packages and, as an illustrative example, applies the analysis method to a 1960s-era home in eight U.S. cities covering a range of International Energy Conservation Code (IECC) climate regions. The method uses an optimization scheme that considers average energy use (determined from building energy simulations) and equivalent annual cost to recommend optimal retrofit packages specific to the building, occupants, and location.
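
    One way to picture the package-selection step is as a search for combinations of measures that are non-dominated in annualised cost versus annual energy use; the sketch below enumerates combinations of invented measures and keeps the Pareto-optimal packages. The measure names, costs, and savings are placeholders, not values from the report, and the real method evaluates energy use with building simulations rather than fixed per-measure savings.

```python
# Toy cost-versus-energy Pareto search over retrofit packages (invented data).
from itertools import combinations

BASELINE_KWH = 25_000
MEASURES = {                     # name: (annualised cost $/yr, annual kWh saved)
    "attic insulation": (120, 1800),
    "air sealing": (60, 900),
    "heat pump": (650, 6500),
    "low-e windows": (480, 2100),
}

packages = []
for r in range(len(MEASURES) + 1):
    for combo in combinations(MEASURES, r):
        cost = sum(MEASURES[m][0] for m in combo)
        energy = BASELINE_KWH - sum(MEASURES[m][1] for m in combo)
        packages.append((cost, energy, combo))

# Keep packages not dominated by any cheaper-and-lower-energy alternative.
pareto = [p for p in packages
          if not any((q[0] <= p[0] and q[1] < p[1]) or
                     (q[0] < p[0] and q[1] <= p[1]) for q in packages)]
for cost, energy, combo in sorted(pareto):
    print(f"${cost:>4}/yr  {energy:>6} kWh/yr  {', '.join(combo) or 'no measures'}")
```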

  14. Through analysis of LOFT L2-3 by THYDE-P code

    International Nuclear Information System (INIS)

    Hirano, Masashi

    1981-10-01

    A through calculation of Experiment L2-3 of the Loss-of-Fluid Test (LOFT) Facility Power Ascension Series (Experiment Series L2) was performed with the THYDE-P code. The specific objectives of Experiment L2-3 were to determine the thermal-hydraulic behavior of the nuclear core and the thermal-mechanical response of the fuel rod cladding with a maximum linear heat generation rate of 39.4 kW/m. The THYDE-P code is a computer code to analyze both the blowdown and refill-reflood phases of loss-of-coolant accidents (LOCAs) of pressurized water reactors (PWRs) without a change in the methods and the models and is now under verification study and modification. The present calculation was performed by best estimate (BE) options as Sample Calculation Run 40, which is a portion of a series of THYDE-P sample calculations. The calculation was carried out from test initiation until complete submersion of the core volume with subcooled water, i.e. about 60 sec. The trend of the calculated cladding surface temperature was in good agreement with that of the experimental results. (author)

  15. An introduction to using QR codes in scholarly journals

    Directory of Open Access Journals (Sweden)

    Jae Hwa Chang

    2014-08-01

    Full Text Available The Quick Response (QR) code was first developed in 1994 by Denso Wave Incorporated, Japan. From that point on, it came into general use as an identification mark for all kinds of commercial products, advertisements, and other public announcements. In scholarly journals, the QR code is used to provide immediate direction to the journal homepage or to specific content such as figures or videos. Producing a QR code and printing it in the print version or uploading it to the web is very simple. Using a QR code-producing program, an editor enters simple information such as a web address, and a QR code is produced. A QR code is very stable, such that it can be used for a long time without loss of quality. Producing and adding QR codes to a journal costs nothing; therefore, to increase the visibility of their journals, it is time for editors to add QR codes to their journals.

  16. Neuronal codes for visual perception and memory.

    Science.gov (United States)

    Quian Quiroga, Rodrigo

    2016-03-01

    In this review, I describe and contrast the representation of stimuli in visual cortical areas and in the medial temporal lobe (MTL). While cortex is characterized by a distributed and implicit coding that is optimal for recognition and storage of semantic information, the MTL shows a much sparser and explicit coding of specific concepts that is ideal for episodic memory. I will describe the main characteristics of the coding in the MTL by the so-called concept cells and will then propose a model of the formation and recall of episodic memory based on partially overlapping assemblies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Electrical, instrumentation, and control codes and standards

    International Nuclear Information System (INIS)

    Kranning, A.N.

    1978-01-01

    During recent years numerous documents in the form of codes and standards have been developed and published to provide design, fabrication and construction rules and criteria applicable to instrumentation, control and power distribution facilities for nuclear power plants. The contents of this LTR were prepared by NUS Corporation under Subcontract K5108 and provide a consolidated index and listing of the documents selected for their application to procurement of materials and design of modifications and new construction at the LOFT facility. These codes and standards should be applied together with the National Electrical Code, the ID Engineering Standards and LOFT Specifications to all LOFT instrument and electrical design activities

  18. Statistical mechanics of low-density parity-check codes

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2004-02-13

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  19. Entanglement-assisted quantum low-density parity-check codes

    International Nuclear Information System (INIS)

    Fujiwara, Yuichiro; Clark, David; Tonchev, Vladimir D.; Vandendriessche, Peter; De Boeck, Maarten

    2010-01-01

    This article develops a general method for constructing entanglement-assisted quantum low-density parity-check (LDPC) codes, which is based on combinatorial design theory. Explicit constructions are given for entanglement-assisted quantum error-correcting codes with many desirable properties. These properties include the requirement of only one initial entanglement bit, high error-correction performance, high rates, and low decoding complexity. The proposed method produces several infinite families of codes with a wide variety of parameters and entanglement requirements. Our framework encompasses the previously known entanglement-assisted quantum LDPC codes having the best error-correction performance and many other codes with better block error rates in simulations over the depolarizing channel. We also determine important parameters of several well-known classes of quantum and classical LDPC codes for previously unsettled cases.

  20. Statistical mechanics of low-density parity-check codes

    International Nuclear Information System (INIS)

    Kabashima, Yoshiyuki; Saad, David

    2004-01-01

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  1. Non-Coding RNAs in Hodgkin Lymphoma

    Directory of Open Access Journals (Sweden)

    Anna Cordeiro

    2017-05-01

    Full Text Available MicroRNAs (miRNAs), small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of their target genes, can act as oncogenes or tumor suppressors. Recently, other types of non-coding RNAs, piwiRNAs and long non-coding RNAs, have also been identified. Hodgkin lymphoma (HL) is a disease of B-cell origin characterized by the presence of only 1% of tumor cells, known as Hodgkin and Reed-Sternberg (HRS) cells, which interact with the microenvironment to evade apoptosis. Several studies have reported specific miRNA signatures that can differentiate HL lymph nodes from reactive lymph nodes, identify histologic groups within classical HL, and distinguish HRS cells from germinal center B cells. Moreover, some signatures are associated with survival or response to chemotherapy. Most of the miRNAs in the signatures regulate genes related to apoptosis, cell cycle arrest, or signaling pathways. Here we review findings on miRNAs in HL, as well as on other non-coding RNAs.

  2. Cerebellum-specific and age-dependent expression of an endogenous retrovirus with intact coding potential

    Directory of Open Access Journals (Sweden)

    Itoh Takayuki

    2011-10-01

    Full Text Available Abstract Background: Endogenous retroviruses (ERVs), including murine leukemia virus (MuLV) type ERVs (MuLV-ERVs), are presumed to occupy ~10% of the mouse genome. In this study, following the identification of a full-length MuLV-ERV by an in silico survey of the C57BL/6J mouse genome, its distribution in different mouse strains and its expression characteristics were investigated. Results: Application of a set of ERV mining protocols identified a MuLV-ERV locus with full coding potential on chromosome 8 (named ERVmch8). ERVmch8 appears to share the same genomic locus with a replication-incompetent MuLV-ERV called Emv2; however, this could not be confirmed owing to a lack of relevant annotation and Emv2 sequence information. The ERVmch8 sequence was more prevalent in laboratory strains than in wild-derived strains. Among 16 different tissues of ~12-week-old female C57BL/6J mice, brain homogenate was the only tissue with evident expression of ERVmch8. Further ERVmch8 expression analysis in six different brain compartments and four peripheral neuronal tissues of C57BL/6J mice revealed no significant expression except in the cerebellum, in which the low methylation status of the ERVmch8 locus was unique compared to the other brain compartments. The ERVmch8 locus was found to be surrounded by genes associated with neuronal development and/or inflammation. Interestingly, cerebellum-specific ERVmch8 expression was age-dependent, with almost no expression at 2 weeks and a plateau at 6 weeks. Conclusions: The ecotropic ERVmch8 locus on the C57BL/6J mouse genome was relatively undermethylated in the cerebellum, and its expression was cerebellum-specific and age-dependent.

  3. Learning-Based Just-Noticeable-Quantization-Distortion Modeling for Perceptual Video Coding.

    Science.gov (United States)

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

    Conventional predictive video coding-based approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computational complexity. As an alternative approach, perceptual video coding (PVC) has attempted to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JNDs were modeled by adding white Gaussian noise or specific signal patterns into the original images, which is not appropriate for finding JND thresholds for distortions that reduce signal energy. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models that can be applied as preprocessing for perceptual video coding. The two JNQD models can automatically adjust JND levels based on given quantization step sizes. One of the two JNQD models, called LR-JNQD, is based on linear regression and determines the model parameter for JNQD from extracted handcrafted features. The other JNQD model, called CNN-JNQD, is based on a convolutional neural network (CNN). To the best of our knowledge, ours is the first approach that automatically adjusts JND levels according to quantization step sizes for preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with the input without preprocessing applied.

  4. Inclusion of models to describe severe accident conditions in the fuel simulation code DIONISIO

    Energy Technology Data Exchange (ETDEWEB)

    Lemes, Martín; Soba, Alejandro [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina); Daverio, Hernando [Gerencia Reactores y Centrales Nucleares, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina); Denis, Alicia [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina)

    2017-04-15

    The simulation of fuel rod behavior is a complex task that demands not only accurate models to describe the numerous phenomena occurring in the pellet, cladding and internal rod atmosphere but also an adequate interconnection between them. In the last years several models have been incorporated to the DIONISIO code with the purpose of increasing its precision and reliability. After the regrettable events at Fukushima, the need for codes capable of simulating nuclear fuels under accident conditions has come forth. Heat removal occurs in a quite different way than during normal operation and this fact determines a completely new set of conditions for the fuel materials. A detailed description of the different regimes the coolant may exhibit in such a wide variety of scenarios requires a thermal-hydraulic formulation not suitable to be included in a fuel performance code. Moreover, there exist a number of reliable and famous codes that perform this task. Nevertheless, and keeping in mind the purpose of building a code focused on the fuel behavior, a subroutine was developed for the DIONISIO code that performs a simplified analysis of the coolant in a PWR, restricted to the more representative situations and provides to the fuel simulation the boundary conditions necessary to reproduce accidental situations. In the present work this subroutine is described and the results of different comparisons with experimental data and with thermal-hydraulic codes are offered. It is verified that, in spite of its comparative simplicity, the predictions of this module of DIONISIO do not differ significantly from those of the specific, complex codes.

  5. Relative Expression Levels Rather Than Specific Activity Plays the Major Role in Determining In Vivo AKT Isoform Substrate Specificity

    Directory of Open Access Journals (Sweden)

    Rachel S. Lee

    2011-01-01

    Full Text Available The AKT protooncogene mediates many cellular processes involved in normal development and disease states such as cancer. The three structurally similar isoforms, AKT1, AKT2, and AKT3, exhibit both functional redundancy and isoform-specific functions; however, the basis for their differential signalling remains unclear. Here we show that in vitro, purified AKT3 is ∼47-fold more active than AKT1 at phosphorylating peptide and protein substrates. Despite these marked variations in specific activity between the individual isoforms, a comprehensive analysis of phosphorylation of validated AKT substrates indicated only subtle differences in signalling via individual isoforms in vivo. Therefore, we hypothesise, at least in this model system, that relative tissue/cellular abundance, rather than specific activity, plays the dominant role in determining AKT substrate specificity in situ.

  6. RH-TRU Waste Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2007-07-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  7. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  8. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  9. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    International Nuclear Information System (INIS)

    Arndt, S.A.

    1997-01-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities

  10. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    Science.gov (United States)

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedure Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research, and reimbursement. The data collection for this study occurred at multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.

  11. The Impact of Disability and Social Determinants of Health on Condition-Specific Readmissions beyond Medicare Risk Adjustments: A Cohort Study.

    Science.gov (United States)

    Meddings, Jennifer; Reichert, Heidi; Smith, Shawna N; Iwashyna, Theodore J; Langa, Kenneth M; Hofer, Timothy P; McMahon, Laurence F

    2017-01-01

    Readmission rates after pneumonia, heart failure, and acute myocardial infarction hospitalizations are risk-adjusted for age, gender, and medical comorbidities and used to penalize hospitals. To assess the impact of disability and social determinants of health on condition-specific readmissions beyond current risk adjustment. Retrospective cohort study of Medicare patients using 1) linked Health and Retirement Study-Medicare claims data (HRS-CMS) and 2) Healthcare Cost and Utilization Project State Inpatient Databases (Florida, Washington) linked with ZIP Code-level measures from the Census American Community Survey (ACS-HCUP). Multilevel logistic regression models assessed the impact of disability and selected social determinants of health on readmission beyond current risk adjustment. Outcomes measured were readmissions ≤30 days after hospitalizations for pneumonia, heart failure, or acute myocardial infarction. HRS-CMS models included disability measures (activities of daily living [ADL] limitations, cognitive impairment, nursing home residence, home healthcare use) and social determinants of health (spouse, children, wealth, Medicaid, race). ACS-HCUP model measures were ZIP Code-percentage of residents ≥65 years of age with ADL difficulty, spouse, income, Medicaid, and patient-level and hospital-level race. For pneumonia, ≥3 ADL difficulties (OR 1.61, CI 1.079-2.391) and prior home healthcare needs (OR 1.68, CI 1.204-2.355) increased readmission in HRS-CMS models (N = 1631); ADL difficulties (OR 1.20, CI 1.063-1.352) and 'other' race (OR 1.14, CI 1.001-1.301) increased readmission in ACS-HCUP models (N = 27,297). For heart failure, children (OR 0.66, CI 0.437-0.984) and wealth (OR 0.53, CI 0.349-0.787) lowered readmission in HRS-CMS models (N = 2068), while black (OR 1.17, CI 1.056-1.292) and 'other' race (OR 1.14, CI 1.036-1.260) increased readmission in ACS-HCUP models (N = 37,612). For acute myocardial infarction, nursing home status

  12. An analysis of reproducibility and non-determinism in HEP software and ROOT data

    Science.gov (United States)

    Ivie, Peter; Zheng, Charles; Lannon, Kevin; Thain, Douglas

    2017-10-01

    Reproducibility is an essential component of the scientific method. In order to validate the correctness or facilitate the extension of a computational result, it should be possible to re-run a published result and verify that the same results are produced. However, reproducing a computational result is surprisingly difficult: non-determinism and other factors may make it impossible to get the same result, even when running the same code on the same machine on the same day. We explore this problem in the context of HEP codes and data, showing three high level methods for dealing with non-determinism in general: 1) Domain specific methods; 2) Domain specific comparisons; and 3) Virtualization adjustments. Using a CMS workflow with output data stored in ROOT files, we use these methods to prevent, detect, and eliminate some sources of non-determinism. We observe improved determinism using pre-determined random seeds, a predictable progression of system timestamps, and fixed process identifiers. Unfortunately, sources of non-determinism continue to exist despite the combination of all three methods. Hierarchical data comparisons also allow us to appropriately ignore some non-determinism when it is unavoidable. We conclude that there is still room for improvement, and identify directions that can be taken in each method to make an experiment more reproducible.
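
    The first of the three methods, pinning down domain-specific sources of non-determinism, is easy to illustrate in Python: fix every pseudo-random seed before the workflow runs so repeated executions produce identical output. The sketch below is generic, not CMS- or ROOT-specific.

```python
# Pin the common sources of pseudo-randomness so repeated runs are identical.
import os
import random

import numpy as np

SEED = 20171001

def pin_randomness(seed=SEED):
    # PYTHONHASHSEED only affects hash ordering if exported before Python starts.
    os.environ["PYTHONHASHSEED"] = str(seed)
    random.seed(seed)                      # Python's global RNG
    np.random.seed(seed)                   # legacy NumPy global RNG
    return np.random.default_rng(seed)     # preferred explicit generator

rng = pin_randomness()
print(rng.integers(0, 1_000_000, size=3))  # same three numbers on every run
```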

  13. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross-correlation (CC) and practical code lengths that support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also to suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed by simple algebraic means. Four sets of DEU code families, based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even), are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support longer spans with high data rates.

  14. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    Science.gov (United States)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  15. Tracking Code for Microwave Instability

    International Nuclear Information System (INIS)

    Heifets, S.; SLAC

    2006-01-01

    To study the microwave instability, a tracking code has been developed. For benchmarking, results are compared with the Oide-Yokoya results [1] for a broad-band Q = 1 impedance. The results hint at two possible mechanisms determining the threshold of instability

  16. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list decoding algorithm for matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units.
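
    For readers unfamiliar with the construction, a matrix-product code $[C_1,...,C_s]\cdot A$ maps one codeword from each constituent code through the matrix $A$, block by block. The toy sketch below uses two small nested binary codes and a hypothetical 2x2 non-singular-by-columns matrix (not the Reed-Solomon constituents of the paper) to show the encoding step over GF(2).

      import numpy as np

      # Generator matrices of two nested binary codes C2 ⊆ C1 of length 4:
      # C1 = even-weight [4,3] code, C2 = repetition [4,1] code.
      G1 = np.array([[1, 0, 0, 1],
                     [0, 1, 0, 1],
                     [0, 0, 1, 1]], dtype=int)
      G2 = np.array([[1, 1, 1, 1]], dtype=int)

      # Hypothetical non-singular-by-columns matrix A over GF(2).
      A = np.array([[1, 1],
                    [0, 1]], dtype=int)

      def encode(m1, m2):
          """Encode messages m1, m2 into a codeword of the matrix-product code [C1, C2]·A."""
          c1 = np.mod(np.array(m1) @ G1, 2)   # codeword of C1
          c2 = np.mod(np.array(m2) @ G2, 2)   # codeword of C2
          # Block j of the result is c1*A[0,j] + c2*A[1,j] (mod 2); blocks are concatenated.
          blocks = [np.mod(c1 * A[0, j] + c2 * A[1, j], 2) for j in range(A.shape[1])]
          return np.concatenate(blocks)

      print(encode([1, 0, 1], [1]))   # a length-8 codeword over GF(2)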

  17. TSOAK-M1: a computer code to determine tritium reaction/adsorption/release parameters from experimental results of air-detritiation tests

    International Nuclear Information System (INIS)

    Land, R.H.; Maroni, V.A.; Minkoff, M.

    1979-01-01

    A computer code has been developed which permits the determination of tritium reaction (T₂ to HTO)/adsorption/release and instrument correction parameters from enclosure (building) detritiation test data. The code is based on a simplified model which treats each parameter as a normalized time-independent constant throughout the data-unfolding steps. Because of the complicated four-dimensional mathematical surface generated by the resulting differential equation system, occasional local-minima effects are observed, but these effects can be overcome in most instances by selecting a series of trial guesses for the initial parameter values and observing the reproducibility of final parameter values for cases where the best overall fit to experimental data is achieved. The code was then used to analyze existing small-cubicle test data with good success, and the resulting normalized parameters were employed to evaluate hypothetical reactor-building detritiation scenarios. It was concluded from the latter evaluation that the complications associated with moisture formation, adsorption, and release, particularly in terms of extended cleanup times, may not be as great as was previously thought. It is recommended that the validity of the TSOAK-M1 model be tested using data from detritiation tests conducted on large experimental enclosures (5 to 10 cm³) and, if possible, actual facility buildings
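
    The parameter-unfolding step described above is, in essence, a least-squares fit of a small set of rate constants to a measured tritium concentration history. The sketch below is not the TSOAK-M1 model: it fits a simple two-parameter cleanup-plus-release model to invented data with scipy, purely to illustrate the kind of unfolding involved and why repeated trials from different initial guesses help avoid local minima.

      import numpy as np
      from scipy.optimize import curve_fit

      def airborne_tritium(t, k_clean, s_release):
          """Hypothetical model: first-order cleanup with a constant re-release source.
          C(t) = (C0 - s/k) * exp(-k t) + s/k, with C0 fixed at 1.0 (normalized)."""
          c0 = 1.0
          return (c0 - s_release / k_clean) * np.exp(-k_clean * t) + s_release / k_clean

      # Hypothetical measurements (hours, normalized airborne concentration).
      t_meas = np.array([0, 1, 2, 4, 8, 16, 24], dtype=float)
      c_meas = np.array([1.00, 0.62, 0.41, 0.22, 0.12, 0.09, 0.08])

      # Try several initial guesses and keep the fit with the smallest residual,
      # mirroring the report's strategy of checking reproducibility of the result.
      best = None
      for guess in [(0.1, 0.001), (0.5, 0.01), (1.0, 0.05)]:
          try:
              popt, _ = curve_fit(airborne_tritium, t_meas, c_meas, p0=guess)
          except RuntimeError:
              continue
          resid = np.sum((airborne_tritium(t_meas, *popt) - c_meas) ** 2)
          if best is None or resid < best[1]:
              best = (popt, resid)

      print("fitted (k_clean, s_release):", best[0], "SSE:", best[1])

    In the real code the model couples reaction, adsorption and release in four parameters, so the surface being searched is correspondingly more complicated than this two-parameter example.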

  18. Code Help: Can This Unique State Regulatory Intervention Improve Emergency Department Crowding?

    Science.gov (United States)

    Michael, Sean S; Broach, John P; Kotkowski, Kevin A; Brush, D Eric; Volturo, Gregory A; Reznek, Martin A

    2018-05-01

    Emergency department (ED) crowding adversely affects multiple facets of high-quality care. The Commonwealth of Massachusetts mandates specific, hospital action plans to reduce ED boarding via a mechanism termed "Code Help." Because implementation appears inconsistent even when hospital conditions should have triggered its activation, we hypothesized that compliance with the Code Help policy would be associated with reduction in ED boarding time and total ED length of stay (LOS) for admitted patients, compared to patients seen when the Code Help policy was not followed. This was a retrospective analysis of data collected from electronic, patient-care, timestamp events and from a prospective Code Help registry for consecutive adult patients admitted from the ED at a single academic center during a 15-month period. For each patient, we determined whether the concurrent hospital status complied with the Code Help policy or violated it at the time of admission decision. We then compared ED boarding time and overall ED LOS for patients cared for during periods of Code Help policy compliance and during periods of Code Help policy violation, both with reference to patients cared for during normal operations. Of 89,587 adult patients who presented to the ED during the study period, 24,017 (26.8%) were admitted to an acute care or critical care bed. Boarding time ranged from zero to 67 hours 30 minutes (median 4 hours 31 minutes). Total ED LOS for admitted patients ranged from 11 minutes to 85 hours 25 minutes (median nine hours). Patients admitted during periods of Code Help policy violation experienced significantly longer boarding times (median 20 minutes longer) and total ED LOS (median 46 minutes longer), compared to patients admitted under normal operations. However, patients admitted during Code Help policy compliance did not experience a significant increase in either metric, compared to normal operations. In this single-center experience, implementation of the

  19. Specific model for a gas distribution analysis in the containment at Almaraz NPP using GOTHIC computer code

    International Nuclear Information System (INIS)

    García González, M.; García Jiménez, P.; Martínez Domínguez, F.

    2016-01-01

    To carry out an analysis of the distribution of gases within the containment building at the CN Almaraz site, a simulation model built with the thermohydraulic GOTHIC [1] code has been used. This has been assessed with a gas control system based on passive autocatalytic recombiners (PARs). The model is used to test the effectiveness of the gas control systems to be used in the Almaraz Nuclear Power Plant, Units I&II (Caceres, Spain, 1,035 MW and 1,044 MW). The model must confirm the location and number of the recombiners proposed to be installed. It is an essential function of the gas control system to avoid any formation of explosive atmospheres by reducing and limiting the concentration of combustible gases during an accident, thus maintaining the integrity of the containment. The model considers severe accident scenarios with specific conditions that produce the most onerous generation of combustible gases.

  20. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions

  1. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  2. Moderate sensitivity and high specificity of emergency department administrative data for transient ischemic attacks.

    Science.gov (United States)

    Yu, Amy Y X; Quan, Hude; McRae, Andrew; Wagner, Gabrielle O; Hill, Michael D; Coutts, Shelagh B

    2017-09-18

    Validation of administrative data case definitions is key for accurate passive surveillance of disease. Transient ischemic attack (TIA) is a condition primarily managed in the emergency department. However, prior validation studies have focused on data after inpatient hospitalization. We aimed to determine the validity of the Canadian 10th International Classification of Diseases (ICD-10-CA) codes for TIA in the national ambulatory administrative database. We performed a diagnostic accuracy study of four ICD-10-CA case definition algorithms for TIA in the emergency department setting. The study population was obtained from two ongoing studies on the diagnosis of TIA and minor stroke versus stroke mimic using serum biomarkers and neuroimaging. Two reference standards were used 1) the emergency department clinical diagnosis determined by chart abstractors and 2) the 90-day final diagnosis, both obtained by stroke neurologists, to calculate the sensitivity, specificity, positive and negative predictive values (PPV and NPV) of the ICD-10-CA algorithms for TIA. Among 417 patients, emergency department adjudication showed 163 (39.1%) TIA, 155 (37.2%) ischemic strokes, and 99 (23.7%) stroke mimics. The most restrictive algorithm, defined as a TIA code in the main position had the lowest sensitivity (36.8%), but highest specificity (92.5%) and PPV (76.0%). The most inclusive algorithm, defined as a TIA code in any position with and without query prefix had the highest sensitivity (63.8%), but lowest specificity (81.5%) and PPV (68.9%). Sensitivity, specificity, PPV, and NPV were overall lower when using the 90-day diagnosis as reference standard. Emergency department administrative data reflect diagnosis of suspected TIA with high specificity, but underestimate the burden of disease. Future studies are necessary to understand the reasons for the low to moderate sensitivity.
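
    For reference, the four accuracy measures reported in this study are simple functions of the 2x2 table obtained by cross-tabulating the ICD-10-CA algorithm against the reference standard. The sketch below uses made-up counts, not the study's data, and simply shows the computation.

      def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
          """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
          return {
              "sensitivity": tp / (tp + fn),   # true cases correctly flagged by the code
              "specificity": tn / (tn + fp),   # non-cases correctly left unflagged
              "ppv": tp / (tp + fp),           # flagged records that are true cases
              "npv": tn / (tn + fn),           # unflagged records that are true non-cases
          }

      # Hypothetical counts for an ICD-code algorithm vs. the chart-review reference.
      print(diagnostic_accuracy(tp=60, fp=19, fn=103, tn=235))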

  3. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.

  4. A toroidal plasma MHD equilibrium code 'EQUCIR version 1'

    International Nuclear Information System (INIS)

    Ninomiya, Hiromasa; Shinya, Kichiro; Kameari, Akihisa.

    1980-10-01

    A new free-boundary toroidal MHD equilibrium code ''EQUCIR version 1'' has been developed. The central problems addressed by this code are as follows: 1) The magnetic flux distribution of a plasma at equilibrium is determined in a given external field. 2) A set of circuit equations between the plasma and the external conductors is constructed. These circuit equations and the Grad-Shafranov equation are solved self-consistently, and the time evolution of the plasma equilibria and of the currents in the external conductors is determined at the same time. 3) The currents in the external conductors are determined so that the plasma cross-section and plasma parameters are maintained at the desired values. It is shown that this code is very useful for the study of tokamak plasma equilibria, for the design of the poloidal coil system and for the investigation of experimental results. (author)

  5. A simple in-surge pressure analysis using the SPACE code

    International Nuclear Information System (INIS)

    Youn, Bum Soo; Kim, Yo Han; Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2010-01-01

    Currently, the nuclear safety analysis codes used in Korea were all developed overseas. These codes require substantial licence fees, and permission must be obtained for their use in the country. In addition, securing orders for nuclear power plants requires a safety analysis code based on independent domestic technology. Therefore, the Korea Electric Power Research Institute (KEPRI) is developing a domestic nuclear safety analysis code, SPACE (Safety and Performance Analysis Code for nuclear power plants). To assess the computational capability of the pressurizer model in the SPACE code under development, its results were compared with those of an existing commercial nuclear safety analysis code, RETRAN

  6. The insular taste cortex contributes to odor quality coding

    Directory of Open Access Journals (Sweden)

    Maria G Veldhuizen

    2010-07-01

    Despite distinct peripheral and central pathways, stimulation of both the olfactory and the gustatory systems may give rise to the sensation of sweetness. Whether there is a common central mechanism producing sweet quality sensations or two discrete mechanisms associated independently with gustatory and olfactory stimuli is currently unknown. Here we used fMRI to determine whether odor sweetness is represented in the piriform olfactory cortex, which is thought to code odor quality, or in the insular taste cortex, which is thought to code taste quality. Fifteen participants sampled two concentrations of a pure sweet taste (sucrose), two sweet food odors (chocolate and strawberry), and two sweet floral odors (lilac and rose). Replicating prior work, we found that olfactory stimulation activated the piriform, orbitofrontal and insular cortices. Of these regions, only the insula also responded to sweet taste. More importantly, the magnitude of the response to the food odors, but not to the non-food odors, in this region of the insula was positively correlated with odor sweetness rating. These findings demonstrate that the insular taste cortex contributes to odor quality coding by representing the taste-like aspects of food odors. Since the effect was specific to the food odors, and only food odors are experienced with taste, we suggest this common central mechanism develops as a function of experiencing flavors.

  7. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
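
    The error-correction experiments described here reduce, at their simplest, to decoding a noisy binary response pattern to the nearest codeword in Hamming distance. A minimal sketch of that decoder is given below, using an arbitrary toy codebook rather than the receptive field codes analysed in the paper.

      import numpy as np

      def hamming_decode(word: np.ndarray, codebook: np.ndarray) -> np.ndarray:
          """Return the codeword in `codebook` closest to `word` in Hamming distance."""
          dists = np.sum(codebook != word, axis=1)
          return codebook[np.argmin(dists)]

      # Toy binary code with 4 codewords of length 6 (illustrative only).
      codebook = np.array([
          [0, 0, 0, 0, 0, 0],
          [1, 1, 1, 0, 0, 0],
          [0, 0, 0, 1, 1, 1],
          [1, 1, 1, 1, 1, 1],
      ])

      received = np.array([1, 0, 1, 0, 0, 0])     # one bit flipped from the second codeword
      print(hamming_decode(received, codebook))   # -> [1 1 1 0 0 0]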

  8. The one-dimensional transport code CHET2, taking into account nonlinear, element-specific equilibrium sorption

    International Nuclear Information System (INIS)

    Luehrmann, L.; Noseck, U.

    1996-03-01

    While the verification report on CHET1 primarily focused on aspects such as the correctness of the algorithms with respect to the modeling of advection, dispersion and diffusion, the report in hand deals primarily with nonlinear sorption and numerical sorption modeling. Another aspect discussed is the correct treatment of decay within established radioactive decay chains. First, the physical fundamentals of the processes determining radionuclide transport in the cap rock, which form the basis of the program, are explained. The numerical algorithms on which the CHET2 code is based are then explained, showing the details of their realisation and the function of the various defaults and corrections. The iterative coupling of the transport and sorption computations is illustrated by means of a program flowchart. Furthermore, the activities for verification of the program are described, as well as the qualitative effects of computations assuming concentration-dependent sorption. The computation of decay within decay chains is verified, and the application of the program using nonlinear sorption isotherms as well as the entire process of transport calculations with CHET2 are shown. (orig./DG) [de

  9. Error Correcting Codes -34 ...

    Indian Academy of Sciences (India)

    information and coding theory. A large scale relay computer had failed to deliver the expected results due to a hardware fault. Hamming, one of the active proponents of computer usage, was determined to find an efficient means by which computers could detect and correct their own faults. A mathematician by train-.

  10. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM) codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
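
    In a systematic LDGM code the codeword is simply the message followed by parity bits produced by a sparse generator matrix. The fragment below illustrates this with a small random sparse matrix; it is an illustrative toy, not the concatenated construction proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      k, m, row_weight = 8, 4, 3    # message bits, parity bits, ones per parity row

      # Sparse 'low-density generator' part: each parity bit checks `row_weight` message bits.
      P = np.zeros((m, k), dtype=int)
      for i in range(m):
          P[i, rng.choice(k, size=row_weight, replace=False)] = 1

      def ldgm_encode(message: np.ndarray) -> np.ndarray:
          """Systematic encoding: codeword = [message | P @ message mod 2]."""
          parity = np.mod(P @ message, 2)
          return np.concatenate([message, parity])

      msg = rng.integers(0, 2, size=k)
      print("message :", msg)
      print("codeword:", ldgm_encode(msg))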

  11. Extensions of the 3-dimensional plasma transport code E3D

    International Nuclear Information System (INIS)

    Runov, A.; Schneider, R.; Kasilov, S.; Reiter, D.

    2004-01-01

    One important aspect of modern fusion research is plasma edge physics. Fluid transport codes extending beyond the standard 2-D code packages like B2-Eirene or UEDGE are under development. A 3-dimensional plasma fluid code, E3D, based upon the Multiple Coordinate System Approach and a Monte Carlo integration procedure, has been developed for general magnetic configurations including ergodic regions. These local magnetic coordinates lead to a full metric tensor which accurately accounts for all transport terms in the equations. Here, we discuss new computational aspects of the realization of the algorithm. The main limitation to the Monte Carlo code efficiency comes from the restriction on the parallel jump of advancing test particles, which must be small compared to the gradient length of the diffusion coefficient. In our problems, the parallel diffusion coefficient depends on both plasma and magnetic field parameters. Usually, the second dependence is much more critical. In order to allow long parallel jumps, this dependence can be eliminated in two steps: first, the longitudinal coordinate x³ of local magnetic coordinates is modified in such a way that in the new coordinate system the metric determinant and contra-variant components of the magnetic field scale along the magnetic field with powers of the magnetic field module (like in Boozer flux coordinates). Second, specific weights of the test particles are introduced. As a result of the increased parallel jump length, the efficiency of the code is about two orders of magnitude better. (copyright 2004 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  12. NASA Lewis steady-state heat pipe code users manual

    International Nuclear Information System (INIS)

    Tower, L.K.

    1992-06-01

    The NASA Lewis heat pipe code has been developed to predict the performance of heat pipes in the steady state. The code can be used as a design tool on a personal computer or, with a suitable calling routine, as a subroutine for a mainframe radiator code. A variety of wick structures, including a user input option, can be used. Heat pipes with multiple evaporators, condensers, and adiabatic sections in series and with wick structures that differ among sections can be modeled. Several working fluids can be chosen, including potassium, sodium, and lithium, for which the monomer-dimer equilibrium is considered. The code incorporates a vapor flow algorithm that treats compressibility and axially varying heat input. This code facilitates the determination of heat pipe operating temperatures and heat pipe limits that may be encountered at the specified heat input and environment temperature. Data are input to the computer through a user-interactive input subroutine. Output, such as liquid and vapor pressures and temperatures, is printed at equally spaced axial positions along the pipe as determined by the user

  13. Typical performance of regular low-density parity-check codes over general symmetric channels

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Toshiyuki [Department of Electronics and Information Engineering, Tokyo Metropolitan University, 1-1 Minami-Osawa, Hachioji-shi, Tokyo 192-0397 (Japan); Saad, David [Neural Computing Research Group, Aston University, Aston Triangle, Birmingham B4 7ET (United Kingdom)

    2003-10-31

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.
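
    Shannon's bound referred to above is, for the first of these channel models, the capacity of the binary-input AWGN channel, which has no closed form but is easy to evaluate numerically. The snippet below estimates it by Monte Carlo integration; it illustrates the benchmark quantity only, not the statistical-mechanics calculation in the paper.

      import numpy as np

      def biawgn_capacity(sigma: float, n_samples: int = 200_000, seed: int = 1) -> float:
          """Monte Carlo estimate of binary-input AWGN capacity (bits per channel use).

          BPSK signalling x in {+1, -1}; y = x + n with n ~ N(0, sigma^2).
          For equiprobable inputs, C = 1 - E[log2(1 + exp(-2*y/sigma^2))] with y | x = +1.
          """
          rng = np.random.default_rng(seed)
          y = 1.0 + sigma * rng.standard_normal(n_samples)
          return 1.0 - np.mean(np.log2(1.0 + np.exp(-2.0 * y / sigma**2)))

      for sigma in (0.5, 0.8, 1.0, 1.5):
          print(f"sigma = {sigma:>4}: C ~ {biawgn_capacity(sigma):.3f} bits/use")

    A code of rate R is only potentially error-free when R stays below this capacity, which is the sense in which "saturating Shannon's bound" is used above.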

  14. Typical performance of regular low-density parity-check codes over general symmetric channels

    International Nuclear Information System (INIS)

    Tanaka, Toshiyuki; Saad, David

    2003-01-01

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models

  15. Determinants of intra-specific variation in basal metabolic rate.

    Science.gov (United States)

    Konarzewski, Marek; Książek, Aneta

    2013-01-01

    Basal metabolic rate (BMR) provides a widely accepted benchmark of metabolic expenditure for endotherms under laboratory and natural conditions. While most studies examining BMR have concentrated on inter-specific variation, relatively less attention has been paid to the determinants of within-species variation. Even fewer studies have analysed the determinants of within-species BMR variation corrected for the strong influence of body mass by appropriate means (e.g. ANCOVA). Here, we review recent advancements in studies on the quantitative genetics of BMR and organ mass variation, along with their molecular genetics. Next, we decompose BMR variation at the organ, tissue and molecular level. We conclude that within-species variation in BMR and its components have a clear genetic signature, and are functionally linked to key metabolic process at all levels of biological organization. We highlight the need to integrate molecular genetics with conventional metabolic field studies to reveal the adaptive significance of metabolic variation. Since comparing gene expressions inter-specifically is problematic, within-species studies are more likely to inform us about the genetic underpinnings of BMR. We also urge for better integration of animal and medical research on BMR; the latter is quickly advancing thanks to the application of imaging technologies and 'omics' studies. We also suggest that much insight on the biochemical and molecular underpinnings of BMR variation can be gained from integrating studies on the mammalian target of rapamycin (mTOR), which appears to be the major regulatory pathway influencing the key molecular components of BMR.

  16. Locally Minimum Storage Regenerating Codes in Distributed Cloud Storage Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Wei Luo; Wei Liang; Xiangyang Liu; Xiaodai Dong

    2017-01-01

    In distributed cloud storage systems, inevitably there exist multiple node failures at the same time. The existing methods of regenerating codes, including minimum storage regenerating (MSR) codes and minimum bandwidth regenerating (MBR) codes, are mainly designed to repair a single or several failed nodes and are unable to meet the repair needs of distributed cloud storage systems. In this paper, we present locally minimum storage regenerating (LMSR) codes to recover multiple failed nodes at the same time. Specifically, the nodes in distributed cloud storage systems are divided into multiple local groups, and in each local group (4, 2) or (5, 3) MSR codes are constructed. Moreover, the grouping method of storage nodes and the repair process of failed nodes in local groups are studied. Theoretical analysis shows that LMSR codes can achieve the same storage overhead as MSR codes. Furthermore, we verify by means of simulation that, compared with MSR codes, LMSR codes can reduce the repair bandwidth and disk I/O overhead effectively.

  17. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  18. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of the OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and the post-processor code USAMO. The submodels in the waste disposal system were described and coded according to the specification of the exercise. Besides the results required for the exercise, additional uncertainty and sensitivity analyses were performed, and the details of these are also included. (author)

  19. Determination of the radioactive inventory of a fuel assembly from a U3O8 design core using ORIGEN 2.1 code

    International Nuclear Information System (INIS)

    Castro, Jose; Ticona, Braulio; Madariaga, Marcelo

    2014-01-01

    This paper presents a methodology to determine the radioactive inventory of a fuel assembly of the RP-10 design core, proposed in 1988, using the ORIGEN 2.1 code. The code allows the activity of the 52 most characteristic fission products to be determined, together with its growth during reactor operation under the design conditions and the decay of the fission products from 4 hours after reactor shutdown. Conservatively, a fuel element is taken to represent an average fraction of the considered power in the radioactive inventory assessment. (authors).
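
    The decay part of such an inventory assessment is, at its core, the application of A(t) = A0·exp(-λt) to each nuclide, with λ = ln2/T½. The short sketch below uses invented shutdown activities (not the RP-10 inventory) to compute residual activities 4 hours after shutdown; unlike ORIGEN, it ignores in-growth from precursors in decay chains.

      import math

      HOUR = 3600.0  # seconds

      # Hypothetical shutdown activities (Bq) and half-lives for a few fission products.
      inventory = {
          "I-131":  {"A0": 5.0e14, "t_half": 8.02 * 24 * HOUR},
          "Xe-133": {"A0": 6.0e14, "t_half": 5.25 * 24 * HOUR},
          "I-134":  {"A0": 7.0e14, "t_half": 52.5 * 60.0},
      }

      def activity(a0: float, t_half: float, t: float) -> float:
          """Radioactive decay: A(t) = A0 * exp(-ln2 * t / t_half)."""
          return a0 * math.exp(-math.log(2.0) * t / t_half)

      t = 4 * HOUR  # 4 hours after shutdown
      for nuclide, data in inventory.items():
          a = activity(data["A0"], data["t_half"], t)
          print(f"{nuclide:>6}: {a:.3e} Bq ({100 * a / data['A0']:.1f}% of shutdown value)")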

  20. Elucidating Article 45.6 of the International Code of Zoological Nomenclature: a dichotomous key for the determination of subspecific or infrasubspecific rank.

    Science.gov (United States)

    Lingafelter, Steven W; Nearns, Eugenio H

    2013-01-01

    We present an overview of the difficulties sometimes encountered when determining whether a published name following a binomen is available or infrasubspecific and unavailable, following Article 45.6 of the International Code of Zoological Nomenclature (ICZN, 1999). We propose a dichotomous key that facilitates this determination and offers a preferable method, given the convoluted and subordinate discussion, exceptions, and qualifications laid out in ICZN (1999: 49-50). Examples and citations are provided for each case one can encounter while assessing the availability status of names following the binomen.

  1. Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD

    Energy Technology Data Exchange (ETDEWEB)

    Trambauer, K. [GRS, Garching (Germany)

    1997-07-01

    The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, as well as fission product and aerosol release from the core and their transport in the Reactor Coolant System. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs including RBMKs. The ATHLET-CD structure is highly modular in order to include a manifold spectrum of models and to offer an optimum basis for further development. The code consists of four general modules to describe the reactor coolant system thermal-hydraulics, the core degradation, the fission product core release, and fission product and aerosol transport. Each general module consists of some basic modules which correspond to the process to be simulated or to its specific purpose. Besides the code structure based on the physical modelling, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, initial and boundary conditions, (2) initialization of derived quantities, (3) steady state calculation or input of restart data, and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered for the coupling of different modules in one code system. The first is the conservation of mass and energy in the different subsystems: fluid, structures, and fission products and aerosols. The second is the convergence of the numerical solution and the stability of the calculation. The third relates to code performance and running time.

  2. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress made by the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  3. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme that is chosen through offline statistics is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BDPSNR is observed for 18 sequences when the target complexity is around 40%.

  4. A theoretical model to determine the capacity performance of shape-specific electrodes

    Science.gov (United States)

    Yue, Yuan; Liang, Hong

    2018-06-01

    A theory is proposed to explain and predict the electrochemical process during the reaction between lithium ions and electrode materials. In the model, the reaction is divided into two steps: surface adsorption and diffusion of lithium ions. Surface adsorption is an instantaneous process in which lithium ions adsorb onto the surface sites of the active materials. The diffusion of lithium ions into the particles is determined by the charge-discharge condition. A formula to determine the maximum specific capacity of active materials at different charging rates (C-rates) is derived. The maximum specific capacity is correlated with characteristic parameters of the materials and of cycling, such as size, aspect ratio, surface area, and C-rate. Analysis indicates that a larger particle size or greater aspect ratio of the active materials and faster C-rates reduce the maximum specific capacity. This suggests that reducing the particle size of active materials and slowing the charge-discharge speed can provide enhanced electrochemical performance of a battery cell. Furthermore, the model is validated against published experimental results. This model brings new understanding to the quantification of electrochemical kinetics and capacity performance. It enables the development of design strategies for novel electrodes and future generations of energy storage devices.

  5. On transform coding tools under development for VP10

    Science.gov (United States)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.

  6. Determination of IgE antibodies to the benzylpenicilloyl determinant: a comparison of the sensitivity and specificity of three radio allergo sorbent test methods.

    Science.gov (United States)

    Garcia, J J; Blanca, M; Moreno, F; Vega, J M; Mayorga, C; Fernandez, J; Juarez, C; Romano, A; de Ramon, E

    1997-01-01

    The quantitation of in vitro IgE antibodies to the benzylpenicilloyl determinant (BPO) is a useful tool for evaluating suspected penicillin-allergic subjects. Although many different methods have been employed, few studies have compared their diagnostic specificity and sensitivity. In this study, the sensitivity and specificity of three different radioallergosorbent test (RAST) methods for quantitating specific IgE antibodies to the BPO determinant were compared. Thirty positive control sera (serum samples from penicillin-allergic subjects with a positive clinical history and a positive penicillin skin test) and 30 negative control sera (sera from subjects with no history of penicillin allergy and negative skin tests) were tested for BPO-specific IgE antibodies by RAST using three different conjugates coupled to the solid phase: benzylpenicillin conjugated to polylysine (BPO-PLL), benzylpenicillin conjugated to human serum albumin (BPO-HSA), and benzylpenicillin conjugated to an aminospacer (BPO-SP). Receiver operating characteristic (ROC) curve analysis was carried out by determining different cut-off points between positive and negative values. Contingency tables were constructed and sensitivity, specificity, negative predictive values (PV-), and positive predictive values (PV+) were calculated. Pearson correlation coefficients (r) and intraclass correlation coefficients (ICC) were determined and the differences between methods were compared by chi-square analysis. Analysis of the areas defined by the ROC curves showed statistical differences among the three methods. When cut-off points for optimal sensitivity and specificity were chosen, the BPO-HSA assay was less sensitive and less specific and had a lower PV- and PV+ than the BPO-PLL and BPO-SP assays. Assessment of r and ICC indicated that the correlation was very high, but the concordance between the PLL and SP methods was higher than between the PLL and HSA or SP and HSA methods. We conclude that for quantitating Ig

  7. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend the code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES are implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. It is coded in Fortran 90 and has some user-friendly features such as a graphical user interface. The MATRA code was approved by the Korean regulatory body for design calculations of the integral-type PWR SMART. The major role of a subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it can potentially be used for the best estimation of the core thermal-hydraulic field by incorporation into multiphysics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement at a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach revealed similar uncertainties but did not account for the nonlinear effects on the
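
    As a schematic of the Monte-Carlo approach to validation uncertainty mentioned above, the fragment below propagates assumed input and measurement uncertainties through a stand-in model and reports the mean and standard deviation of the prediction error. The model, boundary conditions and numbers are invented for illustration; nothing here is taken from MATRA or the UAM benchmark.

      import numpy as np

      rng = np.random.default_rng(7)

      def void_fraction_model(power_kw, flow_kg_s):
          """Stand-in for the subchannel code: a made-up correlation, not MATRA."""
          return np.clip(0.05 + 0.004 * power_kw - 0.02 * flow_kg_s, 0.0, 1.0)

      measured_void = 0.42        # hypothetical measured void fraction
      sigma_meas    = 0.02        # hypothetical measurement standard uncertainty

      n = 50_000
      power = rng.normal(100.0, 2.0, n)     # assumed uncertain boundary conditions
      flow  = rng.normal(1.5, 0.05, n)

      predicted = void_fraction_model(power, flow)
      error = predicted - rng.normal(measured_void, sigma_meas, n)

      print(f"mean error (bias)         : {error.mean():+.4f}")
      print(f"validation std uncertainty: {error.std(ddof=1):.4f}")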

  8. Maternal provision of non-sex-specific transformer messenger RNA in sex determination of the wasp Asobara tabida.

    Science.gov (United States)

    Geuverink, E; Verhulst, E C; van Leussen, M; van de Zande, L; Beukeboom, L W

    2018-02-01

    In many insect species maternal provision of sex-specifically spliced messenger RNA (mRNA) of sex determination genes is an essential component of the sex determination mechanism. In haplodiploid Hymenoptera, maternal provision in combination with genomic imprinting has been shown for the parasitoid Nasonia vitripennis, known as maternal effect genomic imprinting sex determination (MEGISD). Here, we characterize the sex determination cascade of Asobara tabida, another hymenopteran parasitoid. We show the presence of the conserved sex determination genes doublesex (dsx), transformer (tra) and transformer-2 (tra2) orthologues in As. tabida. Of these, At-dsx and At-tra are sex-specifically spliced, indicating a conserved function in sex determination. At-tra and At-tra2 mRNA is maternally provided to embryos but, in contrast to most studied insects, As. tabida females transmit a non-sex-specific splice form of At-tra mRNA to the eggs. In this respect, As. tabida sex determination differs from the MEGISD mechanism. How the paternal genome can induce female development in the absence of maternal provision of sex-specifically spliced mRNA remains an open question. Our study reports a hitherto unknown variant of maternal effect sex determination and accentuates the diversity of insect sex determination mechanisms. © 2017 The Authors. Insect Molecular Biology published by John Wiley & Sons Ltd on behalf of Royal Entomological Society.

  9. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes, as well as to provide data for the licensing and construction of LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments are to a large extent dependent on calculations, senior experience and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core; for natural reasons, the precision declines with distance. Even if the activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time, Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining the unique knowledge in radioactivity inventory assessment with the large data bank that the waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use the waste processing data for validation of activity determination codes. (authors)

  10. RBMK-LOCA-Analyses with the ATHLET-Code

    Energy Technology Data Exchange (ETDEWEB)

    Petry, A. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH Kurfuerstendamm, Berlin (Germany); Domoradov, A.; Finjakin, A. [Research and Development Institute of Power Engineering, Moscow (Russian Federation)

    1995-09-01

    The scientific and technical cooperation between Germany and Russia includes the adaptation of several German codes for the Russian-designed RBMK reactor. One element of this cooperation is the adaptation of the thermal-hydraulic code ATHLET (Analyses of the Thermal-Hydraulics of LEaks and Transients) to RBMK-specific safety problems. This paper contains a short description of an RBMK-1000 reactor circuit. Furthermore, the main features of the thermal-hydraulic code ATHLET are presented, and the main assumptions of the ATHLET-RBMK model are discussed. As an example of the application, the results of test calculations concerning a guillotine-type rupture of a distribution group header are presented and discussed, and the general analysis conditions are described. A comparison with corresponding RELAP calculations is given. The paper gives an overview of some of the problems posed by, and the experience gained from, the application of Western best-estimate codes to RBMK calculations.

  11. Identifying and acting on potentially inappropriate care? Inadequacy of current hospital coding for this task.

    Science.gov (United States)

    Cooper, P David; Smart, David R

    2017-06-01

    Recent Australian attempts to facilitate disinvestment in healthcare, by identifying instances of 'inappropriate' care from large Government datasets, are subject to significant methodological flaws. Amongst other criticisms has been the fact that the Government datasets utilized for this purpose correlate poorly with datasets collected by relevant professional bodies. Government data derive from official hospital coding, collected retrospectively by clerical personnel, whilst professional body data derive from unit-specific databases, collected contemporaneously with care by clinical personnel. Assessment of accuracy of official hospital coding data for hyperbaric services in a tertiary referral hospital. All official hyperbaric-relevant coding data submitted to the relevant Australian Government agencies by the Royal Hobart Hospital, Tasmania, Australia for financial year 2010-2011 were reviewed and compared against actual hyperbaric unit activity as determined by reference to original source documents. Hospital coding data contained one or more errors in diagnoses and/or procedures in 70% of patients treated with hyperbaric oxygen that year. Multiple discrete error types were identified, including (but not limited to): missing patients; missing treatments; 'additional' treatments; 'additional' patients; incorrect procedure codes and incorrect diagnostic codes. Incidental observations of errors in surgical, anaesthetic and intensive care coding within this cohort suggest that the problems are not restricted to the specialty of hyperbaric medicine alone. Publications from other centres indicate that these problems are not unique to this institution or State. Current Government datasets are irretrievably compromised and not fit for purpose. Attempting to inform the healthcare policy debate by reference to these datasets is inappropriate. Urgent clinical engagement with hospital coding departments is warranted.

  12. MatMCNP: A Code for Producing Material Cards for MCNP

    Energy Technology Data Exchange (ETDEWEB)

    DePriest, Kendall Russell [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Saavedra, Karen C. [American Structurepoint, Inc., Indianapolis, IN (United States)

    2014-09-01

    A code for generating MCNP material cards (MatMCNP) has been written and verified for naturally occurring, stable isotopes. The program allows for material specification as either atomic or weight percent (fractions). MatMCNP also permits the specification of enriched lithium, boron, and/or uranium. In addition to producing the material cards for MCNP, the code calculates the atomic (or number) density in atoms/barn-cm as well as the multiplier that should be used to convert neutron and gamma fluences into dose in the material specified.
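
    The atom-density conversion mentioned above follows directly from Avogadro's number: N [atoms/barn-cm] = ρ·N_A/M × 10⁻²⁴. The sketch below reproduces that arithmetic for a single-element material; it is an illustration of the formula, not an excerpt from MatMCNP.

      AVOGADRO = 6.02214076e23   # atoms per mole
      BARN_CM = 1.0e-24          # converts atoms/cm^3 to atoms/(barn*cm)

      def atom_density(density_g_cm3: float, atomic_mass_g_mol: float) -> float:
          """Number density in atoms/(barn*cm): rho * N_A / M * 1e-24."""
          return density_g_cm3 * AVOGADRO / atomic_mass_g_mol * BARN_CM

      # Example: natural iron (density 7.874 g/cm^3, mean atomic mass ~55.845 g/mol).
      n_fe = atom_density(7.874, 55.845)
      print(f"Fe atom density: {n_fe:.5f} atoms/barn-cm")   # ~ 0.0849

    For a compound or enriched material the same formula is applied per isotope, weighting by the atomic or weight fractions given on the material card.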

  13. A method to determine site-specific, anisotropic fracture toughness in biological materials

    International Nuclear Information System (INIS)

    Bechtle, Sabine; Özcoban, Hüseyin; Yilmaz, Ezgi D.; Fett, Theo; Rizzi, Gabriele; Lilleodden, Erica T.; Huber, Norbert; Schreyer, Andreas; Swain, Michael V.; Schneider, Gerold A.

    2012-01-01

    Many biological materials are hierarchically structured, with highly anisotropic structures and properties on several length scales. To characterize the mechanical properties of such materials, detailed testing methods are required that allow precise and site-specific measurements on several length scales. We propose a fracture toughness measurement technique based on notched focused ion beam prepared cantilevers of lower and medium micron size scales. Using this approach, site-specific fracture toughness values in dental enamel were determined. The usefulness and challenges of the method are discussed.

  14. From Symbolic to Substantive Documents: When Business Codes of Ethics Impact Unethical Behavior in the Workplace

    OpenAIRE

    Kaptein, S.P.

    2009-01-01

    textabstractA business code of ethics is widely regarded as an important instrument to curb unethical behavior in the workplace. However, little is empirically known about the factors that determine the impact of a code on unethical behavior. Besides the existence of a code, this study proposes five determining factors: the content of the code, the frequency of communication activities surrounding the code, the quality of the communication activities, and the embedment of the code in the orga...

  15. In-core fuel management code package validation for BWRs

    International Nuclear Information System (INIS)

    1995-12-01

    The main goal of the present CRP (Coordinated Research Programme) was to develop benchmarks which are appropriate to check and improve the fuel management computer code packages and their procedures. Therefore, benchmark specifications were established which included a set of realistic data for running in-core fuel management codes. Secondly, the results of measurements and/or operating data were also provided to verify and compare with these parameters as calculated by the in-core fuel management codes or code packages. For the BWR it was established that the Mexican Laguna Verde 1 BWR would serve as the model for providing data on the benchmark specifications. It was decided to provide results for the first 2 cycles of Unit 1 of the Laguna Verde reactor. The analyses of the above benchmarks are performed in two stages. In the first stage, the lattice parameters are generated as a function of burnup at different voids and with and without control rod. These lattice parameters form the input for 3-dimensional diffusion theory codes for overall reactor analysis. The lattice calculations were performed using different methods, such as Monte Carlo, 2-D integral transport theory methods, a supercell model, and a transport-diffusion model with proper correction for burnable absorber. Thus the variety of results should provide adequate information for any institute or organization to develop competence to analyze in-core fuel management codes. 15 refs, figs and tabs
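
    The two-stage scheme described in this record (lattice parameters tabulated against burnup and void fraction, then supplied to a 3-D diffusion code) can be pictured with a small interpolation sketch; all numbers and the table layout below are invented for illustration and are not Laguna Verde benchmark data.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Illustrative lattice table: infinite multiplication factor k_inf tabulated
        # against burnup (GWd/t) and void fraction, control rods withdrawn.
        burnup = np.array([0.0, 5.0, 10.0, 15.0])
        void = np.array([0.0, 0.4, 0.7])
        k_inf = np.array([[1.10, 1.08, 1.05],
                          [1.07, 1.05, 1.02],
                          [1.03, 1.01, 0.99],
                          [0.99, 0.97, 0.95]])

        # A core simulator evaluates such a table node by node before each 3-D diffusion sweep.
        lattice = RegularGridInterpolator((burnup, void), k_inf)
        print(lattice([[7.5, 0.55]])[0])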

  16. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objective of the investigation was to (1) document publicly available computer codes, (2) assess code capabilities as determined from their user's manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. Solutions obtained with the computer codes for steady-state thermal analysis were in good agreement and the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences

  17. OSCAR-4 Code System Application to the SAFARI-1 Reactor

    International Nuclear Information System (INIS)

    Stander, Gerhardt; Prinsloo, Rian H.; Tomasevic, Djordje I.; Mueller, Erwin

    2008-01-01

    The OSCAR reactor calculation code system consists of a two-dimensional lattice code, the three-dimensional nodal core simulator code MGRAC and related service codes. The major difference between the new version of the OSCAR system, OSCAR-4, and its predecessor, OSCAR-3, is the new version of MGRAC which contains many new features and model enhancements. In this work some of the major improvements in the nodal diffusion solution method, history tracking, nuclide transmutation and cross section models are described. As part of the validation process of the OSCAR-4 code system (specifically the new MGRAC version), some of the new models are tested by comparing computational results to SAFARI-1 reactor plant data for a number of operational cycles and for varying applications. A specific application of the new features allows correct modeling of, amongst others, the movement of fuel-follower type control rods and dynamic in-core irradiation schedules. It is found that the effect of the improved control rod model, applied over multiple cycles of the SAFARI-1 reactor operation history, has a significant effect on in-cycle reactivity prediction and fuel depletion. (authors)

  18. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, M.D. [Sandia National Labs., Albuquerque, NM (United States); Khan, M.A. [IT Corp., Albuquerque, NM (United States)

    1996-04-01

    The Uranium Mill Tailings Remediation Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended.

  19. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    International Nuclear Information System (INIS)

    Tucker, M.D.; Khan, M.A.

    1996-04-01

    The Uranium Mill Tailings Remediation Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended

  20. High-Fidelity Coding with Correlated Neurons

    Science.gov (United States)

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  1. General Monte Carlo code MONK

    International Nuclear Information System (INIS)

    Moore, J.G.

    1974-01-01

    The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form, i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described: geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)

  2. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism and can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  3. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

    The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  4. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  5. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO-178B/ED-12B recommendation) automatic code generator. It transforms Simulink models into MISRA C code for safety-critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation-domain partners and led to the detection of requirement errors and a correct-by-construction implementation.

  6. Rey: a computer code for the determination of the radionuclides activities from the gamma-ray spectrum data

    International Nuclear Information System (INIS)

    Palomares, J.; Perez, A.; Travesi, A.

    1978-01-01

    The Fortran IV computer code REY (REsolution and Identification) has been developed for the automatic resolution of gamma-ray spectra from high-resolution Ge-Li detectors. The code searches for the full-energy peaks in the spectrum, takes the background as the baseline under each peak, and calculates the energy of the statistically significant peaks. The code also assigns each peak to the most probable isotope and selects the most likely radioisotopes present in the spectrum, according to the relative intensities of all the peaks in the whole spectrum. Finally, it obtains the activity, in microcuries, of each isotope, according to the geometry used in the measurement. Although the code is a general-purpose one, its current library of nuclear data is adapted to the analysis of liquid effluents from nuclear power plants. A computer with 16 K core memory and a hard disk is sufficient for this code. (author)

  7. Compilation of poultry and egg parameters for the PATHWAY code

    International Nuclear Information System (INIS)

    Ikenberry, T.A.

    1982-08-01

    The PATHWAY computer code was developed as a part of the foodchain pathway analysis task. The objective was to estimate radionuclide ingestion rates of residents of Lincoln (Nevada), Washington (Utah), and Iron (Utah) counties during the period 1951-1962, which resulted from explosion of nuclear devices at the Nevada Test Site (NTS). Estimation of radionuclide ingestion rates involves determination of radionuclide concentrations in dietary items as a function of time and geographic area, and consumption rates of such items as a function of age and lifestyle. Poultry and eggs may have been relatively significant dose contributors to humans, because of the fairly large consumption rates of these products, and because of potential radionuclide concentration in them. This paper describes nuclide-dependent and nuclide-independent parameters related to poultry products, and the determination of specific values of these parameters. 17 figs., 10 tabs

  8. Diglossia and Code Switching in Nigeria: Implications for English ...

    African Journals Online (AJOL)

    This paper discusses a sociolinguistic phenomenon, 'diglossia' and how it relates to code-switching in bilingual and multilingual societies like Nigeria. First, it examines the relevance of the social and linguistic contexts in determining the linguistic code that bilinguals and multilinguals use in various communicative ...

  9. The impact of three discharge coding methods on the accuracy of diagnostic coding and hospital reimbursement for inpatient medical care.

    Science.gov (United States)

    Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C

    2018-07-01

    Coding of diagnoses is important for patient care, hospital management and research. However, coding accuracy is often poor and may reflect the method of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary and secondary diagnoses, the Healthcare Resource Group (HRG) and the estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58%, respectively) or the discharge summary with the addition of medical input (70% vs 60%, respectively). The proportion of incorrect HRGs fell to 35% when coding was supported by medical input. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and the percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Implementation of the SAMPO computer code in the Cyber 170-750

    International Nuclear Information System (INIS)

    Chagas, E.F.; Liguori Neto, R.; Gomes, P.R.S.

    1985-01-01

    The code SAMPO, in the version available, incorporates algorithms that determine energy, efficiency and peak shape. The code also includes processing subroutines that provide automatic surveys of the peaks, listing all of their characteristics. The handling of the code has been improved and its analysing capacity in each region of the spectrum has been extended. Practical information regarding the use of the code is enclosed. The tests performed confirm the good performance of the SAMPO code on the IEAv Cyber system. (Author) [pt

  11. Development of a three-dimensional computer code for reconstructing power distributions by means of side reflector instrumentation and determination of the capabilities and limitations of this method

    International Nuclear Information System (INIS)

    Knob, P.J.

    1982-07-01

    This work is concerned with the detection of flux disturbances in pebble bed high temperature reactors by means of flux measurements in the side reflector. Included among the disturbances studied are xenon oscillations, rod group insertions, and individual rod insertions. Using the three-dimensional diffusion code CITATION, core calculations for both a very small reactor (KAHTER) and a large reactor (PNP-3000) were carried out to determine the neutron fluxes at the detector positions. These flux values were then used in flux mapping codes for reconstructing the flux distribution in the core. As an extension of the already existing two-dimensional MOFA code, which maps azimuthal disturbances, a new three-dimensional flux mapping code ZELT was developed for handling axial disturbances as well. It was found that both flux mapping programs give satisfactory results for small and large pebble bed reactors alike. (orig.) [de

  12. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    A new code structure for spectral amplitude coding optical code division multiple access systems based on the double-weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be obtained by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that the EDW code performs much better than the Hadamard and MFH codes.
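
    The ideal cross-correlation property referred to above is usually stated in terms of the in-phase cross-correlation between code words; a generic check is sketched below. The example sequences are invented for illustration and are not actual EDW code words.

        def cross_correlation(u, v):
            """In-phase cross-correlation of two binary (0/1) code words:
            the number of chip positions where both words carry a 1."""
            return sum(a & b for a, b in zip(u, v))

        # Illustrative weight-2 words whose pairwise cross-correlation is at most 1.
        w1 = [1, 1, 0, 0, 0, 0]
        w2 = [0, 1, 1, 0, 0, 0]
        print(cross_correlation(w1, w2))   # -> 1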

  13. Context based Coding of Quantized Alpha Planes for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2002-01-01

    In object based video, each frame is a composition of objects that are coded separately. The composition is performed through the alpha plane that represents the transparency of the object. We present an alternative to MPEG-4 for coding of alpha planes that considers their specific properties. Comparisons in terms of rate and distortion are provided, showing that the proposed coding scheme for still alpha planes is better than the algorithms for I-frames used in MPEG-4.

  14. Interface requirements for coupling a containment code to a reactor system thermal hydraulic codes

    International Nuclear Information System (INIS)

    Baratta, A.J.

    1997-01-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.
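
    The time-step-level data exchange described in the closing sentences can be pictured as an explicit coupling loop. The classes and method names below are hypothetical stand-ins, not the interfaces of RELAP, CONTAIN or any linkage code mentioned in the record.

        class PrimarySystem:
            """Stand-in for a system thermal-hydraulics code (the role RELAP plays);
            the physics here is a trivial placeholder."""
            def __init__(self, inventory=1000.0):
                self.inventory = inventory
            def advance(self, dt, back_pressure):
                # Break flow falls off as the containment back-pressure rises (placeholder model).
                m_dot = max(0.0, 50.0 - 0.1 * back_pressure)
                self.inventory -= m_dot * dt
                return m_dot, m_dot * 2000.0          # mass flow (kg/s), energy flow (kJ/s)

        class Containment:
            """Stand-in for a containment code (the role CONTAIN or CONTEMPT plays)."""
            def __init__(self, pressure=100.0):
                self.p = pressure                      # kPa
            def advance(self, dt, mass_flow, energy_flow):
                self.p += 1.0e-3 * energy_flow * dt    # placeholder pressurization
                return self.p

        def coupled_transient(primary, containment, t_end, dt):
            """Explicit coupling: boundary conditions are exchanged every time step,
            so containment feedback is seen by the primary-system calculation."""
            t, p_back = 0.0, containment.p
            while t < t_end:
                m_dot, q_dot = primary.advance(dt, back_pressure=p_back)
                p_back = containment.advance(dt, mass_flow=m_dot, energy_flow=q_dot)
                t += dt
            return p_back

        print(coupled_transient(PrimarySystem(), Containment(), t_end=10.0, dt=0.1))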

  15. Interface requirements for coupling a containment code to a reactor system thermal hydraulic codes

    Energy Technology Data Exchange (ETDEWEB)

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.

  16. Comparison of the calculations of the stability properties of a specific stellarator equilibrium with different MHD stability codes

    International Nuclear Information System (INIS)

    Nakamura, Y.; Matsumoto, T.; Wakatani, M.; Ichiguchi, K.; Garcia, L.; Carreras, B.A.

    1995-04-01

    A particular configuration of the LHD stellarator with an unusually flat pressure profile has been chosen as a test case for comparing the MHD stability property predictions of different three-dimensional and averaged codes, for the purpose of code comparison and validation. In particular, two relatively localized instabilities, the fastest growing modes with toroidal mode numbers n = 2 and n = 3, were studied using several different codes; the good agreement found provides justification for the use of any of them for equilibria of the type considered.

  17. Validity of the Italian Code of Ethics for everyday nursing practice.

    Science.gov (United States)

    Gobbi, Paola; Castoldi, Maria Grazia; Alagna, Rosa Anna; Brunoldi, Anna; Pari, Chiara; Gallo, Annamaria; Magri, Miriam; Marioni, Lorena; Muttillo, Giovanni; Passoni, Claudia; Torre, Anna La; Rosa, Debora; Carnevale, Franco A

    2016-12-07

    The research question for this study was as follows: Is the Code of Ethics for Nurses in Italy (Code) a valid or useful decision-making instrument for nurses faced with ethical problems in their daily clinical practice? Focus groups were conducted to analyze specific ethical problems through 11 case studies. The analysis was conducted using sections of the Code as well as other relevant documents. Each focus group had a specific theme and nurses participated freely in the discussions according to their respective clinical competencies. The executive administrative committee of the local nursing licensing council provided approval for conducting this project. Measures were taken to protect the confidentiality of consenting participants. The answer to the research question posed for this investigation was predominantly positive. Many sections of the Code were useful for discussion and identifying possible solutions for the ethical problems presented in the 11 cases. We concluded that the Code of Ethics for Nurses in Italy can be a valuable aid in daily practice in most clinical situations that can give rise to ethical problems. © The Author(s) 2016.

  18. Nuclear model codes and related software distributed by the OECD/NEA Data Bank

    International Nuclear Information System (INIS)

    Sartori, E.

    1993-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article covers more specifically the availability of nuclear model codes and also those codes which further process their results into data sets needed for specific nuclear application projects. (author). 2 figs.

  19. OECD International Standard Problem number 34. Falcon code comparison report

    International Nuclear Information System (INIS)

    Williams, D.A.

    1994-12-01

    ISP-34 is the first ISP to address fission product transport issues and has been strongly supported by a large number of different countries and organisations. The ISP is based on two experiments, FAL-ISP-1 and FAL-ISP-2, which were conducted in AEA's Falcon facility. Specific features of the experiments include quantification of chemical effects and aerosol behaviour. In particular, multi-component aerosol effects and vapour-aerosol interactions can all be investigated in the Falcon facility. Important parameters for participants to predict were the deposition profiles and composition, key chemical species and reactions, evolution of suspended material concentrations, and the effects of steam condensation onto aerosols and particle hygroscopicity. The results of the Falcon ISP support the belief that aerosol physics is generally well modelled in primary circuit codes, but the chemistry models in many of the codes need to be improved, since chemical speciation is one of the main factors which controls transport and deposition behaviour. The importance of chemical speciation, aerosol nucleation, and the role of multi-component aerosols in determining transport and deposition behaviour are evident. The role of re-vaporization in these Falcon experiments is not clear; it is not possible to compare those codes which predicted re-vaporization with quantitative data. The evidence from this ISP exercise indicates that the containment codes can predict thermal-hydraulic conditions satisfactorily. However, the differences in the predicted aerosol locations in the Falcon tests showed that aerosol behaviour was very susceptible to parameters such as particle size distribution.

  20. Determination of the detection efficiency of a HPGe detector by means of the MCNP 4A simulation code

    International Nuclear Information System (INIS)

    Leal, B.

    2004-01-01

    In the majority of laboratories, the efficiency calibration of the detector is carried out by measuring standard gamma-ray sources of known activity, or matrices containing a variety of radionuclides covering the energy range of interest. Given the experimental importance of determining the efficiency curves in order to establish quantitative results, the response function of the detector used at the Regional Center of Nuclear Studies was simulated over the energy range from 80 keV to 1400 keV, varying the density of the matrix, by means of the Monte Carlo code MCNP-4A. The fit obtained shows acceptable agreement in the range of 100 to 600 keV, with a percentage discrepancy smaller than 5%. (Author)

  1. Methods of evaluating the effects of coding on SAR data

    Science.gov (United States)

    Dutkiewicz, Melanie; Cumming, Ian

    1993-01-01

    It is recognized that mean square error (MSE) is not a sufficient criterion for determining the acceptability of an image reconstructed from data that has been compressed and decompressed using an encoding algorithm. In the case of Synthetic Aperture Radar (SAR) data, it is also deemed insufficient to display the reconstructed image (and perhaps the error image) alongside the original and make a (subjective) judgment as to the quality of the reconstructed data. In this paper we suggest a number of additional evaluation criteria which we feel should be included as evaluation metrics in SAR data encoding experiments. These criteria have been specifically chosen to provide a means of ensuring that the important information in the SAR data is preserved. The paper also presents the results of an investigation into the effects of coding on SAR data fidelity when the coding is applied in (1) the signal data domain, and (2) the image domain. An analysis of the results highlights the shortcomings of the MSE criterion, and shows which of the suggested additional criteria have been found to be most important.
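
    As a reminder of why a single global figure can hide structured artefacts, the sketch below computes the usual MSE together with a block-wise error map; it is a generic illustration, not one of the additional criteria proposed in the paper, and the inputs are assumed to be co-registered original and reconstructed SAR images.

        import numpy as np

        def mse(original, reconstructed):
            """Global mean square error: one number that can stay small even when
            coding errors are concentrated in radiometrically important regions."""
            return float(np.mean((original - reconstructed) ** 2))

        def blockwise_mse(original, reconstructed, block=32):
            """Map of local MSE values; the spread of this map is one simple extra
            check for spatially structured errors that the global MSE averages out."""
            h, w = original.shape
            rows, cols = h // block, w // block
            out = np.empty((rows, cols))
            for i in range(rows):
                for j in range(cols):
                    win = (slice(i * block, (i + 1) * block),
                           slice(j * block, (j + 1) * block))
                    out[i, j] = np.mean((original[win] - reconstructed[win]) ** 2)
            return out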

  2. A Perceptual Model for Sinusoidal Audio Coding Based on Spectral Integration

    NARCIS (Netherlands)

    Van de Par, S.; Kohlrausch, A.; Heusdens, R.; Jensen, J.; Holdt Jensen, S.

    2005-01-01

    Psychoacoustical models have been used extensively within audio coding applications over the past decades. Recently, parametric coding techniques have been applied to general audio and this has created the need for a psychoacoustical model that is specifically suited for sinusoidal modelling of

  3. A perceptual model for sinusoidal audio coding based on spectral integration

    NARCIS (Netherlands)

    Van de Par, S.; Kohlrauch, A.; Heusdens, R.; Jensen, J.; Jensen, S.H.

    2005-01-01

    Psychoacoustical models have been used extensively within audio coding applications over the past decades. Recently, parametric coding techniques have been applied to general audio and this has created the need for a psychoacoustical model that is specifically suited for sinusoidal modelling of

  4. Code of ethics and conduct for European nursing.

    Science.gov (United States)

    Sasso, Loredana; Stievano, Alessandro; González Jurado, Máximo; Rocco, Gennaro

    2008-11-01

    A main identifying factor of professions is professionals' willingness to comply with ethical and professional standards, often defined in a code of ethics and conduct. In a period of intense nursing mobility, if the public are aware that health professionals have committed themselves to the drawing up of a code of ethics and conduct, they will have more trust in the health professional they choose, especially if this person comes from another European Member State. The Code of Ethics and Conduct for European Nursing is a programmatic document for the nursing profession constructed by the FEPI (European Federation of Nursing Regulators) according to Directive 2005/36/EC on the recognition of professional qualifications and Directive 2006/123/EC on services in the internal market, set out by the European Commission. This article describes the construction of the Code and gives an overview of some specific areas of importance. The main text of the Code is reproduced in Appendix 1.

  5. Pre-processing of input files for the AZTRAN code

    International Nuclear Information System (INIS)

    Vargas E, S.; Ibarra, G.

    2017-09-01

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Because of the complexity of generating an input file for the code, a script based on the D language was developed to make its preparation easier. It uses a new input file format with specific cards divided into two blocks, mandatory cards and optional cards, and includes a pre-processing of the input file to identify possible errors within it, as well as an image generator for the specific problem based on the Python interpreter. (Author)
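
    A pre-processor of the kind described (mandatory cards, optional cards, early error reporting) can be sketched as follows. The card names are invented for illustration and are not the actual AZTRAN card set, and the sketch is in Python rather than the D language used by the real script.

        MANDATORY_CARDS = {"GEOMETRY", "MATERIALS", "SOURCE"}   # hypothetical names
        OPTIONAL_CARDS = {"OUTPUT", "PLOT", "CONVERGENCE"}      # hypothetical names

        def preprocess(lines):
            """Collect cards from an input deck and report missing or unknown ones
            before the solver is ever launched."""
            seen = {ln.split()[0].upper()
                    for ln in lines
                    if ln.strip() and not ln.lstrip().startswith("#")}
            errors = []
            for card in sorted(MANDATORY_CARDS - seen):
                errors.append(f"missing mandatory card: {card}")
            for card in sorted(seen - MANDATORY_CARDS - OPTIONAL_CARDS):
                errors.append(f"unknown card: {card}")
            return errors

        deck = ["GEOMETRY slab 10.0", "MATERIALS fuel 1", "PLOT flux"]
        print(preprocess(deck))   # -> ['missing mandatory card: SOURCE']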

  6. Testing efficiency transfer codes for equivalence

    International Nuclear Information System (INIS)

    Vidmar, T.; Celik, N.; Cornejo Diaz, N.; Dlabac, A.; Ewa, I.O.B.; Carrazana Gonzalez, J.A.; Hult, M.; Jovanovic, S.; Lepy, M.-C.; Mihaljevic, N.; Sima, O.; Tzika, F.; Jurado Vargas, M.; Vasilopoulou, T.; Vidmar, G.

    2010-01-01

    Four general Monte Carlo codes (GEANT3, PENELOPE, MCNP and EGS4) and five dedicated packages for efficiency determination in gamma-ray spectrometry (ANGLE, DETEFF, GESPECOR, ETNA and EFFTRAN) were checked for equivalence by applying them to the calculation of efficiency transfer (ET) factors for a set of well-defined sample parameters, detector parameters and energies typically encountered in environmental radioactivity measurements. The differences between the results of the different codes never exceeded a few percent and were lower than 2% in the majority of cases.
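
    The efficiency transfer factors compared in the study follow the usual scheme in which a measured reference efficiency is scaled by the ratio of computed efficiencies for the actual and reference geometries; a minimal sketch of that relation is given below, with generic names that do not correspond to the API of any of the codes listed.

        def transferred_efficiency(eff_ref_measured, eff_calc_sample, eff_calc_ref):
            """Efficiency transfer: eff_sample = eff_ref_measured * (eff_calc_sample / eff_calc_ref).
            The ratio eff_calc_sample / eff_calc_ref is the ET factor compared between the codes."""
            return eff_ref_measured * (eff_calc_sample / eff_calc_ref)

        # Illustrative numbers only: a measured reference efficiency and two computed efficiencies.
        print(transferred_efficiency(0.0215, 0.0198, 0.0224))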

  7. Safety, codes and standards for hydrogen installations. Metrics development and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Aaron P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dedrick, Daniel E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); San Marchi, Christopher W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-04-01

    Automakers and fuel providers have made public commitments to commercialize light duty fuel cell electric vehicles and fueling infrastructure in select US regions beginning in 2014. The development, implementation, and advancement of meaningful codes and standards is critical to enable the effective deployment of clean and efficient fuel cell and hydrogen solutions in the energy technology marketplace. Metrics pertaining to the development and implementation of safety knowledge, codes, and standards are important to communicate progress and inform future R&D investments. This document describes the development and benchmarking of metrics specific to the development of hydrogen specific codes relevant for hydrogen refueling stations. These metrics will be most useful as the hydrogen fuel market transitions from pre-commercial to early-commercial phases. The target regions in California will serve as benchmarking case studies to quantify the success of past investments in research and development supporting safety codes and standards R&D.

  8. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  9. Rapid analysis method for the determination of 14C specific activity in irradiated graphite.

    Science.gov (United States)

    Remeikis, Vidmantas; Lagzdina, Elena; Garbaras, Andrius; Gudelis, Arūnas; Garankin, Jevgenij; Plukienė, Rita; Juodis, Laurynas; Duškesas, Grigorijus; Lingis, Danielius; Abdulajev, Vladimir; Plukis, Artūras

    2018-01-01

    14C is one of the limiting radionuclides used in the categorization of radioactive graphite waste; this categorization is crucial in selecting the appropriate graphite treatment/disposal method. We propose a rapid analysis method for 14C specific activity determination in small graphite samples in the 1-100 μg range. The method applies an oxidation procedure to the sample, which extracts 14C from the different carbonaceous matrices in a controlled manner. Because this method enables fast online measurement and 14C specific activity evaluation, it can be especially useful for characterizing 14C in irradiated graphite when dismantling graphite moderator and reflector parts, or when sorting radioactive graphite waste from decommissioned nuclear power plants. The proposed rapid method is based on graphite combustion and the subsequent measurement of both CO2 and 14C, using a commercial elemental analyser and the semiconductor detector, respectively. The method was verified using the liquid scintillation counting (LSC) technique. The uncertainty of this rapid method is within the acceptable range for radioactive waste characterization purposes. The 14C specific activity determination procedure proposed in this study takes approximately ten minutes, comparing favorably to the more complicated and time consuming LSC method. This method can be potentially used to radiologically characterize radioactive waste or used in biomedical applications when dealing with the specific activity determination of 14C in the sample.

  10. Rapid analysis method for the determination of 14C specific activity in irradiated graphite.

    Directory of Open Access Journals (Sweden)

    Vidmantas Remeikis

    14C is one of the limiting radionuclides used in the categorization of radioactive graphite waste; this categorization is crucial in selecting the appropriate graphite treatment/disposal method. We propose a rapid analysis method for 14C specific activity determination in small graphite samples in the 1-100 μg range. The method applies an oxidation procedure to the sample, which extracts 14C from the different carbonaceous matrices in a controlled manner. Because this method enables fast online measurement and 14C specific activity evaluation, it can be especially useful for characterizing 14C in irradiated graphite when dismantling graphite moderator and reflector parts, or when sorting radioactive graphite waste from decommissioned nuclear power plants. The proposed rapid method is based on graphite combustion and the subsequent measurement of both CO2 and 14C, using a commercial elemental analyser and the semiconductor detector, respectively. The method was verified using the liquid scintillation counting (LSC) technique. The uncertainty of this rapid method is within the acceptable range for radioactive waste characterization purposes. The 14C specific activity determination procedure proposed in this study takes approximately ten minutes, comparing favorably to the more complicated and time consuming LSC method. This method can be potentially used to radiologically characterize radioactive waste or used in biomedical applications when dealing with the specific activity determination of 14C in the sample.

  11. Benchmarking studies for the DESCARTES and CIDER codes

    International Nuclear Information System (INIS)

    Eslinger, P.W.; Ouderkirk, S.J.; Nichols, W.E.

    1993-01-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the airborne release, transport, and environmental accumulation of radionuclides resulting from Hanford operations from 1944 through 1972. In order to calculate the dose of radiation a person may have received in any given location, the geographic area addressed by the HEDR Project will be divided into a grid. The grid size suggested by the draft requirements contains 2091 units called nodes. Two of the codes being developed are DESCARTES and CIDER. The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways from the output of the air transport code RATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. The requirements that Battelle (BNW) set for these two codes were released to the HEDR Technical Steering Panel (TSP) in a draft document on November 10, 1992. This document reports on the preliminary work performed by the code development team to determine if the requirements could be met.

  12. You know the Science. Do you know your Code?

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk is about automated code analysis and transformation tools to support scientific computing. Code bases are difficult to manage because of size, age, or safety requirements. Tools can help scientists and IT engineers understand their code, locate problems, improve quality. Tools can also help transform the code, by implementing complex refactorings, replatforming, or migration to a modern language. Such tools are themselves difficult to build. This talk describes DMS, a meta-tool for building software analysis tools. DMS is a kind of generalized compiler, and can be configured to process arbitrary programming languages, to carry out arbitrary analyses, and to convert specifications into running code. It has been used for a variety of purposes, including converting embedded mission software in the US B-2 Stealth Bomber, providing the US Social Security Administration with a deep view of how their 200 million lines of COBOL are connected, and reverse-engineering legacy factory process control code i...

  13. Coded Cooperation for Multiway Relaying in Wireless Sensor Networks.

    Science.gov (United States)

    Si, Zhongwei; Ma, Junyang; Thobaben, Ragnar

    2015-06-29

    Wireless sensor networks have been considered as an enabling technology for constructing smart cities. One important feature of wireless sensor networks is that the sensor nodes collaborate in some manner for communications. In this manuscript, we focus on the model of multiway relaying with full data exchange where each user wants to transmit and receive data to and from all other users in the network. We derive the capacity region for this specific model and propose a coding strategy through coset encoding. To obtain good performance with practical codes, we choose spatially-coupled LDPC (SC-LDPC) codes for the coded cooperation. In particular, for the message broadcasting from the relay, we construct multi-edge-type (MET) SC-LDPC codes by repeatedly applying coset encoding. Due to the capacity-achieving property of the SC-LDPC codes, we prove that the capacity region can theoretically be achieved by the proposed MET SC-LDPC codes. Numerical results with finite node degrees are provided, which show that the achievable rates approach the boundary of the capacity region in both binary erasure channels and additive white Gaussian channels.

  14. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. The subject of regulation of new building construction to assure energy conservation is recognized as one in which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state-of-the-art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy efficient standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  15. Electrical safety code manual: a plain language guide to National Electrical Code, OSHA and NFPA 70E

    CERN Document Server

    Keller, Kimberley

    2010-01-01

    Safety in any workplace is extremely important. In the case of the electrical industry, safety is critical and the codes and regulations which determine safe practices are both diverse and complicated. Employers, electricians, electrical system designers, inspectors, engineers and architects must comply with safety standards listed in the National Electrical Code, OSHA and NFPA 70E. Unfortunately, the publications which list these safety requirements are written in very technically advanced terms and the average person has an extremely difficult time understanding exactly what they need to

  16. Production of analysis code for 'JOYO' dosimetry experiment

    International Nuclear Information System (INIS)

    Sasaki, Makoto; Nakazawa, Masaharu.

    1981-01-01

    As part of the measurement and analysis plan for the Dosimetry Experiment at the "JOYO" experimental fast reactor, analysis of the neutron flux spectra is performed using the NEUPAC (Neutron Unfolding Code Package) computer program. The code calculates the neutron flux spectra and other integral quantities from the activation data of the dosimeter foils. The NEUPAC code is based on the J1-type unfolding method, and the estimated neutron flux spectra are obtained as its solution. The program is able to determine the integral quantities and their sensitivities, together with an error estimate of the unfolded spectra and integral quantities. The code also performs a chi-square test of the input/output data, and contains many options for the calculational routines. This report presents the analytic theory, the program algorithms, and a description of the functions and use of the NEUPAC code. (author)
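
    The J1-type algorithm itself is not reproduced in the abstract, but the underlying idea of adjusting a trial spectrum so that computed activation rates match the dosimeter data can be illustrated with a generic regularized least-squares sketch; this illustrates spectrum unfolding in general, not NEUPAC's method.

        import numpy as np

        def unfold_spectrum(response, activities, phi_prior, prior_weight=1.0):
            """Generic regularized least-squares unfolding (illustration only):
            minimize |R*phi - a|^2 + w*|phi - phi_prior|^2 for the group fluxes phi."""
            R = np.asarray(response, dtype=float)    # (n_foils, n_groups) response matrix
            a = np.asarray(activities, dtype=float)  # measured reaction rates per foil
            p = np.asarray(phi_prior, dtype=float)   # trial (prior) group spectrum
            n = R.shape[1]
            lhs = R.T @ R + prior_weight * np.eye(n)
            rhs = R.T @ a + prior_weight * p
            return np.linalg.solve(lhs, rhs)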

  17. Synchronous spikes are necessary but not sufficient for a synchrony code in populations of spiking neurons.

    Science.gov (United States)

    Grewe, Jan; Kruscha, Alexandra; Lindner, Benjamin; Benda, Jan

    2017-03-07

    Synchronous activity in populations of neurons potentially encodes special stimulus features. Selective readout of either synchronous or asynchronous activity allows formation of two streams of information processing. Theoretical work predicts that such a synchrony code is a fundamental feature of populations of spiking neurons if they operate in specific noise and stimulus regimes. Here we experimentally test the theoretical predictions by quantifying and comparing neuronal response properties in tuberous and ampullary electroreceptor afferents of the weakly electric fish Apteronotus leptorhynchus. These related systems show similar levels of synchronous activity, but only in the more irregularly firing tuberous afferents is a synchrony code established, whereas in the more regularly firing ampullary afferents it is not. The mere existence of synchronous activity is thus not sufficient for a synchrony code. Single-cell features such as the irregularity of spiking and the frequency dependence of the neuron's transfer function determine whether synchronous spikes possess a distinct meaning for the encoding of time-dependent signals.

  18. On locality of Generalized Reed-Muller codes over the broadcast erasure channel

    KAUST Repository

    Alloum, Amira; Lin, Sian Jheng; Al-Naffouri, Tareq Y.

    2016-01-01

    , and more specifically at the application layer where Rateless, LDPC, Reed-Solomon codes and network coding schemes have been extensively studied, optimized and standardized in the past. Beyond reusing, extending or adapting existing application layer packet

  19. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien; Ruehl, Kelley; Roy, Andre; Costello, Ronan; Laporte Weywada, Pauline; Bailey, Helen

    2017-01-01

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  20. Coding of obesity in administrative hospital discharge abstract data: accuracy and impact for future research studies.

    Science.gov (United States)

    Martin, Billie-Jean; Chen, Guanmin; Graham, Michelle; Quan, Hude

    2014-02-13

    Obesity is a pervasive problem and a popular subject of academic assessment. The ability to take advantage of existing data, such as administrative databases, to study obesity is appealing. The objective of our study was to assess the validity of obesity coding in an administrative database and compare the association between obesity and outcomes in an administrative database versus a registry. This study was conducted using a coronary catheterization registry and an administrative database (Discharge Abstract Database (DAD)). A Body Mass Index (BMI) ≥30 kg/m2 within the registry defined obesity. In the DAD, obesity was defined by diagnosis codes E65-E68 (ICD-10). The sensitivity, specificity, negative predictive value (NPV) and positive predictive value (PPV) of an obesity diagnosis in the DAD were determined using obesity diagnosis in the registry as the referent. The association between obesity and outcomes was assessed. The study population of 17380 subjects was largely male (68.8%) with a mean BMI of 27.0 kg/m2. Obesity prevalence was lower in the DAD than in the registry (2.4% vs. 20.3%). A diagnosis of obesity in the DAD had a sensitivity of 7.75%, specificity of 98.98%, NPV of 80.84% and PPV of 65.94%. Obesity was associated with a decreased risk of death or re-hospitalization, though non-significantly within the DAD. Obesity was significantly associated with an increased risk of a cardiac procedure in both databases. Overall, obesity was poorly coded in the DAD. However, when coded, it was coded accurately. Administrative databases are not an optimal data source for obesity prevalence and incidence surveillance but could be used to define obese cohorts for follow-up.
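
    The validity figures quoted above are the standard two-by-two-table quantities; the sketch below shows how they are defined, using hypothetical cell counts chosen only so that the four reported percentages are approximately reproduced (they are not the study's actual counts).

        def validity(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV and NPV of a diagnosis code, taking the
            registry definition (BMI >= 30 kg/m2) as the reference standard."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical counts (total 17380) that roughly reproduce the reported figures.
        print(validity(tp=273, fp=141, fn=3255, tn=13711))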

  1. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-03-13

    In this work we have developed a restructuring software tool (RT-CUDA) following the proposed optimization specifications to bridge the gap between high-level languages and the machine-dependent CUDA environment. RT-CUDA takes a C program and converts it into an optimized CUDA kernel, with user directives in a configuration file guiding the compiler. RT-CUDA also allows transparent invocation of the most optimized external math libraries, such as cuSPARSE and cuBLAS, enabling efficient design of linear algebra solvers. We expect RT-CUDA to be needed by many KSA industries dealing with science and engineering simulation on massively parallel computers like NVIDIA GPUs.

  2. Adaptive format conversion for scalable video coding

    Science.gov (United States)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can also be used in addition to residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application that AFC is well-suited for is the migration path for digital television where AFC can provide immediate video scalability as well as assist future migrations.

  3. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs

    Directory of Open Access Journals (Sweden)

    Yu-yan Yu

    2018-04-01

    Full Text Available A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on gold nanoparticles (NP) modified with DNA and polyclonal antibodies, and magnetic microparticles (MMP) modified with monoclonal antibodies, with subsequent detection of the amplified target in the form of a bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code technique for detecting AFB1 was established, and the sensitivity limit was about 10−8 ng/mL, comparable to ELISA assays for detecting the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Keywords: Aflatoxin B1, Bio-bar code assay, Chinese herbs, Magnetic microparticle probes, Nanoparticle probes

  4. The 2002 Revision of the American Psychological Association's Ethics Code: Implications for School Psychologists

    Science.gov (United States)

    Flanagan, Rosemary; Miller, Jeffrey A.; Jacob, Susan

    2005-01-01

    The Ethical Principles for Psychologists and Code of Conduct has been recently revised. The organization of the code changed, and the language was made more specific. A number of points relevant to school psychology are explicitly stated in the code. A clear advantage of including these items in the code is the assistance to school psychologists…

  5. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industry, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents in the two countries

  6. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industry, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents in the two countries.

  7. A symbiotic liaison between the genetic and epigenetic code

    Directory of Open Access Journals (Sweden)

    Holger Heyn

    2014-05-01

    Full Text Available With rapid advances in sequencing technologies, we are undergoing a paradigm shift from hypothesis- to data-driven research. Genome-wide profiling efforts gave informative insights into biological processes; however, considering the wealth of variation, the major challenge remains their meaningful interpretation. In particular, sequence variation in non-coding contexts is often challenging to interpret. Here, data integration approaches for the identification of functional genetic variability represent a likely solution. For example, functional linkage analysis integrating genotype and expression data determined regulatory quantitative trait loci (QTL) and proposed causal relationships. In addition to gene expression, epigenetic regulation, and specifically DNA methylation, was established as a highly valuable surrogate mark for functional variance of the genetic code. Epigenetic modification served as a powerful mediator trait to elucidate mechanisms forming phenotypes in health and disease. Notably, integrative studies of genetic and DNA methylation data have guided interpretation strategies for risk genotypes, but also proved their value for physiological traits, such as natural human variation and aging. This Perspective seeks to illustrate the power of data integration in the genomic era, exemplified by DNA methylation quantitative trait loci (meQTLs). However, the model is further extendable to virtually all traceable molecular traits.

  8. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; Trubnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of spectrometer differential nonlinearity to ±0.7% over 98% of the measured range. To convert the continuous code corresponding to the input signal amplitude into the Grey code, the converter exploits the regular alternation of ones and zeros in each bit of the Grey code as the number of pulses of the continuous code changes continuously. The converter is built from elements of the 155 series; the pulse rate of the continuous code at the converter input is 25 MHz
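
    In software, the binary-reflected Grey (Gray) code mapping that such a hardware converter implements reduces to a shift and an exclusive-or. The following is a minimal sketch of the mapping and its inverse; it illustrates the code conversion only and is not a model of the 155-series hardware described above:

      def binary_to_gray(n: int) -> int:
          """Binary-reflected Gray code of a non-negative integer."""
          return n ^ (n >> 1)

      def gray_to_binary(g: int) -> int:
          """Inverse mapping: successively fold the shifted value back in."""
          n = g
          while g > 0:
              g >>= 1
              n ^= g
          return n

      # Successive Gray codewords differ in exactly one bit, e.g. over a 12-bit range:
      codes = [binary_to_gray(i) for i in range(2 ** 12)]
      assert all(bin(a ^ b).count("1") == 1 for a, b in zip(codes, codes[1:]))
      assert all(gray_to_binary(c) == i for i, c in enumerate(codes))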

  9. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    Science.gov (United States)

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on gold nanoparticles (NP) modified with DNA and polyclonal antibodies, and magnetic microparticles (MMP) modified with monoclonal antibodies, with subsequent detection of the amplified target in the form of a bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code technique for detecting AFB1 was established, and the sensitivity limit was about 10⁻⁸ ng/mL, comparable to ELISA assays for detecting the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.

  10. Adapting the coping in deliberation (CODE) framework: a multi-method approach in the context of familial ovarian cancer risk management.

    Science.gov (United States)

    Witt, Jana; Elwyn, Glyn; Wood, Fiona; Rogers, Mark T; Menon, Usha; Brain, Kate

    2014-11-01

    To test whether the coping in deliberation (CODE) framework can be adapted to a specific preference-sensitive medical decision: risk-reducing bilateral salpingo-oophorectomy (RRSO) in women at increased risk of ovarian cancer. We performed a systematic literature search to identify issues important to women during deliberations about RRSO. Three focus groups with patients (most were pre-menopausal and untested for genetic mutations) and 11 interviews with health professionals were conducted to determine which issues mattered in the UK context. Data were used to adapt the generic CODE framework. The literature search yielded 49 relevant studies, which highlighted various issues and coping options important during deliberations, including mutation status, risks of surgery, family obligations, physician recommendation, peer support and reliable information sources. Consultations with UK stakeholders confirmed most of these factors as pertinent influences on deliberations. Questions in the generic framework were adapted to reflect the issues and coping options identified. The generic CODE framework was readily adapted to a specific preference-sensitive medical decision, showing that deliberations and coping are linked during deliberations about RRSO. Adapted versions of the CODE framework may be used to develop tailored decision support methods and materials in order to improve patient-centred care. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Manual for IRS Coding. Joint IAEA/NEA International Reporting System for Operating Experience

    International Nuclear Information System (INIS)

    2011-01-01

    The International Reporting System for Operating Experience (IRS) is jointly operated by the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). In early 2010, the IAEA and OECD/NEA jointly issued the IRS Guidelines, which described the reporting system and process and gave users the necessary elements to enable them to produce IRS reports to a high standard of quality while retaining the effectiveness of the system expected by all Member States operating nuclear power plants. The purpose of the present Manual for IRS Coding is to provide supplementary guidance specifically on the coding element of IRS reports to ensure uniform coding of events that are reported through IRS. This Coding Manual does not supersede the IRS Guidelines, but rather, supports users and preparers in achieving a consistent and high level of quality in their IRS reports. Consistency and high quality in the IRS reports allow stakeholders to search and retrieve specific event information with ease. In addition, well-structured reports also enhance the efficient management of the IRS database. This Coding Manual will give specific guidance on the application of each section of the IRS codes, with examples where necessary, of when and how these codes are to be applied. As this reporting system is owned by the Member States, this manual has been developed and approved by the IRS National Coordinators with the assistance of the IAEA and NEA secretariats

  12. Governing sexual behaviour through humanitarian codes of conduct.

    Science.gov (United States)

    Matti, Stephanie

    2015-10-01

    Since 2001, there has been a growing consensus that sexual exploitation and abuse of intended beneficiaries by humanitarian workers is a real and widespread problem that requires governance. Codes of conduct have been promoted as a key mechanism for governing the sexual behaviour of humanitarian workers and, ultimately, preventing sexual exploitation and abuse (PSEA). This article presents a systematic study of PSEA codes of conduct adopted by humanitarian non-governmental organisations (NGOs) and how they govern the sexual behaviour of humanitarian workers. It draws on Foucault's analytics of governance and speech act theory to examine the findings of a survey of references to codes of conduct made on the websites of 100 humanitarian NGOs, and to analyse some features of the organisation-specific PSEA codes identified. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  13. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 mainframes as well as on a MicroVAX 3800 system. In order to obtain a plant-specific source term, data on CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of the high pressure core spray and reactor core isolation cooling systems. The probability of that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  14. Disadvantages of legal engineering of the disclosure of certain norms of the Budget Code of Ukraine

    Directory of Open Access Journals (Sweden)

    Світлана Миколаївна Клімова

    2017-09-01

    the Budget Code of Ukraine states as follows: "The total amount of financial resources for each type of intergovernmental transfers specified in paragraphs 6-8 of part one of Article 97 of this Code shall be calculated on the basis of state social standards and norms established by law and other normative legal acts". The ways (techniques) of presenting this rule of law can be characterized as follows: by the degree of abstraction – casuistic; by the degree of completeness of presentation – blanket; by the form of the sentence – a narrative sentence. This clause of Article 94 of the BCU cannot be implemented by public administration bodies without the adoption of another, special law or a subordinate normative legal act. Secondly, the integrity of the public finance management process is impossible without systematic and qualitative control. Article 113 of the BCU determines the powers of the state financial control bodies to control compliance with budget legislation. The control functions of the public administration bodies of Ukraine should be stated as follows: 1) the Budget Code of Ukraine does not specify particular directions and objects of control, but rather the system, principles and tasks of public financial control (in Article 113); 2) the system, forms, methods and types of public financial control should be determined in a special law; 3) specific rules for carrying out control activities by a particular public administration body in relation to a particular object should be established by subordinate acts. Thirdly, the norms of the Budget Code of Ukraine on legal liability for violation of budget legislation are set out in a non-systematic way, in violation of the rules of legal technique. Conclusions of the research. 1. Legal engineering covers knowledge of the rules of law and the process of creating various legal documents: outlining the content of written regulations, preparing legal documents, legal analysis of the expression of legal rights and duties of

  15. Edge-preserving Intra Depth Coding based on Context-coding and H.264/AVC

    DEFF Research Database (Denmark)

    Zamarin, Marco; Salmistraro, Matteo; Forchhammer, Søren

    2013-01-01

    Depth map coding plays a crucial role in 3D Video communication systems based on the “Multi-view Video plus Depth” representation as view synthesis performance is strongly affected by the accuracy of depth information, especially at edges in the depth map image. In this paper, an efficient algorithm … for edge-preserving intra depth compression based on H.264/AVC is presented. The proposed method introduces a new Intra mode specifically targeted to depth macroblocks with arbitrarily shaped edges, which are typically not efficiently represented by DCT. Edge macroblocks are partitioned into two regions … each approximated by a flat surface. Edge information is encoded by means of context-coding with an adaptive template. As a novel element, the proposed method allows exploiting the edge structure of previously encoded edge macroblocks during the context-coding step to further increase compression…
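
    The core idea of approximating the two regions of an edge macroblock by flat surfaces can be illustrated with a small numerical sketch. This is a generic illustration of the principle only (a 16x16 block split by a given edge mask, each side replaced by its mean depth); it is not the H.264/AVC-integrated coder or the context-coding of edge information described in the paper:

      import numpy as np

      def flat_surface_approximation(block: np.ndarray, edge_mask: np.ndarray) -> np.ndarray:
          """Replace each of the two regions of a depth block by its mean value."""
          approx = np.empty_like(block, dtype=float)
          approx[edge_mask] = block[edge_mask].mean()
          approx[~edge_mask] = block[~edge_mask].mean()
          return approx

      # Hypothetical 16x16 depth macroblock with a vertical edge at column 8
      rng = np.random.default_rng(0)
      depth = np.full((16, 16), 50.0)
      depth[:, 8:] = 200.0
      depth += rng.normal(0.0, 2.0, depth.shape)          # mild sensor noise
      mask = np.zeros((16, 16), dtype=bool)
      mask[:, 8:] = True
      residual = depth - flat_surface_approximation(depth, mask)
      print(f"max residual after flat-surface approximation: {np.abs(residual).max():.2f}")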

  16. Regional and temporal variations in coding of hospital diagnoses referring to upper gastrointestinal and oesophageal bleeding in Germany

    Directory of Open Access Journals (Sweden)

    Garbe Edeltraut

    2011-08-01

    Full Text Available Abstract Background Health insurance claims data are increasingly used for health services research in Germany. Hospital diagnoses in these data are coded according to the International Classification of Diseases, German modification (ICD-10-GM). Due to the historical division into West and East Germany, different coding practices might persist in both former parts. Additionally, the introduction of Diagnosis Related Groups (DRGs) in Germany in 2003/2004 might have changed the coding. The aim of this study was to investigate regional and temporal variations in coding of hospitalisation diagnoses in Germany. Methods We analysed hospitalisation diagnoses for oesophageal bleeding (OB) and upper gastrointestinal bleeding (UGIB) from the official German Hospital Statistics provided by the Federal Statistical Office. Bleeding diagnoses were classified as "specific" (origin of bleeding provided) or "unspecific" (origin of bleeding not provided) coding. We studied regional (former East versus West Germany) differences in incidence of hospitalisations with specific or unspecific coding for OB and UGIB and temporal variations between 2000 and 2005. For each year, incidence ratios of hospitalisations for former East versus West Germany were estimated with log-linear regression models adjusting for age, gender and population density. Results Significant differences in specific and unspecific coding between East and West Germany and over time were found for both OB and UGIB hospitalisation diagnoses. For example, in 2002, incidence ratios of hospitalisations for East versus West Germany were 1.24 (95% CI 1.16-1.32) for specific and 0.67 (95% CI 0.60-0.74) for unspecific OB diagnoses, and 1.43 (95% CI 1.36-1.51) for specific and 0.83 (95% CI 0.80-0.87) for unspecific UGIB. Regional differences nearly disappeared and time trends were less marked when using combined specific and unspecific diagnoses of OB or UGIB, respectively. Conclusions During the study

  17. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers),

  18. Preliminary design studies for the DESCARTES and CIDER codes

    International Nuclear Information System (INIS)

    Eslinger, P.W.; Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.

    1992-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the release and transport of radionuclides into the environment. This preliminary design addresses two of these codes: Dynamic Estimates of Concentrations and Radionuclides in Terrestrial Environments (DESCARTES) and Calculation of Individual Doses from Environmental Radionuclides (CIDER). The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways, given the output of the air transport code HATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. This document reports on preliminary design work performed by the code development team to determine whether the requirements could be met for DESCARTES and CIDER. The document contains three major sections: (i) a data flow diagram and discussion for DESCARTES, (ii) a data flow diagram and discussion for CIDER, and (iii) a series of brief statements regarding the design approach required to address each code requirement

  19. A radiological characterization extension for the DORIAN code - Summer Student Report

    CERN Document Server

    van Hoorn, Isabelle

    2016-01-01

    During my stay at CERN as a summer student I was working in the Radiation Protection group. The primary task of my project was to expand the functionality of the DORIAN code, which is used for the prediction and analysis of residual dose rates due to accelerator radiation induced activation. With the guidance of my supervisor I extended the framework of the DORIAN code to include a radiological classification scheme that is able to compute mass-specific activities for a given irradiation profile and cool-down time and to compare these specific activities with given waste characterization limit sets. Additionally, the DORIAN code extension can compute the cool-down time required to stay within a certain limit set threshold for a fixed irradiation profile

  20. Recent progress on weight distributions of cyclic codes over finite fields

    Directory of Open Access Journals (Sweden)

    Hai Q. Dinh

    2015-01-01

    Full Text Available Cyclic codes are an interesting class of linear codes and have wide applications in communication and storage systems due to their efficient encoding and decoding algorithms. In coding theory it is often desirable to know the weight distribution of a cyclic code in order to estimate its error-correcting capability and error probability. In this paper, we present recent progress on the weight distributions of cyclic codes over finite fields, which have been determined by exponential sums. The cyclic codes with few weights, which are very useful, are discussed and their existence conditions are listed. Furthermore, we discuss the more general case of constacyclic codes and give some equivalences to characterize their weight distributions.
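
    For short codes, a weight distribution can also be obtained by brute-force enumeration of the codewords generated by the generator polynomial; the exponential-sum machinery surveyed in the paper is what makes the problem tractable when enumeration is not. A minimal sketch for a small binary cyclic code (the (7,4) Hamming code is used here purely as an illustration):

      from itertools import product
      from collections import Counter

      def poly_mul_mod2(a, b):
          """Multiply two GF(2) polynomials given as coefficient lists (lowest degree first)."""
          out = [0] * (len(a) + len(b) - 1)
          for i, ai in enumerate(a):
              for j, bj in enumerate(b):
                  out[i + j] ^= ai & bj
          return out

      def weight_distribution(gen_poly, n):
          """Weight distribution of the length-n binary cyclic code generated by gen_poly."""
          k = n - (len(gen_poly) - 1)
          counts = Counter()
          for msg in product([0, 1], repeat=k):
              codeword = poly_mul_mod2(list(msg), gen_poly)
              codeword += [0] * (n - len(codeword))      # pad to length n
              counts[sum(codeword)] += 1
          return dict(sorted(counts.items()))

      # g(x) = 1 + x + x^3 generates the (7,4) binary cyclic Hamming code
      print(weight_distribution([1, 1, 0, 1], 7))        # expected: {0: 1, 3: 7, 4: 7, 7: 1}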

  1. Amino-terminal domain of classic cadherins determines the specificity of the adhesive interactions

    DEFF Research Database (Denmark)

    Klingelhöfer, Jörg; Troyanovsky, R B; Laur, O Y

    2000-01-01

    Classic cadherins are transmembrane receptors involved in cell type-specific calcium-dependent intercellular adhesion. The specificity of adhesion is mediated by homophilic interactions between cadherins extending from opposing cell surfaces. In addition, classic cadherins can self-associate… To study lateral and adhesive intercadherin interactions, we examined interactions between two classic cadherins, E- and P-cadherins, in epithelial A-431 cells co-producing both proteins. We showed that these cells exhibited heterocomplexes consisting of laterally assembled E- and P-cadherins… The specificity of adhesive interaction was localized to the amino-terminal (EC1) domain of both cadherins. Thus, the EC1 domain of classic cadherins exposes two determinants responsible for nonspecific lateral and cadherin type-specific adhesive dimerization.

  2. Coding Transparency in Object-Based Video

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    A novel algorithm for coding gray level alpha planes in object-based video is presented. The scheme is based on segmentation in multiple layers. Different coders are specifically designed for each layer. In order to reduce the bit rate, cross-layer redundancies as well as temporal correlation are...

  3. Proteins labelling with 125I and experimental determination of their specific activity

    International Nuclear Information System (INIS)

    Caro, R.A.; Ciscato, V.A.; Giacomini, S.M.V. de; Quiroga, S.; Radicella, R.

    1975-11-01

    A standardization of the labelling technique of proteins with 125I and the control of the obtained products, principally their specific activities, was performed in order to utilize them correctly in radioimmunoassays. The quantities of chloramine-T and sodium metabisulphite were lowered, with respect to the original method, to 3.6 and 9.6 μg respectively. Under these conditions, optimal yields and radioiodinated proteins with good immunological activities were obtained. It was found that the specific activity calculated, as usual, from the yield obtained by electrophoresis is higher than the real value. For this reason the yields and the corresponding specific activities were determined from ascending chromatographies performed with 70 per cent methanol as solvent, during two hours in darkness. The radioimmunoassay displacement curves obtained with proteins labelled with the proposed method, and whose specific activities were calculated from their radiochromatographic patterns, were reproducible and gave a percentage of bound radioiodinated protein in the absence of cold protein of 50 ± 4. (author) [es

  4. Changes in the Coding and Non-coding Transcriptome and DNA Methylome that Define the Schwann Cell Repair Phenotype after Nerve Injury.

    Science.gov (United States)

    Arthur-Farraj, Peter J; Morgan, Claire C; Adamowicz, Martyna; Gomez-Sanchez, Jose A; Fazal, Shaline V; Beucher, Anthony; Razzaghi, Bonnie; Mirsky, Rhona; Jessen, Kristjan R; Aitman, Timothy J

    2017-09-12

    Repair Schwann cells play a critical role in orchestrating nerve repair after injury, but the cellular and molecular processes that generate them are poorly understood. Here, we perform a combined whole-genome, coding and non-coding RNA and CpG methylation study following nerve injury. We show that genes involved in the epithelial-mesenchymal transition are enriched in repair cells, and we identify several long non-coding RNAs in Schwann cells. We demonstrate that the AP-1 transcription factor C-JUN regulates the expression of certain micro RNAs in repair Schwann cells, in particular miR-21 and miR-34. Surprisingly, unlike during development, changes in CpG methylation are limited in injury, restricted to specific locations, such as enhancer regions of Schwann cell-specific genes (e.g., Nedd4l), and close to local enrichment of AP-1 motifs. These genetic and epigenomic changes broaden our mechanistic understanding of the formation of repair Schwann cell during peripheral nervous system tissue repair. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Sex-specific determinants of fitness in a social mammal.

    Science.gov (United States)

    Lardy, Sophie; Allainé, Dominique; Bonenfant, Christophe; Cohas, Aurélie

    2015-11-01

    Sociality should evolve when the fitness benefits of group living outweigh the costs. Theoretical models predict an optimal group size maximizing individual fitness. However, beyond the number of individuals present in a group, the characteristics of these individuals, like their sex, are likely to affect the fitness payoffs of group living. Using 20 years of individually based data on a social mammal, the Alpine marmot (Marmota marmota), we tested for the occurrence of an optimal group size and composition, and for sex-specific effects of group characteristics on fitness. Based on lifetime data of 52 males and 39 females, our findings support the existence of an optimal group size maximizing male fitness and an optimal group composition maximizing fitness of males and females. Additionally, although group characteristics (i.e., size, composition and instability) affecting male and female fitness differed, fitness depended strongly on the number of same-sex subordinates within the social group in the two sexes. By comparing multiple measures of social group characteristics and of fitness in both sexes, we highlighted the sex-specific determinants of fitness in the two sexes and revealed the crucial role of intrasexual competition in shaping social group composition.

  6. Lung volumes: measurement, clinical use, and coding.

    Science.gov (United States)

    Flesch, Judd D; Dine, C Jessica

    2012-08-01

    Measurement of lung volumes is an integral part of complete pulmonary function testing. Some lung volumes can be measured during spirometry; however, measurement of the residual volume (RV), functional residual capacity (FRC), and total lung capacity (TLC) requires special techniques. FRC is typically measured by one of three methods. Body plethysmography uses Boyle's Law to determine lung volumes, whereas inert gas dilution and nitrogen washout use dilution properties of gases. After determination of FRC, expiratory reserve volume and inspiratory vital capacity are measured, which allows the calculation of the RV and TLC. Lung volumes are commonly used for the diagnosis of restriction. In obstructive lung disease, they are used to assess for hyperinflation. Changes in lung volumes can also be seen in a number of other clinical conditions. Reimbursement for measurement of lung volumes requires knowledge of current procedural terminology (CPT) codes, relevant indications, and an appropriate level of physician supervision. Because of recent efforts to eliminate payment inefficiencies, the 10 previous CPT codes for lung volumes, airway resistance, and diffusing capacity have been bundled into four new CPT codes.
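
    The arithmetic behind the derived volumes is straightforward: once FRC has been measured, the residual volume is obtained by subtracting the expiratory reserve volume, and total lung capacity by adding the inspiratory vital capacity to RV. A small sketch with purely hypothetical values (all volumes in litres; the numbers are illustrative, not reference values from the article):

      def derived_lung_volumes(frc: float, erv: float, ivc: float) -> dict:
          """Derive RV and TLC from FRC, ERV and inspiratory vital capacity (litres)."""
          rv = frc - erv          # residual volume: gas left after maximal exhalation
          tlc = rv + ivc          # total lung capacity: RV plus inspiratory vital capacity
          return {"RV": rv, "TLC": tlc}

      # Hypothetical adult values: FRC 3.0 L, ERV 1.2 L, IVC 4.2 L
      print(derived_lung_volumes(frc=3.0, erv=1.2, ivc=4.2))   # RV 1.8 L, TLC 6.0 L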

  7. Discovery of Proteomic Code with mRNA Assisted Protein Folding

    Directory of Open Access Journals (Sweden)

    Jan C. Biro

    2008-12-01

    Full Text Available The 3x redundancy of the Genetic Code is usually explained as a necessity to increase the mutation-resistance of the genetic information. However, recent bioinformatical observations indicate that the redundant Genetic Code contains more biological information than previously known, which is additional to the 64/20 definition of amino acids. It might define the physico-chemical and structural properties of amino acids, the codon boundaries, the amino acid co-locations (interactions) in the coded proteins and the free folding energy of mRNAs. This additional information, which seems to be necessary to determine the 3D structure of coding nucleic acids as well as the coded proteins, is known as the Proteomic Code and mRNA Assisted Protein Folding.

  8. Codes of conduct: An extra suave instrument of EU governance?

    DEFF Research Database (Denmark)

    Borras, Susana

    able to coordinate actors successfully (effectiveness)? and secondly, under what conditions are codes of conduct able to generate democratically legitimate political processes? The paper examines carefully a recent case study, the “Code of Conduct for the Recruitment of Researchers” (CCRR). The code establishes a specific set of voluntary norms and principles that shall guide the recruiting process of researchers by European research organizations (universities, public research organizations and firms) in the 33 countries of the single market minded initiative of the European Research Area. A series...

  9. The enhanced variance propagation code for the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Kern, E.A.; Zack, N.R.; Britschgi, J.J.

    1992-01-01

    The Variance Propagation (VP) Code was developed by the Los Alamos National Laboratory's Safeguards Systems Group to provide off-line variance propagation and systems analysis for nuclear material processing facilities. The code can also be used as a tool in the design and evaluation of material accounting systems. In this regard, the VP code was enhanced to incorporate a model of the material accountability measurements used in the Idaho Chemical Processing Plant operated by the Westinghouse Idaho Nuclear Company. Inputs to the code were structured to account for the dissolver/headend process and the waste streams, and analyses were performed to determine the sensitivity of the overall material balance error to measurement and sampling errors. We determined that the material balance error is very sensitive to changes in the sampling errors. 3 refs
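
    The central calculation in such a code is first-order propagation of measurement and sampling variances through the material balance (beginning inventory plus receipts minus removals minus ending inventory). The sketch below is a generic, simplified illustration of that propagation with invented values, assuming independent random errors between terms; it is not the VP code itself:

      import math

      def material_balance_variance(terms):
          """First-order variance propagation for a material balance.

          terms -- iterable of (measured_value, relative_random_error, relative_systematic_error);
                   random errors are assumed independent between terms, a simplification of
                   what a full variance-propagation code models.
          """
          var = 0.0
          for value, rel_random, rel_systematic in terms:
              var += (value * rel_random) ** 2 + (value * rel_systematic) ** 2
          return var

      # Hypothetical inventory terms in kg of nuclear material
      terms = [(120.0, 0.005, 0.002),   # beginning inventory
               (450.0, 0.010, 0.003),   # receipts
               (430.0, 0.010, 0.003),   # removals
               (138.0, 0.005, 0.002)]   # ending inventory
      sigma = math.sqrt(material_balance_variance(terms))
      print(f"material balance standard deviation ~ {sigma:.2f} kg")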

  10. Applying Physical-Layer Network Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Liew SoungChang

    2010-01-01

    Full Text Available A main distinguishing feature of a wireless network compared with a wired network is its broadcast nature, in which the signal transmitted by a node may reach several other nodes, and a node may receive signals from several other nodes, simultaneously. Rather than a blessing, this feature is treated more as an interference-inducing nuisance in most wireless networks today (e.g., IEEE 802.11). This paper shows that the concept of network coding can be applied at the physical layer to turn the broadcast property into a capacity-boosting advantage in wireless ad hoc networks. Specifically, we propose a physical-layer network coding (PNC) scheme to coordinate transmissions among nodes. In contrast to "straightforward" network coding, which performs coding arithmetic on digital bit streams after they have been received, PNC makes use of the additive nature of simultaneously arriving electromagnetic (EM) waves for an equivalent coding operation. In doing so, PNC can potentially achieve 100% and 50% throughput increases compared with traditional transmission and straightforward network coding, respectively, in 1D regular linear networks with multiple random flows. The throughput improvements are even larger in 2D regular networks: 200% and 100%, respectively.
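
    The mapping at the heart of PNC can be illustrated for BPSK in a two-way relay: two end nodes transmit simultaneously, the relay observes the superimposed (summed) signal and maps it directly to the XOR of the two bits, which it then broadcasts so each end node can recover the other's bit. The toy sketch below assumes idealized conditions (perfect symbol synchronization, equal channel gains, light noise) and is an illustration of the general idea, not the scheme exactly as specified in the paper:

      import random

      def bpsk(bit: int) -> float:
          return 1.0 if bit else -1.0

      def relay_pnc_mapping(superimposed: float) -> int:
          """Map the summed BPSK signal (near -2, 0 or +2) to the XOR bit.

          A sum near 0 means the two bits differed (XOR = 1); a sum near +/-2
          means they were equal (XOR = 0).
          """
          return 1 if abs(superimposed) < 1.0 else 0

      rng = random.Random(1)
      for _ in range(5):
          a, b = rng.randint(0, 1), rng.randint(0, 1)
          y = bpsk(a) + bpsk(b) + rng.gauss(0.0, 0.1)     # simultaneous arrival at the relay
          xor = relay_pnc_mapping(y)
          # Each end node recovers the other's bit from the broadcast XOR:
          assert xor == a ^ b and (xor ^ a) == b and (xor ^ b) == a
      print("relay decoded the XOR correctly in all trials")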

  11. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
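
    The generative model behind discrete sparse coding can be sketched compactly: each latent takes a value from a small finite alphabet under a sparse prior, and the observation is a noisy linear combination of dictionary elements. The toy example below samples from such a model and, because the latent space is tiny, evaluates the exact discrete posterior by enumeration; the truncated EM (expectation truncation) training procedure of the paper is not reproduced here, and the alphabet, prior and dimensions are illustrative assumptions only:

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(0)

      values = np.array([0.0, 1.0, 2.0])      # finite latent alphabet (assumed)
      prior  = np.array([0.9, 0.07, 0.03])    # sparse prior: most mass on zero
      H, D, sigma = 3, 5, 0.1                 # latents, observed dims, noise std
      W = rng.normal(size=(D, H))             # dictionary / generative fields

      def sample():
          s = rng.choice(values, size=H, p=prior)
          return s, W @ s + rng.normal(0.0, sigma, size=D)

      def posterior(y):
          """Exact posterior over all |values|**H latent configurations (tiny H only)."""
          configs = np.array(list(product(values, repeat=H)))
          log_prior = np.log(prior)[np.searchsorted(values, configs)].sum(axis=1)
          log_lik = -0.5 * ((y - configs @ W.T) ** 2).sum(axis=1) / sigma ** 2
          p = np.exp(log_prior + log_lik - (log_prior + log_lik).max())
          return configs, p / p.sum()

      s_true, y = sample()
      configs, p = posterior(y)
      print("true latents:", s_true, " MAP estimate:", configs[p.argmax()])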

  12. Evaluation of risk effective STIs with specific application to diesels

    International Nuclear Information System (INIS)

    Vesely, W.E.; Samanta, P.K.; Ginzburg, T.

    1987-01-01

    From a risk standpoint, the objective of surveillance tests is to control the risk arising from failures which can occur while the component is on standby. At the same time, risks arising from test-caused failures and test-caused degradations also need to be controlled. Risk-acceptable test intervals balance these risks in an attempt to achieve an acceptably low overall risk. Risk and reliability approaches are presented which allow risk-acceptable test intervals to be determined for any component. To provide focus for the approaches, diesels are specifically evaluated; however, the approaches can be applied not only to diesels but to any component with suitable data. Incorporation of the approaches in personal computer (PC) software is discussed, which can provide tools for the regulator or plant personnel for determining acceptable diesel test intervals for any plant-specific or generic application. The FRANTIC III computer code was run to validate the approaches and to evaluate specific issues associated with determining risk-effective test intervals for diesels. Using the approaches presented, diesel accident unavailability can be more effectively monitored and controlled on a plant-specific or generic basis. Test intervals can be made more risk effective than they are now, producing more acceptable accident unavailabilities. The methods presented are one step toward performance-based technical specifications, which more directly control risks
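
    The trade-off described above is often formalized by writing the time-averaged unavailability as the sum of a standby-failure term that grows with the test interval and test-related terms that grow (or persist) as testing becomes more frequent, and then choosing the interval that minimizes the total. The sketch below is a generic, textbook-style illustration of that balance with purely invented parameter values; it is not the FRANTIC III model:

      import numpy as np

      def average_unavailability(T, lam=3e-4, tau=2.0, p_test_fail=5e-3):
          """Time-averaged unavailability as a function of test interval T (hours).

          lam         -- standby failure rate (1/h); lam*T/2 is the standby contribution
          tau         -- downtime per test (h); tau/T is the test-outage contribution
          p_test_fail -- test-caused failure/degradation probability; a constant floor
                         in this toy model
          """
          return lam * T / 2 + tau / T + p_test_fail

      intervals = np.linspace(24, 24 * 120, 500)          # 1 day to ~4 months
      u = average_unavailability(intervals)
      best = intervals[np.argmin(u)]
      print(f"illustrative risk-effective interval ~ {best:.0f} h (~{best / 24:.0f} days)")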

  13. Accelerator-driven transmutation reactor analysis code system (ATRAS)

    Energy Technology Data Exchange (ETDEWEB)

    Sasa, Toshinobu; Tsujimoto, Kazufumi; Takizuka, Takakazu; Takano, Hideki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-03-01

    JAERI is carrying out a design study of a hybrid-type minor actinide transmutation system which mainly consists of an intense proton accelerator and a fast subcritical core. The neutronics and burnup characteristics of the accelerator-driven system are important from the viewpoint of maintaining subcriticality and the energy balance during system operation. To determine those characteristics accurately, it is necessary to include reactions in the high-energy region, which are not treated in ordinary reactor analysis codes. The authors developed a code system named ATRAS to analyze the neutronics and burnup characteristics of accelerator-driven subcritical reactor systems. ATRAS has a function for burnup analysis taking account of the effect of the spallation neutron source. ATRAS consists of a spallation analysis code, neutron transport codes and a burnup analysis code. Utility programs for fuel exchange, pre-processing and post-processing are also incorporated. (author)

  14. Tissue-specific RNA expression marks distant-acting developmental enhancers.

    Directory of Open Access Journals (Sweden)

    Han Wu

    2014-09-01

    Full Text Available Short non-coding transcripts can be transcribed from distant-acting transcriptional enhancer loci, but the prevalence of such enhancer RNAs (eRNAs) within the transcriptome, and the association of eRNA expression with tissue-specific enhancer activity in vivo, remain poorly understood. Here, we investigated the expression dynamics of tissue-specific non-coding RNAs in embryonic mouse tissues via deep RNA sequencing. Overall, approximately 80% of validated in vivo enhancers show tissue-specific RNA expression that correlates with tissue-specific enhancer activity. Globally, we identified thousands of tissue-specifically transcribed non-coding regions (TSTRs) displaying various genomic hallmarks of bona fide enhancers. In transgenic mouse reporter assays, over half of tested TSTRs functioned as enhancers with reproducible activity in the predicted tissue. Together, our results demonstrate that tissue-specific eRNA expression is a common feature of in vivo enhancers, as well as a major source of extragenic transcription, and that eRNA expression signatures can be used to predict tissue-specific enhancers independent of known epigenomic enhancer marks.

  15. Age versus size determination of radial variation in wood specific gravity : lessons from eccentrics

    Science.gov (United States)

    G. Bruce Williamson; Michael C. Wiemann

    2011-01-01

    Radial increases in wood specific gravity have been shown to characterize early successional trees from tropical forests. Here, we develop and apply a novel method to test whether radial increases are determined by tree age or tree size. The method compares the slopes of specific gravity changes across a short radius and a long radius of trees with eccentric trunks. If...

  16. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  17. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  18. Firm-specific, and institutional determinants of corporate investments in Nigeria

    Directory of Open Access Journals (Sweden)

    Folorunsho M. Ajide

    2017-12-01

    Full Text Available We examined the effect of institutional quality and firm-specific factors on corporate investment in Nigeria using fifty-four (54) quoted non-financial firms over the period 2002–2012. We applied the dynamic panel estimator proposed by Arellano–Bond (1991). The results showed that regulatory quality, corruption, political stability and control of corruption have an insignificant effect in determining corporate investment in Nigeria. Our results also confirmed that firm-specific factors influenced corporate investment in Nigeria. While firms' cash flow displayed a positive and significant effect on investment, other factors had negative effects on investment. Our results showed that investment is constrained by internally generated funds, despite the existence of a capital market. In addition, the spillover effect of tightening monetary policy during the period of study increased the cost of borrowing, thereby having a negative effect on investment in the real sector. We recommend that when the monetary authorities focus on inflation targeting, they should not lose sight of its impact on corporate investment and other productivity growth of firms, which is the source of long-term sustainable growth and development of economies. Keywords: Institution, Nigeria, GMM, Firm-specific, Investment

  19. Contributions to the validation of the ASTEC V1 code

    International Nuclear Information System (INIS)

    Constantin, Marin; Rizoiu, Andrei; Turcu, Ilie

    2004-01-01

    In the frame of the PHEBEN2 project (Validation of the severe accident codes for applications to nuclear power plants, based on the PHEBUS FP experiments), a project developed within the EU research Frame Program 5 (FP5), the INR Pitesti team received the task of determining the sensitivity of the ASTEC code. The PHEBEN2 project was initiated in 1998 and gathered 13 partners from 6 EU member states; 4 partners from 3 candidate states (Hungary, Bulgaria and Romania) joined the project later. The work was contracted with the European Commission (under contract FIKS-CT1999-00009), which supports the research effort financially up to about 50%. According to the contract provisions, the INR team participated in developing Working Package 1 (WP1), which refers to validation of the integral computation codes using the PHEBUS experimental data, and Working Package 3 (WP3), referring to the evaluation of the codes to be applied in nuclear power plants for risk evaluation, evaluation of nuclear safety margins and determination/evaluation of the measures to be adopted in case of a severe accident. The present work continues the efforts to validate the ASTEC code preliminarily. The focus is on stand-alone sensitivity analyses applied to two of the most important modules of the code, namely DIVA and SOPHAEROS

  20. User's manual for the G.T.M.-1 computer code

    International Nuclear Information System (INIS)

    Prado-Herrero, P.

    1992-01-01

    This document describes the GTM-1 (Geosphere Transport Model, release 1) computer code and is intended to provide the reader with enough detailed information to use the code. GTM-1 was developed for the assessment of radionuclide migration by groundwater through geologic deposits whose properties can change along the pathway. GTM-1 solves the transport equation by the finite difference method (Crank-Nicolson scheme). It was developed for specific use within Probabilistic System Assessment (PSA) Monte Carlo method codes; in this context the first application of GTM-1 was within the LISA (Long Term Isolation System Assessment) code. GTM-1 is also available as an independent model, which includes various submodels simulating a multi-barrier disposal system. The code has been tested with the PSACOIN (Probabilistic System Assessment Codes Intercomparison) benchmark exercises from the PSAC User Group (OECD/NEA). 10 refs., 6 Annex., 2 tabs
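
    For reference, a Crank-Nicolson discretization of a 1D transport problem leads to a tridiagonal linear system solved at each time step. The following is a minimal sketch for pure diffusion/dispersion with fixed boundary concentrations, illustrating the scheme only; it is not the GTM-1 implementation, which also handles advection, retardation and decay along a heterogeneous pathway:

      import numpy as np

      def crank_nicolson_diffusion(c0, D, dx, dt, steps):
          """Advance a 1D concentration profile c0 with dispersion coefficient D.

          Crank-Nicolson scheme: (I - r/2 A) c_new = (I + r/2 A) c_old,
          where A is the second-difference operator and r = D*dt/dx**2.
          Boundary concentrations are held fixed at their initial values.
          """
          n = len(c0)
          r = D * dt / dx ** 2
          A = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
          A[0, :] = 0.0
          A[-1, :] = 0.0                                   # Dirichlet boundaries
          lhs = np.eye(n) - 0.5 * r * A
          rhs = np.eye(n) + 0.5 * r * A
          c = c0.astype(float)
          for _ in range(steps):
              c = np.linalg.solve(lhs, rhs @ c)
          return c

      x = np.linspace(0.0, 1.0, 51)
      c0 = np.where(np.abs(x - 0.5) < 0.05, 1.0, 0.0)      # initial radionuclide pulse
      c = crank_nicolson_diffusion(c0, D=1e-3, dx=x[1] - x[0], dt=0.5, steps=200)
      print(f"peak concentration after dispersion: {c.max():.3f}")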