WorldWideScience

Sample records for genon concept coding

  1. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', which was initially proposed by the authors with the intention of giving the nuclear industry a leap of progress in system reliability, performance improvement, and cost reduction. The concept of the System Based Code is to provide a theoretical procedure for optimizing the reliability of a system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  2. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated

  3. A computer code for Tokamak reactor concepts evaluation

    International Nuclear Information System (INIS)

    Rosatelli, F.; Raia, G.

    1985-01-01

    A computer package has been developed for the preliminary investigation of the engineering configuration of a tokamak reactor concept. The code is essentially intended to synthesize, starting from a set of geometrical and plasma physics parameters and the required performances and objectives, three fundamental components of a tokamak reactor core: blanket+shield, TF magnet, PF magnet. An iterative evaluation of the size, power supply and cooling system requirements of these components allows assessment and preliminary design optimization of the considered reactor concept. The versatility of the code allows its application both to next generation tokamak devices and to power reactor concepts.

  4. Development of FBR integrity system code. Basic concept

    International Nuclear Information System (INIS)

    Asayama, Tai

    2001-05-01

    For fast breeder reactors to be commercialized, they must be more reliable, safer, and at the same time economically competitive with future light water reactors. Innovation of the elevated temperature structural design standard is necessary to achieve this goal. The most powerful way is to enlarge the scope of the structural integrity code to cover items other than the design evaluation addressed in existing codes. Items that must be newly covered are prerequisites of design, fabrication, examination, operation and maintenance, etc. This allows designers to choose the most economical combination of design variations to achieve the specific reliability that is needed for a particular component. By designing components according to this concept, a cost-minimum design of a whole plant can be realized. By determining the reliability that must be achieved for a component using risk technologies, further economic improvement can be expected by avoiding excessive quality. Recognizing the necessity for codes based on the new concept, the development of the 'FBR integrity system code' began in 2000. Research and development will last 10 years. For this development, the basic logistics and system, as well as the technologies that materialize the concept, are necessary. Original logistics and systems must be developed, because no existing research is available in or outside Japan. This report presents the results of the work done in the first year regarding the basic idea, methodology, and structure of the code. (author)

  5. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of the QR Code and the benefits of using QR Codes for companies. The first objective of this research work is to study general information about QR Codes in order to help readers understand the QR Code in detail. The second objective is to explore and analyze the essential and feasible technologies of the QR Code in order to clarify them. Additionally, this research work through QR Code best practices t...

  6. 78 FR 26023 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-05-03

    ... Chalk Point, LLC, GenOn Delta, LLC, GenOn Energy Management, LLC, GenOn Florida, LP, GenOn Kendall, LLC...: PJM Interconnection, L.L.C. Description: Original Service Agreement No. 3519; Queue No. X4-046 to be...: PJM Interconnection, L.L.C. Description: Original Service Agreement No. 3524; Queue No. X3-066 to be...

  7. UEP Concepts in Modulation and Coding

    Directory of Open Access Journals (Sweden)

    Werner Henkel

    2010-01-01

    Full Text Available First unequal error protection (UEP) proposals date back to the 1960s (Masnick and Wolf, 1967), but now, with the introduction of scalable video, UEP is developing into a key concept for the transport of multimedia data. The paper presents an overview of some new approaches realizing UEP properties in physical transport, especially multicarrier modulation, or with LDPC and Turbo codes. For multicarrier modulation, UEP bit-loading together with hierarchical modulation is described, allowing for an arbitrary number of classes, arbitrary SNR margins between the classes, and an arbitrary number of bits per class. In Turbo coding, pruning, as a counterpart of puncturing, is presented for flexible bit-rate adaptation, including tables with optimized pruning patterns. Bit- and/or check-irregular LDPC codes may be designed to provide UEP to their code bits. However, irregular degree distributions alone do not ensure UEP, and other necessary properties of the parity-check matrix for providing UEP are also pointed out. Pruning is also the means for constructing variable-rate LDPC codes for UEP, especially for controlling the check-node profile.
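
    To make the rate-flexibility idea concrete, the sketch below shows ordinary puncturing of a rate-1/2 convolutional code, the classical counterpart of the pruning discussed above: a high-priority class keeps all parity bits while a low-priority class drops some of them. The generator polynomials, puncturing patterns and class split are illustrative assumptions, not taken from the paper.

      # Illustrative unequal error protection (UEP) via puncturing (not the paper's scheme).
      G = (0b111, 0b101)  # rate-1/2 convolutional code, generators 7 and 5 (octal), constraint length 3

      def conv_encode(bits):
          """Rate-1/2 convolutional encoder; returns a list of (c0, c1) output pairs."""
          state = 0
          out = []
          for b in bits:
              reg = (b << 2) | state                 # current bit plus two memory bits
              c0 = bin(reg & G[0]).count("1") % 2
              c1 = bin(reg & G[1]).count("1") % 2
              out.append((c0, c1))
              state = (reg >> 1) & 0b11              # shift the register
          return out

      def puncture(pairs, pattern):
          """Keep coded bit j at period position i only where pattern[j][i] == 1."""
          period = len(pattern[0])
          kept = []
          for i, (c0, c1) in enumerate(pairs):
              if pattern[0][i % period]:
                  kept.append(c0)
              if pattern[1][i % period]:
                  kept.append(c1)
          return kept

      data = [1, 0, 1, 1, 0, 0, 1, 0]
      coded = conv_encode(data)
      high_priority = puncture(coded, [[1, 1], [1, 1]])  # nothing punctured: rate 1/2
      low_priority  = puncture(coded, [[1, 1], [1, 0]])  # half of c1 punctured: rate 2/3
      print(len(high_priority), len(low_priority))       # 16 vs 12 coded bits for the same 8 data bits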

  8. Basic concept of common reactor physics code systems. Final report of working party on common reactor physics code systems (CCS)

    International Nuclear Information System (INIS)

    2004-03-01

    A working party was organized for two years (2001-2002) on common reactor physics code systems under the Research Committee on Reactor Physics of JAERI. This final report is a compilation of the activity of the working party on common reactor physics code systems during the two years. The objective of the working party is to clarify the basic concept of common reactor physics code systems in order to improve the convenience of reactor physics code systems for reactor physics researchers in Japan in their various fields of research and development activities. We held four meetings during the two years, investigated the status of reactor physics code systems and innovative software technologies, and discussed the basic concept of common reactor physics code systems. (author)

  9. Assigning clinical codes with data-driven concept representation on Dutch clinical free text.

    Science.gov (United States)

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter

    2017-05-01

    Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and performed procedures during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (i.c. the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient stay data, in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which introduces a limitation on available concept definitions from expert-based ontologies (e.g. UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence on predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance. Copyright © 2017. Published by Elsevier Inc.
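
    As a minimal sketch of the distributional-semantics step described above, the snippet below trains a skip-gram word2vec model on a toy, hypothetical tokenized corpus and queries for terms occurring in similar contexts; the corpus, hyperparameters and query term are illustrative assumptions, not the authors' setup.

      # Skip-gram embeddings over (toy) clinical tokens, as a stand-in for the paper's corpus.
      from gensim.models import Word2Vec

      notes = [
          ["patient", "presents", "with", "acute", "cystitis"],
          ["urinary", "tract", "infection", "treated", "with", "antibiotics"],
          ["acute", "cystitis", "confirmed", "by", "urine", "culture"],
      ]  # in practice: a large unannotated corpus of clinical free text

      model = Word2Vec(
          sentences=notes,
          vector_size=100,   # dimensionality of the embeddings
          window=5,          # context window size
          min_count=1,       # keep rare terms in this toy corpus
          sg=1,              # 1 selects the skip-gram architecture
      )

      # Terms used in similar contexts obtain nearby vectors; such neighbours can be
      # added as extra features for the clinical-code classifier.
      print(model.wv.most_similar("cystitis", topn=3))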

  10. 77 FR 26438 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Approval of 2011 Consent...

    Science.gov (United States)

    2012-05-04

    ... (MDE) pertaining to the GenOn Chalk Point Generating Station (Chalk Point). These revisions approve specific provisions of a 2011 Consent Decree between MDE and GenOn to reduce particulate matter (PM... www.regulations.gov or email. The www.regulations.gov Web site is an ``anonymous access'' system...

  11. Symmetries in Genetic Systems and the Concept of Geno-Logical Coding

    Directory of Open Access Journals (Sweden)

    Sergey V. Petoukhov

    2016-12-01

    Full Text Available The genetic code of amino acid sequences in proteins does not allow understanding and modeling of inherited processes such as inborn coordinated motions of living bodies, innate principles of sensory information processing, quasi-holographic properties, etc. To be able to model these phenomena, the concept of geno-logical coding, which is connected with logical functions and Boolean algebra, is put forward. The article describes basic pieces of evidence in favor of the existence of the geno-logical code, which exists in parallel with the known genetic code of amino acid sequences but which serves for transferring inherited processes along chains of generations. These pieces of evidence have been received due to the analysis of symmetries in structures of molecular-genetic systems. The analysis has revealed a close connection of the genetic system with dyadic groups of binary numbers and with other mathematical objects related to dyadic groups: Walsh functions (which are algebraic characters of dyadic groups), bit-reversal permutations, logical holography, etc. These results provide a new approach for mathematical modeling of genetic structures, which uses known mathematical formalisms from technological fields of noise-immunity coding of information, binary analysis, logical holography, and digital devices of artificial intelligence. Some opportunities for the development of algebraic-logical biology are opened.
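
    As a small numerical aside (not part of the cited analysis), the sketch below generates Walsh functions via the Sylvester-Hadamard construction and verifies the character property over the dyadic (XOR) group; the matrix order and indices are arbitrary choices.

      # Walsh functions as characters of the dyadic group Z_2^n (Sylvester-Hadamard construction).
      import numpy as np

      def hadamard(n):
          """Sylvester-Hadamard matrix of order 2**n; its rows are Walsh functions."""
          H = np.array([[1]])
          for _ in range(n):
              H = np.block([[H, H], [H, -H]])
          return H

      H = hadamard(3)  # 8 x 8, indexed by the 8 binary triplets
      # Character property: row k at position x equals (-1)**popcount(k & x), so rows
      # multiply component-wise according to the dyadic (XOR) group law.
      k, m = 3, 5
      assert np.array_equal(H[k] * H[m], H[k ^ m])
      print(H)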

  12. Exotic Non-Abelian Topological Defects in Lattice Fractional Quantum Hall States

    Science.gov (United States)

    Liu, Zhao; Möller, Gunnar; Bergholtz, Emil J.

    2017-09-01

    We investigate extrinsic wormholelike twist defects that effectively increase the genus of space in lattice versions of multicomponent fractional quantum Hall systems. Although the original band structure is distorted by these defects, leading to localized midgap states, we find that a new lowest flat band representing a higher genus system can be engineered by tuning local single-particle potentials. Remarkably, once local many-body interactions in this new band are switched on, we identify various Abelian and non-Abelian fractional quantum Hall states, whose ground-state degeneracy increases with the number of defects, i.e., with the genus of space. This sensitivity of topological degeneracy to defects provides a "proof of concept" demonstration that genons, predicted by topological field theory as exotic non-Abelian defects tied to a varying topology of space, do exist in realistic microscopic models. Specifically, our results indicate that genons could be created in the laboratory by combining the physics of artificial gauge fields in cold atom systems with already existing holographic beam shaping methods for creating twist defects.

  13. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    Science.gov (United States)

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  14. The preliminary thermal-hydraulic design of one superheated steam water cooled blanket concept based on RELAP5 and MELCOR codes - 15147

    International Nuclear Information System (INIS)

    Guo, Y.; Wang, G.; Cheng, Y.; Peng, C.

    2015-01-01

    The Water Cooled Blanket (WCB) is very important in the concept design and energy transfer of future fusion power plants. One concept design of a WCB is under computational testing. The RELAP5 and MELCOR codes, which are mature and often used in nuclear engineering, are selected as simulation tools. The complex inner flow channels and heat sources are simplified according to their thermal-hydraulic characteristics. Nodal models for RELAP5 and MELCOR are then built to approximate the concept design. The superheated steam scheme is analyzed with the two codes separately under different power levels. After some adjustment of the inlet flow resistance coefficients of some flow channels, reasonable stable conditions can be obtained. The stable fluid and wall temperature distributions and pressure drops are studied. The results of the two codes are compared and some advice is given. (authors)

  15. Implementation of probabilistic safety concepts in international codes

    International Nuclear Information System (INIS)

    Borges, J.F.

    1977-01-01

    Recent progress in the implementation of safety concepts in international structural codes is briefly presented. Special attention is paid to the work of the Joint Committee on Structural Safety. The discussion is centered on some problems such as: safety differentiation, definition and combination of actions, spaces for checking safety and non-linear structural behaviour. When discussing safety differentiation it should be considered that the total probability of failure derives from a theoretical probability of failure and a probability of failure due to error and gross negligence. Optimization of design criteria should take into account both causes of failure. The quantification of reliability implies a probabilistic idealization of all basic variables. Steps taken to obtain an improved definition of different types of actions and rules for their combination are described. Safety checking can be carried out in terms of basic variables, action-effects, or any other suitable variable. However, the advantages and disadvantages of the different types of formulation should be discussed, particularly in the case of non-linear structural behaviour. (orig.)

  16. On the concept of elasticity used in some fast reactor accident analysis codes

    International Nuclear Information System (INIS)

    Malmberg, T.

    1975-01-01

    The analysis presented restricts attention to the elastic part of the elastic-plastic equation used in several Fast Reactor Accident Analysis Codes and originally applied by M.L. Wilkins: Calculation of Elastic-Plastic Flow, UCRL-7322, Rev. 1, Jan 1969. It is shown that the elasticity concept used lies within the framework of hypo-elasticity. On the basis of a test found by Bernstein it is proven that the state of stress generally depends on the path of deformation. Therefore this concept of elasticity is not compatible with finite elasticity. For several deformation processes this special hypo-elastic constitutive equation is integrated to give a stress-strain relation. The path-dependence of this relation is demonstrated. Further, the phenomenon of hypo-elastic yield under shear deformation is pointed out. The relevance to modelling material behaviour in primary containment analysis is discussed. (Auth.)
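
    For orientation, a hypo-elastic law of the kind discussed here relates an objective stress rate linearly to the rate of deformation; a commonly quoted grade-zero form (a generic illustration, not necessarily the exact relation used in the codes above) reads

      \overset{\nabla}{\boldsymbol{\sigma}} = \lambda\,\operatorname{tr}(\mathbf{D})\,\mathbf{I} + 2\mu\,\mathbf{D},

    where the left-hand side is an objective (e.g. Jaumann) rate of the Cauchy stress, D is the rate-of-deformation tensor, and lambda and mu are elastic constants. The path dependence noted in the abstract arises because integrating such a rate form along different deformation paths between the same end states generally yields different stresses.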

  17. The Coding Process and Its Challenges

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2010-02-01

    Full Text Available Coding is the core process in classic grounded theory methodology. It is through coding that the conceptual abstraction of data and its reintegration as theory takes place. There are two types of coding in a classic grounded theory study: substantive coding, which includes both open and selective coding procedures, and theoretical coding. In substantive coding, the researcher works with the data directly, fracturing and analysing it, initially through open coding for the emergence of a core category and related concepts and then subsequently through theoretical sampling and selective coding of data to theoretically saturate the core and related concepts. Theoretical saturation is achieved through constant comparison of incidents (indicators in the data to elicit the properties and dimensions of each category (code. This constant comparing of incidents continues until the process yields the interchangeability of indicators, meaning that no new properties or dimensions are emerging from continued coding and comparison. At this point, the concepts have achieved theoretical saturation and the theorist shifts attention to exploring the emergent fit of potential theoretical codes that enable the conceptual integration of the core and related concepts to produce hypotheses that account for relationships between the concepts thereby explaining the latent pattern of social behaviour that forms the basis of the emergent theory. The coding of data in grounded theory occurs in conjunction with analysis through a process of conceptual memoing, capturing the theorist’s ideation of the emerging theory. Memoing occurs initially at the substantive coding level and proceeds to higher levels of conceptual abstraction as coding proceeds to theoretical saturation and the theorist begins to explore conceptual reintegration through theoretical coding.

  18. Formulation of Policy for Cyber Crime in Criminal Law Revision Concept of Bill Book of Criminal Law (A New Penal Code)

    Science.gov (United States)

    Soponyono, Eko; Deva Bernadhi, Brav

    2017-04-01

    The development of national legal systems is aimed at establishing public welfare and the protection of the public. Many attempts have been carried out to renew material criminal law, and those efforts have resulted in the formulation of the concept of the draft Law Book of the Law of Criminal Law in the form of a draft criminal code. The basic ideas in drafting rules and regulations based on the values of the ideology of Pancasila are balance among the various norms and rules in society. The design concept of the New Criminal Code Act is anticipatory and proactive in formulating provisions on crime in cyberspace and crime involving information and electronic transactions. The issues addressed in this paper are whether the policy on the formulation of cyber crime is embodied in the provisions of the current legislation, and what the policy on the formulation of cyber crime is in the recent concept of the draft criminal code.

  19. The Ductile Design Concept for Seismic Actions in Miscellaneous Design Codes

    Directory of Open Access Journals (Sweden)

    M. Budescu

    2009-01-01

    Full Text Available The concept of ductility estimates the capacity of the structural system and its components to deform prior to collapse, without a substantial loss of strength, but with an important amount of energy dissipated. Consistent with the 'Applied Technology Council' report ATC-34 (1995), a seismic response reduction factor is used to decrease the design force. The purpose of this factor is to transpose the nonlinear behaviour of the structure and its energy dissipation capacity into a simplified form that can be used in the design stage. Depending on the particular structural model and the design standard, the values used are different. The paper presents the characteristics of the ductility concept for the structural system. Along with this, the general way of computing the reserve factor is described, with the necessary explanations of the parameters that determine the behaviour factor. The purpose of this paper is to make a comparison between different international norms regarding the values and the distribution of the behaviour factor. The norms of the following countries are taken into consideration: the United States of America, New Zealand, Japan, Romania and the European general seismic code.

  20. On the concept of elasticity used in some fast reactor accident analysis codes

    International Nuclear Information System (INIS)

    Malmberg, T.

    1975-01-01

    The analysis to be presented will restrict attention to the elastic part of the elastic-plastic constitutive equation used in several Fast Reactor Accident Analysis Codes and originally applied by M.L. Wilkins: Calculation of Elastic-Plastic Flow, UCRL-7322, Rev. 1, Jan. 1969. It is shown that the elasticity concept used lies within the framework of hypo-elasticity. On the basis of a test found by Bernstein it is proven that the state of stress generally depends on the path of deformation. Therefore this concept of elasticity is not compatible with finite elasticity. For several simple deformation processes this special hypo-elastic constitutive equation is integrated to give a stress-strain relation. The path-dependence of this relation is demonstrated. Further, the phenomenon of hypo-elastic yield under shear deformation is pointed out. The relevance to modelling material behaviour in primary containment analysis is discussed.

  1. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  2. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
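
    As background, the sketch below implements plain LT encoding (without the feedback-adapted degree distributions proposed in the record): each output symbol is the XOR of a randomly chosen set of source packets whose size is drawn from a degree distribution. The packet values and degree distribution are illustrative assumptions.

      # Minimal LT-style fountain encoder (no feedback), for illustration only.
      import random

      def lt_encode(source_packets, degree_dist, rng=random.Random(0)):
          """Yield an endless stream of (neighbour_indices, encoded_packet) pairs."""
          degrees, probs = zip(*sorted(degree_dist.items()))
          k = len(source_packets)
          while True:
              d = rng.choices(degrees, weights=probs, k=1)[0]
              neighbours = rng.sample(range(k), min(d, k))
              packet = 0
              for i in neighbours:
                  packet ^= source_packets[i]   # XOR of the chosen source packets
              yield neighbours, packet

      source = [0x5A, 0x13, 0xC7, 0x08]            # four toy source packets (one byte each)
      dist = {1: 0.2, 2: 0.5, 3: 0.2, 4: 0.1}      # toy degree distribution
      stream = lt_encode(source, dist)
      for _ in range(6):
          print(next(stream))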

  3. QR CODE IN LIBRARY PRACTICE SOME EXAMPLES

    OpenAIRE

    Ajay Shanker Mishra*, Sachin Kumar Umre, Pavan Kumar Gupta

    2017-01-01

    Quick Response (QR) code is one such technology which can cater to the user demand of providing access to resources through mobile devices. The main objective of this article is to review the concept of the Quick Response Code (QR code) and describe the practice of reading and generating QR codes. The paper covers the basic concept, structure, and technological pros and cons of the QR code. The literature is filled with potential uses for Quick Response (QR) codes in library practices like e-resour...

  4. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  5. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
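
    The same coupling pattern (write an input file, run the external application, read its outputs back) can be sketched in a few lines of stand-alone Python; the executable name, file formats and paths below are hypothetical, and the snippet only illustrates the concept, not the GoldSim DLL itself.

      # Generic "write inputs, run external code, read outputs" coupling sketch.
      import subprocess
      from pathlib import Path

      def run_external_code(inputs, workdir="run_dir", exe="external_code.exe"):
          work = Path(workdir)
          work.mkdir(exist_ok=True)

          # 1. Create the input file expected by the (hypothetical) external application.
          (work / "inputs.txt").write_text(
              "\n".join(f"{name} = {value}" for name, value in inputs.items())
          )

          # 2. Run the external code and wait for it to finish.
          subprocess.run([exe, "inputs.txt"], cwd=work, check=True)

          # 3. Read the outputs written by the external application.
          outputs = {}
          for line in (work / "outputs.txt").read_text().splitlines():
              name, value = line.split("=")
              outputs[name.strip()] = float(value)
          return outputs

      # results = run_external_code({"flow_rate": 1.2, "temperature": 350.0})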

  6. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.

  7. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  8. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as for HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  9. Building a dynamic code to simulate new reactor concepts

    International Nuclear Information System (INIS)

    Catsaros, N.; Gaveau, B.; Jaekel, M.-T.; Maillard, J.; Maurel, G.; Savva, P.; Silva, J.; Varvayanni, M.

    2012-01-01

    Highlights: ► We develop a stochastic neutronic code based on an existing High Energy Physics code. ► The code simulates innovative reactor designs including Accelerator Driven Systems. ► Core materials evolution will be dynamically simulated, including fuel burnup. ► Continuous feedback between the main inter-related parameters will be established. ► A description of the current research development and achievements is also given. - Abstract: Innovative nuclear reactor designs have been proposed, such as the Accelerator Driven Systems (ADSs), the “candle” reactors, etc. These reactor designs introduce computational nuclear technology problems whose solution necessitates a new, global and dynamic computational approach to the system. A continuous feedback procedure must be established between the main inter-related parameters of the system such as the chemical, physical and isotopic composition of the core, the neutron flux distribution and the temperature field. Furthermore, as far as ADSs are concerned, the ability of the computational tool to simulate the nuclear cascade created from the interaction of accelerated protons with the spallation target, as well as the produced neutrons, is also required. The new Monte Carlo code ANET (Advanced Neutronics with Evolution and Thermal hydraulic feedback) is being developed based on the GEANT3 High Energy Physics code, aiming to progressively satisfy all the above requirements. A description of the capabilities and methodologies implemented in the present version of ANET is given here, together with some illustrative applications of the code.

  10. THE CONCEPT “LONDON” AS A TEMPORAL CODE OF LINGUOCULTURE IN THE LITERARY AND REGIONAL WORK OF PETER ACKROYD “LONDON: THE BIOGRAPHY”

    OpenAIRE

    Kaliev, Sultan; Zhumagulova, Batima

    2018-01-01

    This article analyzes the spatial-temporal code of lingua-culture as one of the components of the general cognitive-matrix model of the structure of the concept "London" in the literary and regional work of Peter Ackroyd "London: The Biography". This approach implements integration of cognitive-matrix modeling of the structure of the concept and the system of codes of lingua-culture (anthropomorphic, temporal, vegetative, spiritual, social, chemical, etc.). The space-time code of ...

  11. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  12. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
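
    The structural idea behind GLDPC codes can be illustrated with a toy construction: every "super check" of a global graph imposes the full parity-check matrix of a short local code (here Hamming(7,4)) on the bits it touches, instead of a single parity equation. The global connection pattern below is arbitrary and not taken from the paper.

      # Toy GLDPC parity-check matrix built from Hamming(7,4) local codes.
      import numpy as np

      H_local = np.array([[1, 0, 1, 0, 1, 0, 1],
                          [0, 1, 1, 0, 0, 1, 1],
                          [0, 0, 0, 1, 1, 1, 1]])   # Hamming(7,4) parity-check matrix

      n = 14                                          # code bits of the global code
      super_checks = [list(range(0, 7)),              # each super check touches 7 bits
                      list(range(7, 14)),
                      [0, 2, 4, 6, 8, 10, 12]]        # an overlapping third constraint

      rows = []
      for bits in super_checks:
          block = np.zeros((H_local.shape[0], n), dtype=int)
          block[:, bits] = H_local                    # impose the local code on those bits
          rows.append(block)

      H_gldpc = np.vstack(rows)                       # 9 x 14 binary parity-check matrix
      print(H_gldpc.shape)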

  13. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four

  14. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  15. Concept of 'bad death'

    Directory of Open Access Journals (Sweden)

    Marija Vučković

    2016-02-01

    Full Text Available Following previous research on the linguistic concept of a 'bad death', whose lexical expression is the word family of the verb ginuti, I focus my attention in this paper on the relationship between the language conceptualization of a 'bad death' and the representation of a 'bad death' in traditional and contemporary culture. The diachronically based language corpus makes it possible to trace the changes of the referential frame and use of the verb ginuti and its derivatives. In the traditional culture a 'bad death' is marked in the action code by an irregular way of burial and by beliefs in demons stemming from the 'impure dead'. In the paper I explore the degree of synonymy of the symbols of all three codes: the verbal code, the action code and the code of beliefs. In the contemporary culture the lack of individual control and choice is considered to be the key element of the concept of a 'bad death'. This change of conceptual content manifests itself in the use of its lexical expressions.

  16. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. Those concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; the application of the concepts of the System Based Code to design will lead to an entire change of the practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed with regard to the possibility of improving structural design, both in terms of reliability and cost effectiveness, by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  17. On the Need of Network coding for Mobile Clouds

    DEFF Research Database (Denmark)

    Fitzek, Frank; Heide, Janus; Pedersen, Morten Videbæk

    This paper advocates the need of network coding for mobile clouds. Mobile clouds as well as network coding are two novel concepts. The concept of mobile clouds describes the potential of mobile devices to communicate with each other and form a cooperative cluster in which new services and potentials are created. Network coding, on the other side, enables the mobile cloud to communicate in a very efficient and secure way in terms of energy and bandwidth usage. Even though network coding can be applied in a variety of communication networks, it has some inherent features that make it suitable for mobile clouds. The paper lists the benefits of network coding for mobile clouds and introduces both concepts in a tutorial way. The results used throughout this paper are collaborative work of different research institutes, but are mainly taken from the mobile device group at Aalborg University.

  18. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
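
    As a worked illustration of the FORM-to-partial-safety-factor link mentioned above, the sketch below treats the simplest limit state g = R - S with independent normal resistance R and load effect S; all numbers are made-up assumptions and the snippet is not the JCSS CodeCal procedure.

      # FORM reliability index and implied partial safety factors for g = R - S.
      from math import sqrt
      from statistics import NormalDist

      mu_R, sigma_R = 350.0, 35.0      # resistance (illustrative units)
      mu_S, sigma_S = 200.0, 40.0      # load effect

      sigma_g = sqrt(sigma_R**2 + sigma_S**2)
      beta = (mu_R - mu_S) / sigma_g               # reliability index
      p_f = NormalDist().cdf(-beta)                # corresponding failure probability

      alpha_R = sigma_R / sigma_g                  # FORM sensitivity factors
      alpha_S = -sigma_S / sigma_g

      r_design = mu_R - alpha_R * beta * sigma_R   # design-point (most likely failure) values
      s_design = mu_S - alpha_S * beta * sigma_S

      R_k = mu_R - 1.645 * sigma_R                 # 5%-fractile characteristic resistance
      S_k = mu_S + 1.645 * sigma_S                 # 95%-fractile characteristic load

      gamma_R = R_k / r_design                     # partial safety factors implied by beta
      gamma_S = s_design / S_k
      print(f"beta={beta:.2f}  Pf={p_f:.1e}  gamma_R={gamma_R:.2f}  gamma_S={gamma_S:.2f}")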

  19. The DIT nuclear fuel assembly physics design code

    International Nuclear Information System (INIS)

    Jonsson, A.

    1988-01-01

    The DIT code is the Combustion Engineering, Inc. (C-E) nuclear fuel assembly design code. It belongs to a class of codes, all similar in structure and strategy, that may be characterized by the spectrum and spatial calculations being performed in two dimensions and in a single job step for the entire assembly. The forerunner of this class of codes is the United Kingdom Atomic Energy Authority WIMS code, the first version of which was completed 25 yr ago. The structure and strategy of assembly spectrum codes have remained remarkably similar to the original concept thus proving its usefulness. As other organizations, including C-E, have developed their own versions of the concept, many important variations have been added that significantly influence the accuracy and performance of the resulting computational tool. Those features, which are unique to the DIT code and which might be of interest to the community of fuel assembly physics design code users and developers, are described and discussed

  20. Computer codes for ventilation in nuclear facilities

    International Nuclear Information System (INIS)

    Mulcey, P.

    1987-01-01

    In this paper the authors present some computer codes, developed in recent years, for ventilation and radioprotection. These codes are used for safety analysis in the design, operation and dismantling of nuclear facilities. The authors present in particular: the DACC1 code, used for aerosol deposition in the sampling circuits of radiation monitors; the PIAF code, used for modelling complex ventilation systems; and the CLIMAT 6 code, used for optimization of air conditioning systems.

  1. NEW CONCEPTS IN ROMANIAN PRIVATE LAW: THE ENTERPRISE

    Directory of Open Access Journals (Sweden)

    CRISTIAN GHEORGHE

    2012-05-01

    Full Text Available The new concept of the enterprise is laid down in the new Civil Code in connection with another new concept: the professional (entrepreneur). The old commercial terms, commercial acts and deeds and merchant, were well represented in legal texts in comparison with the present concepts. Our new code imported these concepts, together with their weaknesses, from the Italian and Quebec codes. The short references within the Code to the enterprise and the professional again put the burden of clarification on scholars' shoulders. The law defines professionals as the persons who carry on an enterprise, and therefore the legislator turns to the definition of 'carrying on an enterprise'. In doing so, the legislator in fact leaves the enterprise concept undefined. The carrying on by one or more persons of an organised economic activity, whether or not it is 'commercial' in nature, consisting of producing, administering or alienating property or providing a service, constitutes the carrying on of an enterprise. The enterprise is a term long connected with commercial and private law. All past decades, beginning with the old Commercial Code, then the socialist economy and the post-communist era, used the concept of enterprise intensively. The meaning of this term differed substantially in every decade. The present notion needs scientific scrutiny in order to crystallize a convergent approach. In our paper we consider the notion of the enterprise starting from the past perception of this concept, and then we try to observe the variety of enterprises under present law.

  2. Ethical codes in business practice

    OpenAIRE

    Kobrlová, Marie

    2013-01-01

    The diploma thesis discusses the issues of ethics and codes of ethics in business. The theoretical part defines basic concepts of ethics, presents its historical development and the methods and tools of business ethics. It also focuses on ethical codes and the area of law and ethics. The practical part consists of a quantitative survey, which provides the views of selected business entities on business ethics and the use of codes of ethics in practice.

  3. The general theory of convolutional codes

    Science.gov (United States)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.

  4. The Dit nuclear fuel assembly physics design code

    International Nuclear Information System (INIS)

    Jonsson, A.

    1987-01-01

    DIT is the Combustion Engineering, Inc. (C-E) nuclear fuel assembly design code. It belongs to a class of codes, all similar in structure and strategy, which may be characterized by the spectrum and spatial calculations being performed in 2D and in a single job step for the entire assembly. The forerunner of this class of codes is the U.K.A.E.A. WIMS code, the first version of which was completed 25 years ago. The structure and strategy of assembly spectrum codes have remained remarkably similar to the original concept thus proving its usefulness. As other organizations, including C-E, have developed their own versions of the concept, many important variations have been added which significantly influence the accuracy and performance of the resulting computational tool. This paper describes and discusses those features which are unique to the DIT code and which might be of interest to the community of fuel assembly physics design code users and developers

  5. Effective coding with VHDL principles and best practice

    CERN Document Server

    Jasinski, Ricardo

    2016-01-01

    A guide to applying software design principles and coding practices to VHDL to improve the readability, maintainability, and quality of VHDL code. This book addresses an often-neglected aspect of the creation of VHDL designs. A VHDL description is also source code, and VHDL designers can use the best practices of software development to write high-quality code and to organize it in a design. This book presents this unique set of skills, teaching VHDL designers of all experience levels how to apply the best design principles and coding practices from the software world to the world of hardware. The concepts introduced here will help readers write code that is easier to understand and more likely to be correct, with improved readability, maintainability, and overall quality. After a brief review of VHDL, the book presents fundamental design principles for writing code, discussing such topics as design, quality, architecture, modularity, abstraction, and hierarchy. Building on these concepts, the book then int...

  6. Neuronal codes for visual perception and memory.

    Science.gov (United States)

    Quian Quiroga, Rodrigo

    2016-03-01

    In this review, I describe and contrast the representation of stimuli in visual cortical areas and in the medial temporal lobe (MTL). While cortex is characterized by a distributed and implicit coding that is optimal for recognition and storage of semantic information, the MTL shows a much sparser and explicit coding of specific concepts that is ideal for episodic memory. I will describe the main characteristics of the coding in the MTL by the so-called concept cells and will then propose a model of the formation and recall of episodic memory based on partially overlapping assemblies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    /validation of MS/MP capabilities because these advanced tools have not yet reached sufficient maturity to support such an investigation. In an earlier paper (Unal et al., 2011), we proposed a methodology that potentially can be used to address these new challenges in the design and licensing of evolving nuclear technology. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept was introduced and is accomplished through data assimilation. Since advanced MS/MP codes have not yet reached the level of maturity required for a comprehensive validation and calibration exercise, we considered two legacy fuel codes and apply parts of our methodology to these codes to demonstrate the benefits of the new calibration capabilities we recently developed as a part of the proposed framework. This effort does not directly support “born-assessed” validation for advanced MS/MP codes, but is useful to gain insight on legacy modeling deficiencies and to guide and develop recommendations on high and low priority directions for development of advanced codes and advanced experiments, so as to maximize the benefits of advanced validation and uncertainty quantification (VU) efforts involving the next generation of MS/MP code capabilities. This paper discusses the application of advanced validation techniques (sensitivity, calibration, and prediction) to nuclear fuel performance codes FRAPCON (Geelhood et al., 2011a,b) and LIFE-4 (Boltax et al., 1990). FRAPCON is used to predict oxide fuel behavior in light water reactors. LIFE-4 was developed in the 1980s to predict oxide fuel behavior in fast reactors. We introduce a sensitivity ranking methodology to narrow down the selected parameters for follow-up sensitivity and calibration analyses. We use screening methods with both codes and discuss the results. The number of selected modeling parameters was 61 for FRAPCON and 69 for LIFE-4. The screening
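
    The screening idea can be illustrated with a simple one-at-a-time perturbation of a stand-in model; the parameter names, ranges and response function below are purely hypothetical and are not FRAPCON or LIFE-4 inputs.

      # One-at-a-time sensitivity screening of a placeholder fuel-performance response.
      def fuel_model(params):
          """Stand-in response (e.g. a peak temperature); not a real fuel model."""
          return (1000.0
                  + 300.0 * params["gap_conductance"]
                  - 150.0 * params["thermal_conductivity"]
                  + 20.0 * params["swelling_rate"])

      nominal = {"gap_conductance": 1.0, "thermal_conductivity": 1.0, "swelling_rate": 1.0}
      base = fuel_model(nominal)

      sensitivities = {}
      for name in nominal:
          perturbed = dict(nominal)
          perturbed[name] *= 1.10                             # +10 % perturbation
          # normalized sensitivity coefficient: (dY/Y) / (dX/X)
          sensitivities[name] = ((fuel_model(perturbed) - base) / base) / 0.10

      for name, s in sorted(sensitivities.items(), key=lambda kv: -abs(kv[1])):
          print(f"{name:22s} {s:+.3f}")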

  8. A new coding concept for fast ultrasound imaging using pulse trains

    DEFF Research Database (Denmark)

    Misaridis, T.; Jensen, Jørgen Arendt

    2002-01-01

    Frame rate in ultrasound imaging can be increased by simultaneous transmission of multiple beams using coded waveforms. However, the achievable degree of orthogonality among coded waveforms is limited in ultrasound, and the image quality degrades unacceptably due to interbeam interference. In this paper, an alternative combined time-space coding approach is undertaken. In the new method all transducer elements are excited with short pulses and the high time-bandwidth (TB) product waveforms are generated acoustically. Each element transmits a short pulse spherical wave with a constant transmit delay from element to element, long enough to assure no pulse overlapping for all depths in the image. Frequency shift keying is used for "per element" coding. The received signals from a point scatterer are staggered pulse trains which are beamformed for all beam directions and further processed

  9. Electromagnetic reprogrammable coding-metasurface holograms.

    Science.gov (United States)

    Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang

    2017-08-04

    Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light is posing a great challenge. Here, we try to tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities that may lead to advances in a variety of applications such as microscopy, display, security, data storage, and information processing. Realizing metasurfaces with reconfigurability, high efficiency, and control over phase and amplitude is a challenge. Here, Li et al. introduce a reprogrammable hologram based on a 1-bit coding metasurface, where the state of each unit cell of the coding metasurface can be switched electrically.
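
    How a 1-bit coding pattern shapes the scattered field can be sketched numerically with the standard array-factor approximation: each unit cell contributes phase 0 or pi and the far field is approximated by a 2-D Fourier transform of the aperture. The random pattern and array size below are arbitrary, not the holograms of the paper.

      # Far-field pattern of a random 1-bit (0/pi) coding pattern via a 2-D FFT.
      import numpy as np

      rng = np.random.default_rng(seed=1)
      code = rng.integers(0, 2, size=(32, 32))      # 1-bit coding pattern of '0'/'1' cells

      aperture = np.exp(1j * np.pi * code)          # '0' -> phase 0, '1' -> phase pi
      far_field = np.fft.fftshift(np.fft.fft2(aperture, s=(256, 256)))
      intensity = np.abs(far_field) ** 2            # scattered-intensity (hologram) pattern

      print(intensity.shape, float(intensity.max()))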

  10. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  11. Organization of Risk Analysis Codes for Living Evaluations (ORACLE)

    International Nuclear Information System (INIS)

    Batt, D.L.; MacDonald, P.E.; Sattison, M.B.; Vesely, E.

    1987-01-01

    ORACLE (Organization of Risk Analysis Codes for Living Evaluations) is an integration concept for using risk-based information in United States Nuclear Regulatory Commission (USNRC) applications. Portions of ORACLE are being developed at the Idaho National Engineering Laboratory for the USNRC. The ORACLE concept consists of related databases, software, user interfaces, processes, and quality control checks allowing a wide variety of regulatory problems and activities to be addressed using current, updated PRA information. The ORACLE concept provides for smooth transitions between one code and the next without pre- or post-processing. (orig.)

  12. Specialized Monte Carlo codes versus general-purpose Monte Carlo codes

    International Nuclear Information System (INIS)

    Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi

    2002-01-01

    The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses a concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)

  13. Structured Review of Code Clone Literature

    NARCIS (Netherlands)

    Hordijk, W.T.B.; Ponisio, Laura; Wieringa, Roelf J.

    2008-01-01

    This report presents the results of a structured review of code clone literature. The aim of the review is to assemble a conceptual model of clone-related concepts which helps us to reason about clones. This conceptual model unifies clone concepts from a wide range of literature, so that findings

  14. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a versatile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
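
    One of the properties mentioned above, comma-freeness, is easy to test directly; the sketch below (not GCAT code) checks whether a codon set contains any codon that appears at a frameshifted position inside the concatenation of two of its codons. The example sets are toy inputs.

      def is_comma_free(codons):
          """True if no codon of the set occurs at a frameshift inside any concatenation."""
          codons = set(codons)
          for a in codons:
              for b in codons:
                  pair = a + b
                  # codons read at shifts 1 and 2 within the concatenation ab
                  if pair[1:4] in codons or pair[2:5] in codons:
                      return False
          return True

      print(is_comma_free({"ACG", "TTC"}))   # True for this toy set
      print(is_comma_free({"ACG", "CGA"}))   # False: "CGA" sits at shift 1 in "ACG" + "ACG"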

  15. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  16. Nevada Administrative Code for Special Education Programs.

    Science.gov (United States)

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  17. Code of a Tokamak Fusion Energy Facility ITER

    International Nuclear Information System (INIS)

    Yasuhide Asada; Kenzo Miya; Kazuhiko Hada; Eisuke Tada

    2002-01-01

    The technical structural code for ITER (International Thermonuclear Experimental Reactor) and, as a more generic application, for D-T burning fusion power facilities (hereafter, the Fusion Code) should be innovative because the safety features and mechanical components differ considerably from those of nuclear fission reactors, and because several new fabrication and examination technologies must be introduced. Introduction of such newly developed technologies as inspection-free automatic welding into the Fusion Code is rationalized by a pilot application of a new code concept, the 'system-based code for integrity'. The code concept means an integration of the element technical items necessary for construction, operation and maintenance of mechanical components of fusion power facilities into a single system, so as to optimize the total margin of these components. Unique and innovative items of the Fusion Code typically include: use of non-metals; cryogenic applications; new design margins on allowable stresses and other new design rules; use of inspection-free automatic welding and other newly developed fabrication technologies; a graded approach to quality assurance standards covering radiological safety-system components as well as non-safety-system components; and consideration of replacement components. (authors)

  18. A Semantic Analysis Method for Scientific and Engineering Code

    Science.gov (United States)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  19. Coded ultrasonic remote control without batteries

    International Nuclear Information System (INIS)

    Gerhardy, C; Burlage, K; Schomburg, W K

    2009-01-01

    A concept for battery-less remote controls has been developed based on mechanically actuated beams and micro whistles generating ultrasound signals. These signals need to be frequency- or time-coded to increase the number of signals which can be distinguished from each other and from environmental ultrasound. Several designs for generating coded ultrasonic signals have been investigated.

  20. 77 FR 58121 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-09-19

    ... following exempt wholesale generator filings: Docket Numbers: EG12-108-000. Applicants: Prairie Rose Wind... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice.... Applicants: NRG Energy, Inc, GenOn Energy, Inc. Description: NRG Energy, Inc et al. submits additional...

  1. A new concept of equivalent homogenization method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Pogoskekyan, Leonid; Kim, Young Il; Ju, Hyung Kook; Chang, Moon Hee [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The offered concept covers both those of K. Koebke and K. Smith; both of them can be simulated within the framework of the new concept. The offered concept also covers the Siemens KWU approach for baffle/reflector simulation, where the equivalent homogenized reflector XS are derived from the conservation of the response matrix at the interface in 1D semi-infinite slab geometry. The IM and XS of the new concept satisfy the same assumption about response matrix conservation in 1D semi-infinite slab geometry. It is expected that the new concept provides a more accurate approximation of the heterogeneous cell, especially in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are improved accuracy, simplicity of incorporation in existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO{sub 2}/MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANDOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions. 9 figs., 7 refs. (Author).
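
    For orientation, the snippet below shows only the classical flux-volume weighting step that discontinuity factors and the proposed interface matrix generalize; it does not implement the interface-matrix concept itself, and the region data are invented for illustration.

      import numpy as np

      volumes = np.array([0.6, 0.4])     # cm^3, fuel and moderator sub-regions (invented)
      fluxes  = np.array([1.0, 1.3])     # relative scalar fluxes in each sub-region
      sigma_a = np.array([0.30, 0.02])   # cm^-1, absorption cross sections

      # the homogenized XS preserves the cell-integrated reaction rate
      sigma_a_hom = np.sum(sigma_a * fluxes * volumes) / np.sum(fluxes * volumes)
      print(f"homogenized absorption XS: {sigma_a_hom:.4f} cm^-1")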

  2. Applications guide to the MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-08-01

    A practical guide for the implementation of the MORSE-CG Monte Carlo radiation transport computer code system is presented. The various versions of the MORSE code are compared and contrasted, and the many references dealing explicitly with the MORSE-CG code are reviewed. The treatment of angular scattering is discussed, and procedures for obtaining increased differentiality of results in terms of reaction types and nuclides from a multigroup Monte Carlo code are explained in terms of cross-section and geometry data manipulation. Examples of standard cross-section data input and output are shown. Many other features of the code system are also reviewed, including (1) the concept of primary and secondary particles, (2) fission neutron generation, (3) albedo data capability, (4) DOMINO coupling, (5) history file use for post-processing of results, (6) adjoint mode operation, (7) variance reduction, and (8) input/output. In addition, examples of the combinatorial geometry are given, and the new array of arrays geometry feature (MARS) and its three-dimensional plotting code (JUNEBUG) are presented. Realistic examples of user routines for source, estimation, path-length stretching, and cross-section data manipulation are given. A detailed explanation of the coupling between the random walk and estimation procedure is given in terms of both code parameters and physical analogies. The operation of the code in the adjoint mode is covered extensively. The basic concepts of adjoint theory and dimensionality are discussed and examples of adjoint source and estimator user routines are given for all common situations. Adjoint source normalization is explained, a few sample problems are given, and the concept of obtaining forward differential results from adjoint calculations is covered. Finally, the documentation of the standard MORSE-CG sample problem package is reviewed and ongoing and future work is discussed.

  3. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

    The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE survey of software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Administration/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications.

  4. Codes of Practice related to Harbour and Coastal Engineering in Denmark

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    2000-01-01

    Codes of practice for building and civil engineering works have been produced since 1893 by the "Danish Society of Engineers". Among the early codes are: Reinforced concrete structures (1908, 1943), calculation of reinforced concrete structures in harbour works (1926), Harbour Works (1927), and Steel structures (1941). The codes were based on the principle of allowable stresses. However, already in 1948 a Danish consulting engineer used a partial safety factor concept for a power station design in order to secure satisfactory safety. The concept was in fact old, as it was used by Gerber in his design...

  5. Object-Oriented Programming in the Development of Containment Analysis Code

    International Nuclear Information System (INIS)

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    After the mid 1980s, a new programming concept, Object-Oriented Programming (OOP), was introduced and developed, featuring information hiding, encapsulation, modularity and inheritance. These features offer a much more convenient programming paradigm to code developers. The OOP concept was readily incorporated into programming languages such as C++ in the 1990s and is widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of a safety analysis code for containment and propose a more explicit and easy OOP design for developers.

  6. Coding and decoding in a point-to-point communication using the polarization of the light beam.

    Science.gov (United States)

    Kavehvash, Z; Massoumian, F

    2008-05-10

    A new technique for coding and decoding optical signals through the use of polarization is described. In this technique the concept of coding is translated to polarization; in other words, coding is done in such a way that each code represents a unique polarization. This is done by implementing a binary pattern on a spatial light modulator in such a way that the reflected light has the required polarization. Decoding is done by detection of the received beam's polarization. By linking the concept of coding to polarization, each of these concepts can be used to measure the other, attaining some gains. In this paper the construction of a simple point-to-point communication link, where coding and decoding are done through polarization, is discussed.
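
    The idea of mapping code symbols onto polarization states can be sketched with Jones vectors, as below; this is an illustrative model only, not the authors' spatial-light-modulator implementation, and the bit-to-state mapping is an assumption.

      import numpy as np

      H = np.array([1.0, 0.0])   # horizontal polarization, assumed to carry bit 0
      V = np.array([0.0, 1.0])   # vertical polarization, assumed to carry bit 1

      def encode(bits):
          """Map each bit to a Jones vector (polarization state)."""
          return [V if b else H for b in bits]

      def decode(states):
          """Pick the analyzer (H or V) with the larger transmitted intensity."""
          return [int(abs(np.dot(V, s)) ** 2 > abs(np.dot(H, s)) ** 2) for s in states]

      bits = [1, 0, 1, 1, 0]
      print(decode(encode(bits)))   # -> [1, 0, 1, 1, 0]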

  7. CONSUL code package application for LMFR core calculations

    Energy Technology Data Exchange (ETDEWEB)

    Chibinyaev, A.V.; Teplov, P.S.; Frolova, M.V. [RNC ' Kurchatovskiy institute' , Kurchatov sq.1, Moscow (Russian Federation)

    2008-07-01

    The CONSUL code package, designed for the calculation of reactor core characteristics, was developed at the beginning of the 1990s. The calculation of nuclear reactor core characteristics is carried out on the basis of correlated neutron, isotope and temperature distributions. The code package has generally been used for LWR core characteristics calculations. The CONSUL code package has now been adapted to calculate liquid metal fast reactors (LMFR). Comparisons with the IAEA computational test 'Evaluation of benchmark calculations on a fast power reactor core with near zero sodium void effect' and with BN-1800 test calculations are presented in the paper. The IAEA benchmark core is based on the innovative BN-800 core concept with a sodium plenum above the core. The BN-1800 core is the next development step foreseen for the Russian fast reactor concept. The comparison of the operational parameters shows good agreement and confirms the possibility of applying the CONSUL code package to LMFR core calculation. (authors)

  8. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given to this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that, the authors are a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  9. Restrictive concept of surrogacy in the draft text of the Civil Code of Serbia

    Directory of Open Access Journals (Sweden)

    Bordaš Bernadet I.

    2015-01-01

    The working draft of the Civil Code of Serbia, which was published in June 2015, includes model provisions on surrogate motherhood, which is at present expressly prohibited by law. The paper gives a survey of the proposed provisions and examines particularly those that define which persons can conclude a contract on surrogacy. By limiting this right to persons holding the nationality of Serbia, or to these nationals and persons residing in the territory of Serbia for at least three (five) years, the legislator wishes to avoid reproductive tourism. Surrogate mothering with cross-border effects gives rise to complicated legal problems as regards the definition and recognition of the legal parentage of the intended parents, both in the countries in which the surrogate mother gives birth to the child and in the countries in which the intended parents wish to live with their child. The restrictive concept, which retains surrogate mothering within the borders of the domestic state and between domestic nationals, disables outgoing cases of surrogate motherhood, but this is not quite true for persons who are not citizens of Serbia but live on its territory. For these reasons the paper critically examines these limitations in the proposals, and indicates that incoming cases of surrogate motherhood cannot be prevented due to the free movement of people. The paper also provides an analysis of the legal issues raised by incoming cases of surrogate motherhood, and suggests solutions for them if the proposed ipso jure legal parenthood of the intended parents is adopted in the future Civil Code. With ipso jure legal parenthood of a child who is born to a surrogate mother abroad, there is no need to restrict surrogacy to nationals of Serbia or to foreigners domiciled in Serbia for a minimum of three (five) years.

  10. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of LT codes, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
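
    A minimal sketch of LT encoding is given below, using the ideal soliton degree distribution; the robust soliton distribution used in practice, and the Kent chaotic maps of the paper, could be substituted for the distribution and the `rng` source, respectively. All parameters are illustrative.

      import numpy as np

      def ideal_soliton(k):
          """Ideal soliton degree distribution over degrees 1..k."""
          p = np.zeros(k)
          p[0] = 1.0 / k
          for d in range(2, k + 1):
              p[d - 1] = 1.0 / (d * (d - 1))
          return p / p.sum()

      def lt_encode(source_symbols, n_encoded, rng):
          k = len(source_symbols)
          probs = ideal_soliton(k)
          encoded = []
          for _ in range(n_encoded):
              degree = rng.choice(np.arange(1, k + 1), p=probs)
              neighbours = rng.choice(k, size=degree, replace=False)
              symbol = 0
              for i in neighbours:              # XOR the chosen source symbols
                  symbol ^= source_symbols[i]
              encoded.append((tuple(int(i) for i in neighbours), symbol))
          return encoded

      data = [0b1011, 0b0010, 0b1110, 0b0101]   # four toy source symbols
      for neighbours, value in lt_encode(data, 6, np.random.default_rng(0)):
          print(neighbours, bin(value))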

  11. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas

    2011-08-01

    Fixed-rate and rateless channel codes are generally treated separately in the related research literature, and so a novice in the field inevitably gets the impression that these channel codes are unrelated. By contrast, in this treatise, we endeavor to further develop a link between the traditional fixed-rate codes and the recently developed rateless codes by delving into their underlying attributes. This joint treatment is beneficial for two principal reasons. First, it facilitates the task of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well-understood code-design tools, originally contrived for fixed-rate codes, to the realm of rateless codes. Indeed, these versatile tools proved to be vital in the design of diverse fixed-rate-coded communications systems, and thus our hope is that they will further elucidate the associated performance ramifications of rateless coded schemes. © 2011 IEEE.

  12. Construction and decoding of a class of algebraic geometry codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Jensen, Helge Elbrønd

    1989-01-01

    A class of codes derived from algebraic plane curves is constructed. The concepts and results from algebraic geometry that were used are explained in detail; no further knowledge of algebraic geometry is needed. Parameters, generator and parity-check matrices are given. The main result is a decoding algorithm which turns out to be a generalization of the Peterson algorithm for decoding BCH codes.

  13. Development of computer code on sodium-water reaction products transport

    International Nuclear Information System (INIS)

    Arikawa, H.; Yoshioka, N.; Suemori, M.; Nishida, K.

    1988-01-01

    The LMFBR concept eliminating the secondary sodium system has been considered one of the most promising concepts for offering cost reductions. In this reactor concept, the evaluation of the effects of sodium-water reaction products (SWRPs) on the reactor core during a sodium-water reaction at the primary steam generator becomes one of the major safety issues. In this study, a calculation code was developed as the first step in establishing the evaluation method for SWRP effects. The calculation code, called SPROUT, simulates SWRP transport and distribution in the primary sodium system using the system geometry, thermal-hydraulic data and sodium-water reaction conditions as input. This code principally models SWRP behavior. The paper contains the models for SWRP behaviors, such as solution, precipitation, deposition and so on, together with the results and discussion of a demonstration calculation for a typical FBR plant eliminating the secondary sodium system.

  14. Proof of Concept Coded Aperture Miniature Mass Spectrometer Using a Cycloidal Sector Mass Analyzer, a Carbon Nanotube (CNT) Field Emission Electron Ionization Source, and an Array Detector

    Science.gov (United States)

    Amsden, Jason J.; Herr, Philip J.; Landry, David M. W.; Kim, William; Vyas, Raul; Parker, Charles B.; Kirley, Matthew P.; Keil, Adam D.; Gilchrist, Kristin H.; Radauscher, Erich J.; Hall, Stephen D.; Carlson, James B.; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T.; Russell, Zachary E.; Grego, Sonia; Edwards, Steven J.; Sperline, Roger P.; Denton, M. Bonner; Stoner, Brian R.; Gehm, Michael E.; Glass, Jeffrey T.

    2018-02-01

    Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.

  15. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude for two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...
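
    As a minimal numerical illustration of the kind of quantity a probabilistic code format calibrates, the snippet below computes the reliability index and failure probability for a linear limit state g = R - S with independent normal resistance and load; the numbers are invented.

      from math import sqrt, erfc

      mu_R, sigma_R = 12.0, 1.5   # resistance mean / standard deviation (invented units)
      mu_S, sigma_S = 8.0, 1.0    # load-effect mean / standard deviation

      beta = (mu_R - mu_S) / sqrt(sigma_R ** 2 + sigma_S ** 2)   # reliability index
      p_failure = 0.5 * erfc(beta / sqrt(2))                     # Phi(-beta)

      print(f"reliability index beta = {beta:.2f}")
      print(f"failure probability    = {p_failure:.2e}")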

  16. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  17. Characterization of coded random access with compressive sensing based multi user detection

    DEFF Research Database (Denmark)

    Ji, Yalei; Stefanovic, Cedomir; Bockelmann, Carsten

    2014-01-01

    The emergence of Machine-to-Machine (M2M) communication requires new Medium Access Control (MAC) schemes and physical (PHY) layer concepts to support a massive number of access requests. The concept of coded random access, introduced recently, greatly outperforms other random access methods. Here, coded random access is combined with compressive sensing based multi-user detection (CS-MUD) on the PHY layer, and very promising results are shown for the resulting protocol.

  18. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. Three-dimensional numerical investigation of a Molten Salt reactor concept with the code CFX-5.5

    International Nuclear Information System (INIS)

    Yamaji, B.; Csom, G.; Aszodi, A.

    2002-01-01

    Partitioning and transmutation of actinides and long-lived fission products is a promising option to extend the possibilities and enhance the environmentally acceptable capabilities of nuclear energy. The possible implementation of the thorium cycle is also considered as a way to reduce the problem of energy resources in the future. For both objectives, different molten salt reactor concepts were proposed, mainly based on the Molten Salt Reactor Experiment of the Oak Ridge National Laboratory. Not only critical reactors but also accelerator-driven subcritical systems (ADSs) have advantages worth considering for those aims, especially those with liquid fuel, such as molten salts. By using a liquid fuel which is also the coolant medium, a fundamentally different thermal-hydraulic behavior is expected compared with solid fuel and water coolant. In this work our purpose is to present the possible use of Computational Fluid Dynamics (CFD) technology in molten salt thermal hydraulics. The simulations were performed with the three-dimensional code CFX-5.5. (author)

  20. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials.

    Science.gov (United States)

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin; Cui, Tie Jun

    2017-09-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of a metamaterial has the ability to describe the material in a digital way. Spatial coding metamaterials are typically constructed from unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterials is proposed, which achieves different controls of EM energy radiations with a fixed spatial coding pattern when the frequency changes. In this case, not only the phase responses of the unit cells are considered, but different phase sensitivities are also required. Due to the different frequency sensitivities of the unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of the unit cell, a digitalized frequency sensitivity is proposed, in which the units are encoded with the digits "0" and "1" to represent low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiations by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments.
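
    The effect of a spatial coding sequence on the scattered energy can be sketched with a simple 1-D array-factor calculation, as below; the element spacing, sequence length, and 0/pi phase model are illustrative assumptions and do not reproduce the paper's full-wave simulations.

      import numpy as np

      def array_factor(bits, d_over_lambda=0.5, n_angles=721):
          """|Array factor| over angle for elements scattering with phase 0 ('0') or pi ('1')."""
          theta = np.linspace(-np.pi / 2, np.pi / 2, n_angles)
          n = np.arange(len(bits))
          phases = np.pi * np.asarray(bits)
          k_d = 2 * np.pi * d_over_lambda
          af = np.exp(1j * (np.outer(np.sin(theta), n) * k_d + phases)).sum(axis=1)
          return theta, np.abs(af)

      theta, uniform = array_factor([0] * 16)         # all '0': single broadside beam
      theta, coded = array_factor([0, 0, 1, 1] * 4)   # '0011' supercell: two symmetric beams
      print("uniform coding peak at %.1f deg" % np.degrees(theta[np.argmax(uniform)]))
      print("0/1 coding peak at    %.1f deg" % np.degrees(theta[np.argmax(coded)]))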

  1. 77 FR 26474 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Approval of 2011 Consent...

    Science.gov (United States)

    2012-05-04

    ... Department of the Environment (MDE). These revisions approve specific provisions of a 2011 Consent Decree between MDE and GenOn to reduce particulate matter (PM), sulfur oxides (SO X ), and nitrogen oxides (NO X....regulations.gov or email. The www.regulations.gov Web site is an ``anonymous access'' system, which means EPA...

  2. Implementation of Neutronics Analysis Code using the Features of Object Oriented Programming via Fortran90/95

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young; Cho, Beom Jin [KEPCO Nuclear Fuel, Daejeon (Korea, Republic of)

    2011-05-15

    The object-oriented programming (OOP) concept became firmly established in the 1990s and was successfully incorporated into Fortran 90/95. The features of OOP, such as information hiding, encapsulation, modularity and inheritance, lead to code that satisfies the three R's: reusability, reliability and readability. The major OOP concepts other than modules, however, are rarely used in neutronics analysis codes, even when the code is written in Fortran 90/95. In this work, we show that the OOP concept can be employed to develop the neutronics analysis code ASTRA1D (Advanced Static and Transient Reactor Analyzer for 1-Dimension) via Fortran 90/95, and that this can be a more efficient and reasonable programming approach.

  3. The code of ethics for nurses.

    Science.gov (United States)

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Nurses are increasingly confronted with complex concerns in their practice. Codes of ethics are fundamental guidance for nursing, as for many other professions. Although there are authentic international codes of ethics for nurses, a national code provides additional assistance for clinical nurses in their complex roles in patient care, education, research and the management of parts of the country's health care system. A national code can provide nurses with culturally adapted guidance and help them to make ethical decisions closer to the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended and approved. The main concepts of the code are also presented here. No doubt, development of the codes should be considered an ongoing process. This is an overall responsibility to keep the codes current, updated with new progress in science and emerging challenges, and pertinent to nursing practice.

  4. 78 FR 41050 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-07-09

    ..., High Plains Ranch II, LLC, Green Mountain Energy Company, GenOn Energy Management, LLC, GenConn Energy... Interconnection, L.L.C. Description: Original Service Agreement No. 3585--Queue Position Y1-072 to be effective 5.... Description: Notice of Cancellation of Original Service Agreement No. 2720; Queue No. V4-001 to be effective 5...

  5. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas; Yang, Yuli; Aï ssa, Sonia; Hanzo, Lajos

    2011-01-01

    of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well

  6. Optical encryption and QR codes: secure and noise-free information retrieval.

    Science.gov (United States)

    Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto

    2013-03-11

    We introduce for the first time the concept of an information "container" used before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. Besides, the QR code can be read by smartphones, a massively used device. Additionally, the QR code adds another security step to the encrypting benefits the optical methods provide. The QR code is generated by means of freely available software. The concept development proves that the speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.
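
    The container-before-encryption idea can be sketched numerically with classical double random phase encoding (DRPE), as below; a toy binary pattern stands in for the QR code, and all sizes and keys are illustrative. In practice the decrypted image carries speckle noise, which the QR format is designed to tolerate.

      import numpy as np

      rng = np.random.default_rng(1)
      container = rng.integers(0, 2, size=(64, 64)).astype(float)   # stand-in for a QR code

      # DRPE keys: random phase masks in the input plane and in the Fourier plane
      mask_in = np.exp(2j * np.pi * rng.random((64, 64)))
      mask_fourier = np.exp(2j * np.pi * rng.random((64, 64)))

      encrypted = np.fft.ifft2(np.fft.fft2(container * mask_in) * mask_fourier)

      # decryption with the correct Fourier-plane key, then thresholding the container
      decrypted = np.abs(np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(mask_fourier)))
      recovered = (decrypted > 0.5).astype(int)
      print("bit errors after decryption:", int(np.sum(recovered != container)))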

  7. Key concepts in glioblastoma therapy

    DEFF Research Database (Denmark)

    Bartek, Jiri; Ng, Kimberly; Bartek, Jiri

    2012-01-01

    principles that drive the formulation of therapeutic strategies in glioblastoma. Specifically, the concepts of tumour heterogeneity, oncogene addiction, non-oncogene addiction, tumour initiating cells, tumour microenvironment, non-coding sequences and DNA damage response will be reviewed....

  8. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage....... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  9. 78 FR 57146 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-09-17

    ... Management, LLC, GenOn Mid-Atlantic, LLC, Green Mountain Energy Company, High Plains Ranch II, LLC, Huntley... Revised Service Agreement No. 3452; Queue No. Y1-020 to be effective 8/8/2013. Filed Date: 9/9/13... Agreement No. 3639--Queue Position W4-038 to be effective 8/8/2013. Filed Date: 9/9/13. Accession Number...

  10. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs advanced error correcting techniques.

  11. Self-consistent Analysis of a Blanket and Shielding of a Fusion Reactor Concept

    International Nuclear Information System (INIS)

    Kim, Suk Kwon; Hong, B. G.; Lee, D. W.; Kim, D. H.; Lee, Y. O.

    2008-01-01

    To develop the concept of a DEMO reactor, a tokamak reactor system analysis code has been developed at KAERI (Korea Atomic Energy Research Institute). The system analysis code incorporates the prospects of the development of plasma physics and the relevant technologies in a simple mathematical model, and it helps to develop the concept of a fusion reactor and to identify the R and D areas necessary for a realization of the concept. In the system code, a plant power balance equation and a plasma power balance equation are solved to find plant parameters which simultaneously satisfy the plasma physics and technology constraints. The outcome of the system analysis is to identify which areas of plasma physics and technology should be developed, and to what extent, for a realization of given fusion reactor concepts.

  12. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    Energy Technology Data Exchange (ETDEWEB)

    Mangold, D.

    1993-05-01

    Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and its documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. Following this is a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements. One of them was baselined and qualified within one week. The first of the codes was the multi-phase multi-component flow code TOUGH version 1, an already existing code, and the other was a geochemistry transport code, STATEQ, that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow, step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  13. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    International Nuclear Information System (INIS)

    Mangold, D.

    1993-05-01

    Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and its documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. Following this is a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements. One of them was baselined and qualified within one week. The first of the codes was the multi-phase multi-component flow code TOUGH version 1, an already existing code, and the other was a geochemistry transport code, STATEQ, that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow, step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  14. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  15. CONCEPT-5 user's manual

    International Nuclear Information System (INIS)

    Hudson, C.R. II.

    1979-01-01

    The CONCEPT computer code package was developed to provide conceptual capital cost estimates for nuclear-fueled and fossil-fired power plants. Cost estimates can be made as a function of plant type, size, location, and date of initial operation. The output includes a detailed breakdown of the estimate into direct and indirect costs similar to the accounting system described in document NUS-531. Cost models are currently provided in CONCEPT 5 for single- and multi-unit pressurized-water reactors, boiling-water reactors, and coal-fired plants with and without flue gas desulfurization equipment.

  16. Radio frequency channel coding made easy

    CERN Document Server

    Faruque, Saleh

    2016-01-01

    This book introduces Radio Frequency Channel Coding to a broad audience. The author blends theory and practice to bring readers up-to-date in key concepts, underlying principles and practical applications of wireless communications. The presentation is designed to be easily accessible, minimizing mathematics and maximizing visuals.

  17. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    Science.gov (United States)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

    A linear code is a very basic code and very useful in coding theory. Generally, a linear code is a code over a finite field in the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because it includes some of the best known error-correcting codes. The concept of the Hamming metric has been developed into the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric is different from the Euclidean inner product that is used to define duality in the Hamming metric. Most of the codes which are self-dual in the Hamming metric are not so in the RT-metric. A generator matrix is very important for constructing a code because it contains a basis of the code. Therefore, in this paper we give some theorems and methods to construct self-dual codes in the RT-metric by considering properties of the inner product and the generator matrix. We also illustrate examples for every kind of construction.
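
    For readers unfamiliar with self-duality, the check below illustrates it for a binary code under the standard Euclidean inner product of the Hamming metric; the RT-metric construction in the paper uses a different inner product, so this is only the Hamming-metric analogue.

      import numpy as np

      def is_self_dual(G):
          """G is a k x n generator matrix over GF(2); self-dual needs n = 2k and G G^T = 0 (mod 2)."""
          k, n = G.shape
          return n == 2 * k and not np.any(G.dot(G.T) % 2)

      # extended Hamming [8,4] code, a classical binary self-dual code
      A = np.array([[0, 1, 1, 1],
                    [1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [1, 1, 1, 0]])
      G = np.hstack([np.eye(4, dtype=int), A])
      print(is_self_dual(G))   # True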

  18. Interactive game programming with Python (CodeSkulptor)

    OpenAIRE

    Ajayi, Richard Olugbenga

    2014-01-01

    Over the years, several types of gaming platforms have been created to encourage a more organised and friendly atmosphere for game lovers in various walks of life, culture, and environment. This thesis focuses on the concept of interactive programming using Python. It encourages the use of Python to create simple interactive game applications based on basic human concepts and ideas. CodeSkulptor is a browser-based IDE programming environment that uses the Python programming language. O...

  19. Description and application of the AERIN Code at LLNL

    International Nuclear Information System (INIS)

    King, W.C.

    1986-01-01

    The AERIN code was written at the Lawrence Livermore National Laboratory in 1976 to compute the organ burdens and absorbed doses resulting from a chronic or acute inhalation of transuranic isotopes. The code was revised in 1982 to reflect the concepts of ICRP-30. This paper describes the AERIN code and how it has been used at LLNL to study more than 80 cases of internal deposition and obtain estimates of internal dose. The computed values of committed organ dose are compared with ICRP-30 values. The benefits of using the code are described. 3 refs., 3 figs., 6 tabs

  20. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform totally automated coding, we have implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  1. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  2. Analysis and Multipoint Design of the TCA Concept

    Science.gov (United States)

    Krist, Steven E.; Bauer, Steven X. S.; Buning, Pieter G.

    1999-01-01

    The goal in this effort is to analyze the baseline TCA concept at transonic and supersonic cruise, then apply the natural flow wing design concept to obtain multipoint performance improvements. Analyses are conducted with OVERFLOW, a Navier-Stokes code for overset grids, using PEGSUS to compute the interpolations between the overset grids.

  3. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  4. Reliability in the performance-based concept of fib Model Code 2010

    NARCIS (Netherlands)

    Bigaj-van Vliet, A.; Vrouwenvelder, T.

    2013-01-01

    The design philosophy of the new fib Model Code for Concrete Structures 2010 represents the state of the art with regard to performance-based approach to the design and assessment of concrete structures. Given the random nature of quantities determining structural behaviour, the assessment of

  5. Practicing the Code of Ethics, finding the image of God.

    Science.gov (United States)

    Hoglund, Barbara A

    2013-01-01

    The Code of Ethics for Nurses gives a professional obligation to practice in a compassionate and respectful way that is unaffected by the attributes of the patient. This article explores the concept "made in the image of God" and the complexities inherent in caring for those perceived as exhibiting distorted images of God. While the Code provides a professional standard consistent with a biblical worldview, human nature impacts the ability to consistently act congruently with the Code. Strategies and nursing interventions that support development of practice from a biblical worldview and the Code of Ethics for Nurses are presented.

  6. Flexible digital modulation and coding synthesis for satellite communications

    Science.gov (United States)

    Vanderaar, Mark; Budinger, James; Hoerig, Craig; Tague, John

    1991-01-01

    An architecture and a hardware prototype of a flexible trellis modem/codec (FTMC) transmitter are presented. The theory of operation is built upon a pragmatic approach to trellis-coded modulation that emphasizes power and spectral efficiency. The system incorporates programmable modulation formats, variations of trellis-coding, digital baseband pulse-shaping, and digital channel precompensation. The modulation formats examined include (uncoded and coded) binary phase shift keying (BPSK), quaternary phase shift keying (QPSK), octal phase shift keying (8PSK), 16-ary quadrature amplitude modulation (16-QAM), and quadrature quadrature phase shift keying (Q squared PSK) at programmable rates up to 20 megabits per second (Mbps). The FTMC is part of the developing test bed to quantify modulation and coding concepts.
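
    The kind of coded-modulation building block such a programmable modem generalizes can be sketched in software, as below: a rate-1/2 convolutional encoder (constraint length 3, generators 7 and 5 octal) followed by Gray-mapped QPSK. This is an illustration, not the FTMC hardware or its exact trellis code.

      import numpy as np

      def conv_encode(bits):
          """Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal."""
          s1 = s2 = 0
          out = []
          for b in bits:
              out.append(b ^ s1 ^ s2)   # generator 111 (octal 7)
              out.append(b ^ s2)        # generator 101 (octal 5)
              s1, s2 = b, s1
          return out

      # Gray-mapped QPSK constellation for the coded bit pairs
      GRAY_QPSK = {(0, 0): 1 + 1j, (0, 1): -1 + 1j,
                   (1, 1): -1 - 1j, (1, 0): 1 - 1j}

      def modulate(coded_bits):
          pairs = zip(coded_bits[0::2], coded_bits[1::2])
          return np.array([GRAY_QPSK[p] for p in pairs]) / np.sqrt(2)

      info_bits = [1, 0, 1, 1, 0, 0, 1, 0]
      symbols = modulate(conv_encode(info_bits))
      print(symbols)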

  7. Digital color acquisition, perception, coding and rendering

    CERN Document Server

    Fernandez-Maloigne, Christine; Macaire, Ludovic

    2013-01-01

    In this book the authors identify the basic concepts and recent advances in the acquisition, perception, coding and rendering of color. The fundamental aspects related to the science of colorimetry in relation to physiology (the human visual system) are addressed, as are constancy and color appearance. It also addresses the more technical aspects related to sensors and the color management screen. Particular attention is paid to the notion of color rendering in computer graphics. Beyond color, the authors also look at coding, compression, protection and quality of color images and videos.

  8. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Larry (Fluer, Inc., Paso Robles, CA); Middleton, Bobby

    2009-03-01

    The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  9. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  10. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  11. Optical code-division multiple-access networks

    Science.gov (United States)

    Andonovic, Ivan; Huang, Wei

    1999-04-01

    This review details the approaches adopted to implement classical code division multiple access (CDMA) principles directly in the optical domain, resulting in all optical derivatives of electronic systems. There are a number of ways of realizing all-optical CDMA systems, classified as incoherent and coherent based on spreading in the time and frequency dimensions. The review covers the basic principles of optical CDMA (OCDMA), the nature of the codes used in these approaches and the resultant limitations on system performance with respect to the number of stations (code cardinality), the number of simultaneous users (correlation characteristics of the families of codes), concluding with consideration of network implementation issues. The latest developments will be presented with respect to the integration of conventional time spread codes, used in the bulk of the demonstrations of these networks to date, with wavelength division concepts, commonplace in optical networking. Similarly, implementations based on coherent correlation with the aid of a local oscillator will be detailed and comparisons between approaches will be drawn. Conclusions regarding the viability of these approaches allowing the goal of a large, asynchronous high capacity optical network to be realized will be made.

  12. Psacoin level 1A intercomparison probabilistic system assessment code (PSAC) user group

    International Nuclear Information System (INIS)

    Nies, A.; Laurens, J.M.; Galson, D.A.; Webster, S.

    1990-01-01

    This report describes an international code intercomparison exercise conducted by the NEA Probabilistic System Assessment Code (PSAC) User Group. The PSACOIN Level 1A exercise is the third of a series designed to contribute to the verification of probabilistic codes that may be used in assessing the safety of radioactive waste disposal systems or concepts. Level 1A is based on a more realistic system model than that used in the two previous exercises, and involves deep geological disposal concepts with a relatively complex structure of the repository vault. The report compares results and draws conclusions with regard to the use of different modelling approaches and the possible importance to safety of various processes within and around a deep geological repository. In particular, the relative significance of model uncertainty and data variability is discussed

  13. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, transparency, neutrality, impartiality, effectiveness, accountability, and legality. The normative context of public administration, as expressed in codes, seems to ignore the New Public Management and Reinventing Government reform movements....

  14. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)
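
    The qualitative part of most of the surveyed codes reduces the tree's AND/OR structure to minimal cut sets, the smallest combinations of basic events that fail the top event. The toy sketch below illustrates that reduction on a two-gate tree; it is an illustration only, not the algorithm of any particular code listed, and production codes use far more efficient techniques.

      # Toy minimal-cut-set reduction for a two-gate fault tree (illustration
      # only; the surveyed production codes use far more efficient algorithms).
      from itertools import product

      def cut_sets(gate):
          """Expand an AND/OR gate tree into cut sets of basic events."""
          kind, children = gate
          child_sets = [cut_sets(c) if isinstance(c, tuple) else [frozenset([c])]
                        for c in children]
          if kind == "OR":
              return [cs for group in child_sets for cs in group]
          # AND gate: pick one cut set from each child and take the union
          return [frozenset().union(*combo) for combo in product(*child_sets)]

      def minimal(sets):
          return [c for c in sets if not any(o < c for o in sets)]

      # TOP = (pump A fails AND pump B fails) OR isolation valve fails
      tree = ("OR", [("AND", ["pump A", "pump B"]), "valve"])
      print(minimal(cut_sets(tree)))   # cut sets {pump A, pump B} and {valve}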

  15. Particle tracing code for multispecies gas

    International Nuclear Information System (INIS)

    Eaton, R.R.; Fox, R.L.; Vandevender, W.H.

    1979-06-01

    Details are presented for the development of a computer code designed to calculate the flow of a multispecies gas mixture using particle tracing techniques. The current technique eliminates the need for a full simulation by utilizing local time averaged velocity distribution functions to obtain the dynamic properties for probable collision partners. The development of this concept reduces statistical scatter experienced in conventional Monte Carlo simulations. The technique is applicable to flow problems involving gas mixtures with disparate masses and trace constituents in the Knudsen number, Kn, range from 1.0 to less than 0.01. The resulting code has previously been used to analyze several aerodynamic isotope enrichment devices

  16. Behavior Analysis Usage with Behavior Trees Adoption for Malicious Code Detection on JavaScript Scenarios Example

    Directory of Open Access Journals (Sweden)

    Y. M. Tumanov

    2010-03-01

    Full Text Available The article offers a method of malicious JavaScript code detection based on behavior analysis. The concepts of program behavior and program state, and an algorithm for malicious code detection, are described.

  17. Promoting Transfer of Ecosystems Concepts

    Science.gov (United States)

    Yu, Yawen; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Eberbach, Catherine; Sinha, Suparna

    2016-01-01

    This study examines to what extent students transferred their knowledge from a familiar aquatic ecosystem to an unfamiliar rainforest ecosystem after participating in a technology-rich inquiry curriculum. We coded students' drawings for components of important ecosystems concepts at pre- and posttest. Our analysis examined the extent to which each…

  18. TEACHERS’ AND STUDENTS’ ATTITUDE TOWARD CODE ALTERNATION IN PAKISTANI ENGLISH CLASSROOMS

    Directory of Open Access Journals (Sweden)

    Aqsa Tahir

    2016-11-01

    Full Text Available This research is an attempt to explore students' and teachers' attitudes towards code alternation within English classrooms in Pakistan. In a country like Pakistan, where the official language is English, the national language is Urdu, and every province has its own language, most people are bilingual or multilingual. Therefore, the aim of this study was to find out when and why teachers code-switch in L2 English classrooms. It also explored students' language preferences during second-language learning, as well as teachers' code-switching patterns and the students' priorities. Ten teachers responded to an open-ended questionnaire and 100 students responded to a close-ended questionnaire. The teachers' responses indicated that they mostly code-switch when students' responses, in relation to comprehensibility, are negative and the students do not grasp the concepts easily in L2. They never encourage students to speak Urdu. The students' results showed that they mostly prefer code-switching into their L1 for better understanding and participation in class. Analysis revealed that students favored English only when receiving test instructions, receiving results, and learning grammatical concepts. In most cases, students showed flexibility in language usage. A majority of students (68%) agreed that they learn better when their teachers code-switch into L1.

  19. A Unique Perspective on Data Coding and Decoding

    Directory of Open Access Journals (Sweden)

    Wen-Yan Wang

    2010-12-01

    Full Text Available The concept of a lossless data compression coding method is proposed, and a detailed description of each of its steps follows. Using the Calgary Corpus and Wikipedia data as the experimental samples, and compared with existing algorithms such as PAQ and PPMstr, the new coding method could not only compress the source data, but also further re-compress the data produced by the other compression algorithms. The final files are smaller, and by comparison with the original compression ratio, at least 1% of redundancy could be eliminated. The new method is simple and easy to realize. Its theoretical foundation is currently under study. The corresponding Matlab source code is provided in the Appendix.

  20. SEAPATH: A microcomputer code for evaluating physical security effectiveness using adversary sequence diagrams

    International Nuclear Information System (INIS)

    Darby, J.L.

    1986-01-01

    The Adversary Sequence Diagram (ASD) concept was developed by Sandia National Laboratories (SNL) to examine physical security system effectiveness. Sandia also developed a mainframe computer code, PANL, to analyze the ASD. The authors have developed a microcomputer code, SEAPATH, which also analyzes ASDs. The authors are supporting SNL in software development of the SAVI code; SAVI utilizes the SEAPATH algorithm to identify and quantify paths

  1. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
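
    In its simplest form, the objective stated above, estimating output uncertainty from known input uncertainties, can be pictured as sampling the uncertain inputs and propagating each sample through the response of interest. The sketch below does this for a hypothetical stand-in response (all distributions and constants are assumed values); it only illustrates the idea and is not the statistical machinery built into FRAP.

      # Sampling-based propagation of input uncertainty through a hypothetical
      # response (illustration only; not the statistical machinery in FRAP).
      import numpy as np

      rng = np.random.default_rng(0)

      def toy_clad_temp(q_lin, h_gap, diam=0.01):
          """Hypothetical stand-in for a code-computed output of interest."""
          return 600.0 + q_lin / (np.pi * diam * h_gap)   # coolant T + gap dT

      # Assumed input uncertainties: linear heat rate (W/m), gap conductance (W/m2-K)
      q_lin = rng.normal(20_000.0, 1_000.0, 5_000)
      h_gap = rng.normal(5_000.0, 500.0, 5_000)

      temps = toy_clad_temp(q_lin, h_gap)
      print(f"clad temperature: {temps.mean():.0f} +/- {temps.std():.0f} K (1 sigma)")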

  2. Nuclear component design ontology building based on ASME codes

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    The adoption of ontology analysis in the study of concept knowledge acquisition and representation for the nuclear component design process based on computer-supported cooperative work (CSCW) makes it possible to share and reuse the concept knowledge of numerous multi-disciplinary domains. A practical ontology building method is accordingly proposed, based on the Protege knowledge model in combination with both top-down and bottom-up approaches together with Formal Concept Analysis (FCA). FCA exhibits its advantages in the way it helps establish and improve the taxonomic hierarchy of concepts and resolve concept conflicts that occur when modeling multi-disciplinary domains. With Protege-3.0 as the ontology building tool, a nuclear component design ontology based on ASME codes is developed by utilizing the ontology building method. The ontology serves as the basis for realizing concept knowledge sharing and reuse in nuclear component design. (authors)

  3. Periodic Boundary Conditions in the ALEGRA Finite Element Code

    International Nuclear Information System (INIS)

    Aidun, John B.; Robinson, Allen C.; Weatherby, Joe R.

    1999-01-01

    This document describes the implementation of periodic boundary conditions in the ALEGRA finite element code. ALEGRA is an arbitrary Lagrangian-Eulerian multi-physics code with both explicit and implicit numerical algorithms. The periodic boundary implementation requires a consistent set of boundary input sets which are used to describe virtual periodic regions. The implementation is noninvasive to the majority of the ALEGRA coding and is based on the distributed memory parallel framework in ALEGRA. The technique involves extending the ghost element concept for interprocessor boundary communications in ALEGRA to additionally support on- and off-processor periodic boundary communications. The user interface, algorithmic details and sample computations are given
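
    The extended ghost-element idea can be pictured in one dimension: before every update, the ghost layer at each end of the mesh is filled from the opposite physical end, so the stencil sees a periodic domain. The sketch below is a serial toy (an explicit diffusion update on a 1-D array); ALEGRA performs the equivalent exchange through its distributed-memory communication layer.

      # 1-D sketch of the ghost-cell idea behind periodic boundaries (serial
      # toy; ALEGRA performs the equivalent exchange across processors).
      import numpy as np

      n, nsteps, alpha = 64, 200, 0.2
      u = np.zeros(n + 2)                          # one ghost cell at each end
      u[1:-1] = np.sin(2 * np.pi * np.arange(n) / n)

      for _ in range(nsteps):
          u[0] = u[-2]                             # periodic "communication":
          u[-1] = u[1]                             # ghosts copy the opposite ends
          u[1:-1] += alpha * (u[:-2] - 2.0 * u[1:-1] + u[2:])   # diffusion update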

  4. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    Science.gov (United States)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology on the basis of informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on Earth 3-4 billion years ago, when life originated as a result of chemical and later biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and the signals as means for purposive informational control programs. Social and technical technological systems as informational control systems are a later phenomenon, engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  5. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  6. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions beg the question of why companies choose a strategy based on giving away software assets. Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Conversely, in this paper we investigate empirically what the companies’ incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  7. Transient Analysis Needs for Generation IV Reactor Concepts

    International Nuclear Information System (INIS)

    Siefken, L.J.; Harvego, E.A.; Coryell, E.W.; Davis, C.B.

    2002-01-01

    The importance of nuclear energy as a vital and strategic resource in the U. S. and world's energy supply mix has led to an initiative, termed Generation IV by the U.S. Department of Energy (DOE), to develop and demonstrate new and improved reactor technologies. These new Generation IV reactor concepts are expected to be substantially improved over the current generation of reactors with respect to economics, safety, proliferation resistance and waste characteristics. Although a number of light water reactor concepts have been proposed as Generation IV candidates, the majority of proposed designs have fundamentally different characteristics than the current generation of commercial LWRs operating in the U.S. and other countries. This paper presents the results of a review of these new reactor technologies and defines the transient analyses required to support the evaluation and future development of the Generation IV concepts. The ultimate objective of this work is to identify and develop new capabilities needed by INEEL to support DOE's Generation IV initiative. In particular, the focus of this study is on needed extensions or enhancements to SCDAP/RELAP5/3D code. This code and the RELAP5-3D code from which it evolved are the primary analysis tools used by the INEEL and others for the analysis of design-basis and beyond-design-basis accidents in current generation light water reactors. (authors)

  8. Non-Coding Transcript Heterogeneity in Mesothelioma: Insights from Asbestos-Exposed Mice.

    Science.gov (United States)

    Felley-Bosco, Emanuela; Rehrauer, Hubert

    2018-04-11

    Mesothelioma is an aggressive, rapidly fatal cancer and a better understanding of its molecular heterogeneity may help with making more efficient therapeutic strategies. Non-coding RNAs represent a larger part of the transcriptome but their contribution to diseases is not fully understood yet. We used recently obtained RNA-seq data from asbestos-exposed mice and performed data mining of publicly available datasets in order to evaluate how non-coding RNA contribute to mesothelioma heterogeneity. Nine non-coding RNAs are specifically elevated in mesothelioma tumors and contribute to human mesothelioma heterogeneity. Because some of them have known oncogenic properties, this study supports the concept of non-coding RNAs as cancer progenitor genes.

  9. A Practical View on Tunable Sparse Network Coding

    DEFF Research Database (Denmark)

    Sørensen, Chres Wiant; Shahbaz Badr, Arash; Cabrera Guerrero, Juan Alberto

    2015-01-01

    Tunable sparse network coding (TSNC) constitutes a promising concept for trading off computational complexity and delay performance. This paper advocates for the use of judicious feedback as a key not only to make TSNC practical, but also to deliver a highly consistent and controlled delay perfor...

  10. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    Science.gov (United States)

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  11. Predictive coding in Agency Detection

    DEFF Research Database (Denmark)

    Andersen, Marc Malmdorf

    2017-01-01

    Agency detection is a central concept in the cognitive science of religion (CSR). Experimental studies, however, have so far failed to lend support to some of the most common predictions that follow from current theories on agency detection. In this article, I argue that predictive coding, a highly...... promising new framework for understanding perception and action, may solve pending theoretical inconsistencies in agency detection research, account for the puzzling experimental findings mentioned above, and provide hypotheses for future experimental testing. Predictive coding explains how the brain......, unbeknownst to consciousness, engages in sophisticated Bayesian statistics in an effort to constantly predict the hidden causes of sensory input. My fundamental argument is that most false positives in agency detection can be seen as the result of top-down interference in a Bayesian system generating high...

  12. Integrated burnup calculation code system SWAT

    International Nuclear Information System (INIS)

    Suyama, Kenya; Hirakawa, Naohiro; Iwasaki, Tomohiko.

    1997-11-01

    SWAT is an integrated burnup code system developed for the analysis of post-irradiation examinations, transmutation of radioactive waste, and burnup credit problems. It enables us to analyze burnup problems using a neutron spectrum that depends on the irradiation environment, by combining SRAC, the Japanese standard thermal reactor analysis code system, and ORIGEN2, a burnup code widely used all over the world. SWAT builds an effective cross-section library based on the SRAC results and performs the burnup analysis with ORIGEN2 using that library. SRAC and ORIGEN2 can be called as external modules. SWAT has an original cross-section library based on JENDL-3.2 and libraries of fission yield and decay data prepared from the JNDC FP Library, second version. Using these libraries, the user can use the latest data in SWAT calculations in addition to the effective cross sections prepared by SRAC. The user can also make an original ORIGEN2 library using the SWAT output file. This report presents the concept and user's manual of SWAT. (author)

  13. Concept development for HLW disposal research tunnel

    International Nuclear Information System (INIS)

    Queon, S. K.; Kim, K. S.; Park, J. H.; Jeo, W. J.; Han, P. S.

    2003-01-01

    In order to dispose of high-level radioactive waste in a geological formation, it is necessary to assess the safety of a disposal concept by excavating a research tunnel in the same geological formation as the host rock mass. The design concept of a research tunnel depends on the actual disposal concept, the repository geometry, the experiments to be carried out in the tunnel, and the geological conditions. In this study, analysis of the characteristics of the disposal research tunnel, which is planned to be constructed at the KAERI site, calculation of the influence of blasting impact on neighboring facilities, and computer simulation for mechanical stability analysis using a three-dimensional code, FLAC3D, have been carried out to develop the design concept of the research tunnel

  14. Applying Physical-Layer Network Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Liew SoungChang

    2010-01-01

    Full Text Available A main distinguishing feature of a wireless network compared with a wired network is its broadcast nature, in which the signal transmitted by a node may reach several other nodes, and a node may receive signals from several other nodes, simultaneously. Rather than a blessing, this feature is treated more as an interference-inducing nuisance in most wireless networks today (e.g., IEEE 802.11. This paper shows that the concept of network coding can be applied at the physical layer to turn the broadcast property into a capacity-boosting advantage in wireless ad hoc networks. Specifically, we propose a physical-layer network coding (PNC scheme to coordinate transmissions among nodes. In contrast to "straightforward" network coding which performs coding arithmetic on digital bit streams after they have been received, PNC makes use of the additive nature of simultaneously arriving electromagnetic (EM waves for equivalent coding operation. And in doing so, PNC can potentially achieve 100% and 50% throughput increases compared with traditional transmission and straightforward network coding, respectively, in 1D regular linear networks with multiple random flows. The throughput improvements are even larger in 2D regular networks: 200% and 100%, respectively.
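
    The quoted gains come from the two-way relay exchange: both end nodes transmit simultaneously, the relay maps the superimposed signal directly to the XOR of the two packets, and each end node strips off its own data. The noiseless BPSK sketch below illustrates only this mapping (it is not the full PNC scheme of the paper); completing the exchange in two slots instead of four (traditional relaying) or three (straightforward network coding) is what yields the 100% and 50% figures.

      # Noiseless BPSK sketch of the two-way relay mapping behind PNC
      # (illustration only, not the full scheme analysed in the paper).
      import numpy as np

      rng = np.random.default_rng(3)
      bits_a = rng.integers(0, 2, 8)
      bits_b = rng.integers(0, 2, 8)

      # Both end nodes transmit simultaneously; the relay sees the EM superposition.
      superposed = (1 - 2 * bits_a) + (1 - 2 * bits_b)    # BPSK sum in {-2, 0, +2}

      # PNC mapping at the relay: the superposition alone reveals the XOR.
      xor_at_relay = (superposed == 0).astype(int)

      # Each end node strips off its own bits to recover the other's packet.
      recovered_b = xor_at_relay ^ bits_a
      assert np.array_equal(recovered_b, bits_b)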

  15. A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory

    Science.gov (United States)

    Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin

    2015-09-01

    Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID based concepts are insufficient. Thus, a fully run-time configurable, high performance, dependable storage concept is presented, requiring only a minimal set of logic or software. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.
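
    The recovery idea behind any erasure-coded store can be seen in its simplest instance, a single XOR parity block per stripe, from which any one lost block can be rebuilt. The sketch below shows only that minimal case; the composite codes proposed in the paper tolerate more failures, but the reconstruction principle is the same.

      # Simplest erasure-coding instance: one XOR parity block per stripe, from
      # which any single lost block can be rebuilt (the proposed concept composes
      # stronger codes, but the recovery principle is the same).
      def make_parity(blocks):
          parity = bytes(len(blocks[0]))
          for block in blocks:
              parity = bytes(x ^ y for x, y in zip(parity, block))
          return parity

      def recover(blocks, parity, lost):
          """Rebuild the single missing block from the survivors and the parity."""
          rebuilt = parity
          for i, block in enumerate(blocks):
              if i != lost:
                  rebuilt = bytes(x ^ y for x, y in zip(rebuilt, block))
          return rebuilt

      stripe = [b"page-000", b"page-001", b"page-002"]
      parity = make_parity(stripe)
      assert recover(stripe, parity, lost=1) == stripe[1]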

  16. Code of practice against radiation hazards at PINSTECH

    International Nuclear Information System (INIS)

    Mubarak, M.A.; Javed, M.; Ahmad, S.

    1982-10-01

    It is the radiation safety policy of PAEC/PINSTECH that all radiation exposure should be kept as low as reasonably achievable (ALARA). A code of practice against radiation hazards at PINSTECH was written in 1972 which regulated the conduct of radiation work at PINSTECH. Since the radiation work at PINSTECH has greatly increased, it was considered necessary to revise the code so as to incorporate the new concepts in this field as well as to help meet the present requirements of radiation protection. The procedures set forth in this code are mandatory and in no case should any of them be deviated from, except under an emergency situation, which may be handled according to procedures laid down in a separate manual ''Emergency Procedures at PARR-PINSTECH'' (PINSTECH/HP--19). All those supervising or performing any kind of radiation work are required to study and adhere to these procedures. A copy of this code should be kept in every radiation laboratory for ready reference. (author)

  17. SCORCH - a zero dimensional plasma evolution and transport code for use in small and large tokamak systems

    International Nuclear Information System (INIS)

    Clancy, B.E.; Cook, J.L.

    1984-12-01

    The zero-dimensional code SCORCH determines number density and temperature evolution in plasmas using concepts derived from the Hinton and Hazeltine transport theory. The code uses the previously reported ADL-1 data library

  18. An explication of the Graphite Structural Design Code of core components for the High Temperature Engineering Test Reactor

    International Nuclear Information System (INIS)

    Iyoku, Tatsuo; Ishihara, Masahiro; Toyota, Junji; Shiozawa, Shusaku

    1991-05-01

    The integrity evaluation of the core graphite components for the High Temperature Engineering Test Reactor (HTTR) will be carried out based upon the Graphite Structural Design Code for core components. In the application of this design code, it is necessary to clarify the basic concepts used to evaluate the integrity of the HTTR core components. Therefore, considering the detailed design of the HTTR core graphite structures, such as the fuel graphite blocks, this report explicates the design code in detail with respect to the concepts of stress and fatigue limits, the integrity evaluation method for oxidized graphite components, the thermal and irradiation stress analysis methods, etc. (author)

  19. Sparsity in Linear Predictive Coding of Speech

    DEFF Research Database (Denmark)

    Giacobello, Daniele

    of the effectiveness of their application in audio processing. The second part of the thesis deals with introducing sparsity directly in the linear prediction analysis-by-synthesis (LPAS) speech coding paradigm. We first propose a novel near-optimal method to look for a sparse approximate excitation using a compressed...... one with direct applications to coding but also consistent with the speech production model of voiced speech, where the excitation of the all-pole filter can be modeled as an impulse train, i.e., a sparse sequence. Introducing sparsity in the LP framework will also bring to develop the concept...... sensing formulation. Furthermore, we define a novel re-estimation procedure to adapt the predictor coefficients to the given sparse excitation, balancing the two representations in the context of speech coding. Finally, the advantages of the compact parametric representation of a segment of speech, given

  20. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    It is hard to determine a full energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. That is why simulation and semi-empirical methods have been preferred so far, and many works have progressed in various ways. Moens et al. introduced the concept of the effective solid angle by considering the attenuation effect of γ-rays in the source, media and detector. This concept is based on a semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve determined by using a standard point source to that for a volume source. To test the performance of the ESA code, we measured point standard sources and voluminous certified reference material (CRM) γ-ray sources, and compared the results with the efficiency curves obtained in this study. The 200-1500 keV energy region is fitted well. NIST X-ray mass attenuation coefficient data are currently used to check for the effect of linear attenuation only. We will use the interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering and coherent scattering, in the future. In order to minimize the calculation time and simplify the code, optimization of the algorithm is needed.
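
    The effective solid angle weights every source-to-detector direction by its attenuation survival probability before integrating over the detector. The sketch below evaluates this numerically for the simplest geometry, a point source on the axis of a circular detector face behind a uniform absorber layer; the geometry and attenuation data are assumed toy values and the calculation is an illustration, not the ESA code.

      # Numerical sketch of an attenuation-weighted ("effective") solid angle for
      # a point source on the axis of a circular detector face behind an absorber
      # layer; all geometry and attenuation data are assumed toy values.
      import numpy as np

      h, R = 5.0, 2.5      # source-to-detector distance and detector radius (cm)
      mu, d = 0.15, 0.5    # absorber attenuation coefficient (1/cm) and thickness (cm)

      theta = np.linspace(0.0, np.arctan(R / h), 2000)
      weight = np.exp(-mu * d / np.cos(theta))      # survival probability per direction

      geometric = np.trapz(2 * np.pi * np.sin(theta), theta)
      effective = np.trapz(weight * 2 * np.pi * np.sin(theta), theta)
      print(f"geometric = {geometric:.4f} sr, effective = {effective:.4f} sr")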

  1. Evaluating and integrating corporate social responsibility standards: Implications for CSR concepts

    Directory of Open Access Journals (Sweden)

    Markus Stiglbauer

    2012-03-01

    Full Text Available Standards play a major role when concepts of corporate social responsibility (CSR) are to be implemented and corporate social performance (CSP) is to be assessed. Ethical reasoning and stakeholders' expectations help to measure companies' intentions to implement CSR standards and to measure their efficiency. Given the different standards of CSR (company standards, industry standards, multi-stakeholder standards and independent standards) that companies may implement, we categorize and evaluate those standards and give advice on which opportunities, but also threats, may arise for companies when implementing such codes within firm-specific CSR concepts. We suggest a combination of different standards, supplemented with firm-specific codes of conduct.

  2. Simulation of Water Chemistry using a Geochemistry Code, PHREEQE

    Energy Technology Data Exchange (ETDEWEB)

    Chi, J.H. [Korea Electric Power Research Institute, Taejeon (Korea)

    2001-07-01

    This report introduces the principles and procedures of water chemistry simulation using a geochemistry code, PHREEQE. As an example of the application of this code, we describe the simulation procedure for titration of an aquatic sample with a strong acid to investigate the state of carbonates in the aqueous solution. The major contents of this report are as follows: concepts and principles of PHREEQE; the kinds of chemical reactions that may be properly simulated by PHREEQE; the definition and meaning of each input datum; and an example of a simulation using PHREEQE. (author). 2 figs., 1 tab.

  3. Concept Evaluation for Hydraulic Yaw System

    DEFF Research Database (Denmark)

    Stubkier, Søren; Pedersen, Henrik C.; Andersen, Torben Ole

    2013-01-01

    The yaw system is the subsystem on a wind turbine which ensures that the rotor plane of the turbine is always facing the wind direction. Studies from [1] show that a soft yaw system may be utilized to dampen the loads in the wind turbine structure. The soft yaw system operates much like a suspension system on a car, leading the loads away from the turbine structure. However, to realize a soft hydraulic yaw system a new design concept must be found. As a part of the development of the new concept a preliminary concept evaluation has been conducted, evaluating seven different hydraulic yaw ... investigation. Loads and yaw demands are based on the IEC 61400-1 standard for wind turbine design, and the loads for this examination are extrapolated from the HAWC2 aeroelastic design code. The concepts are based on a 5 MW off-shore turbine.

  4. Perspective on the audit calculation for SFR using TRACE code

    Energy Technology Data Exchange (ETDEWEB)

    Shin, An Dong; Choi, Yong Won; Bang, Young Suk; Bae, Moo Hoon; Huh, Byung Gil; Seol, Kwang One [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    The Korean sodium-cooled fast reactor (SFR) is being developed by KAERI. The prototype SFR will be the first SFR submitted for licensing. KINS recently started research programs to prepare for the licensing of new design concepts. Safety analysis for a given reactor is based on computational estimation with conservatism and/or modeling uncertainty. For audit calculations for the sodium-cooled fast reactor (SFR), the TRACE code is considered as one of the analytical tools for SFRs, since TRACE already contains sodium-related properties and models and has been applied to liquid-metal coolant systems abroad. The applicability of the TRACE code to SFRs is checked before the real audit calculations. In this study, steady-state conditions of the Demonstration Fast Reactor (DFR) 600 are simulated to identify areas where the TRACE modeling needs improvement.

  5. Energy-Efficient Cluster Based Routing Protocol in Mobile Ad Hoc Networks Using Network Coding

    OpenAIRE

    Srinivas Kanakala; Venugopal Reddy Ananthula; Prashanthi Vempaty

    2014-01-01

    In mobile ad hoc networks, all nodes are energy constrained. In such situations, it is important to reduce energy consumption. In this paper, we consider the issues of energy efficient communication in MANETs using network coding. Network coding is an effective method to improve the performance of wireless networks. COPE protocol implements network coding concept to reduce number of transmissions by mixing the packets at intermediate nodes. We incorporate COPE into cluster based routing proto...

  6. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Tu Zhenyu

    2005-01-01

    Full Text Available A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty in the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder to efficiently exploit an existing linear channel code without the need to modify the code structure or the decoding strategy. For most channel codes, the construction of SF-ISF pairs is a light task. For parallelly and serially concatenated codes, and particularly parallel and serial turbo codes where this appears less obvious, an efficient way of constructing linear-complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provenly optimal, and generally applicable to any linear channel code. Simulation using conventional and asymmetric turbo codes demonstrates a compression rate that is only 0.06 bit/symbol from the theoretical limit, which is among the best results reported so far.
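
    The binning idea can be demonstrated on a toy scale with a (7,4) Hamming code: the encoder transmits only the 3-bit syndrome of the 7-bit source word, and the decoder combines it with correlated side information to recover the word, provided the source and side information differ in at most one bit. The sketch below is that toy case only; the paper's SF-ISF construction applies the same principle to concatenated and turbo codes.

      # Toy syndrome-based binning with a (7,4) Hamming code; the SF-ISF paper
      # applies the same idea to concatenated and turbo codes.
      import numpy as np

      H = np.array([[1, 0, 1, 0, 1, 0, 1],     # parity-check matrix of the
                    [0, 1, 1, 0, 0, 1, 1],     # (7,4) Hamming code
                    [0, 0, 0, 1, 1, 1, 1]])

      def syndrome(v):
          return H @ v % 2

      x = np.array([1, 0, 1, 1, 0, 0, 1])      # 7-bit source word
      y = x.copy(); y[4] ^= 1                  # side information: x with <= 1 bit flipped

      s = syndrome(x)                          # syndrome former: only 3 bits are sent

      diff = (syndrome(y) + s) % 2             # syndrome of the error pattern e = x XOR y
      e = np.zeros(7, dtype=int)
      if diff.any():                           # locate the single differing position
          e[[np.array_equal(H[:, j], diff) for j in range(7)].index(True)] = 1
      assert np.array_equal((y + e) % 2, x)    # source recovered from syndrome + side info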

  7. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Smith, A.B. [ed.]; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  8. THEORETICAL AND PRACTICAL APPROACHES REGARDING THE ADOPTION OF CORPORATE GOVERNANCE CODES

    OpenAIRE

    Sorin Nicolae Borlea; Monica-Violeta Achim; Ludovica Breban

    2013-01-01

    In the European Union, the concept of corporate governance began to emerge more clearly after 1997, when most countries voluntarily adopted corporate governance codes. The impulse for adopting these codes came from the financial scandals surrounding the failure of British companies listed on the stock exchange. Numerous scandals involving big companies such as Enron, WorldCom, Parmalat, Xerox, Merrill Lynch, Andersen and so on led to a lack of investor confidence. ...

  9. Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography

    Science.gov (United States)

    Aydin, Nuh

    2009-01-01

    The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…

  10. Special issue on network coding

    Science.gov (United States)

    Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly

    2017-12-01

    Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as data multi-view streaming of video or underwater sensor networks. One can find papers that show how NC turns data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.

  11. Quantum computation with topological codes from qubit to topological fault-tolerance

    CERN Document Server

    Fujii, Keisuke

    2015-01-01

    This book presents a self-consistent review of quantum computation with topological quantum codes. The book covers everything required to understand topological fault-tolerant quantum computation, ranging from the definition of the surface code to topological quantum error correction and topological fault-tolerant operations. The underlying basic concepts and powerful tools, such as universal quantum computation, quantum algorithms, stabilizer formalism, and measurement-based quantum computation, are also introduced in a self-consistent way. The interdisciplinary fields between quantum information and other fields of physics such as condensed matter physics and statistical physics are also explored in terms of the topological quantum codes. This book thus provides the first comprehensive description of the whole picture of topological quantum codes and quantum computation with them.

  12. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The lead-cooled fast reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify the current computational codes developed for the simulation of light water reactors towards applicability to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  13. Innovation and Standardization in School Building: A Proposal for the National Code in Italy.

    Science.gov (United States)

    Ridolfi, Giuseppe

    This document discusses the University of Florence's experience and concepts as it developed the research to define a proposal for designing a new national school building code. Section 1 examines the current school building code and the Italian Reform Process in Education between 1960 and 2000. Section 2 details and explains the new school…

  14. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  15. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards-an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
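
    The prediction-error signal described is the same quantity that drives simple error-correcting learning rules: the error is the received reward minus the predicted reward, and the prediction is nudged in proportion to it. The sketch below is a minimal Rescorla-Wagner-style illustration with an assumed learning rate and reward sequence, not a model fitted to the neuronal data.

      # Minimal prediction-error learning sketch (Rescorla-Wagner / TD flavour);
      # learning rate and reward sequence are illustrative, not fitted to data.
      value = 0.0          # current reward prediction for a cue
      alpha = 0.2          # learning rate

      for reward in [1.0, 1.0, 1.0, 0.0, 1.0]:
          delta = reward - value          # prediction error: received minus predicted
          value += alpha * delta          # positive error raises the prediction,
                                          # negative error lowers it
          print(f"reward={reward:.1f}  error={delta:+.2f}  new prediction={value:.2f}")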

  16. A GFR benchmark comparison of transient analysis codes based on the ETDR concept

    International Nuclear Information System (INIS)

    Bubelis, E.; Coddington, P.; Castelliti, D.; Dor, I.; Fouillet, C.; Geus, E. de; Marshall, T.D.; Van Rooijen, W.; Schikorr, M.; Stainsby, R.

    2007-01-01

    A GFR (Gas-cooled Fast Reactor) transient benchmark study was performed to investigate the ability of different code systems to calculate the transition in the core heat removal from the main circuit forced flow to natural circulation cooling using the Decay Heat Removal (DHR) system. This benchmark is based on a main blower failure in the Experimental Technology Demonstration Reactor (ETDR) with reactor scram. The codes taking part into the benchmark are: RELAP5, TRAC/AAA, CATHARE, SIM-ADS, MANTA and SPECTRA. For comparison purposes the benchmark was divided into several stages: the initial steady-state solution, the main blower flow run-down, the opening of the DHR loop and the transition to natural circulation and finally the 'quasi' steady heat removal from the core by the DHR system. The results submitted by the participants showed that all the codes gave consistent results for all four stages of the benchmark. In the steady-state the calculations revealed some differences in the clad and fuel temperatures, the core and main loop pressure drops and in the total Helium mass inventory. Also some disagreements were observed in the Helium and water flow rates in the DHR loop during the final natural circulation stage. Good agreement was observed for the total main blower flow rate and Helium temperature rise in the core, as well as for the Helium inlet temperature into the core. In order to understand the reason for the differences in the initial 'blind' calculations a second round of calculations was performed using a more precise set of boundary conditions

  17. Application of the NJOY code for unresolved resonance treatment in the MCNP utility code

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J. . E-mail addresses of corresponding authors: mmilos@vin.bg.ac.yu , vujic@nuc.berkeley.edu ,; Milosevic, M.; Vujic, J.)

    2005-01-01

    There are numerous uncertainties in the prediction of neutronic characteristics of reactor cores, particularly in the case of innovative reactor designs, arising from approximations used in the solution of the transport equation, and in nuclear data processing and cross-section library generation. This paper describes the problems encountered in the analysis of the Encapsulated Nuclear Heat Source (ENHS) benchmark core and the new procedures and cross-section libraries developed to overcome these problems. The ENHS is a novel lead-bismuth- or lead-cooled reactor concept that is fuelled with a metallic alloy of Pu, U and Zr, and is designed to operate for 20 effective full-power years without refuelling and with a very small burnup reactivity swing. The computational tools benchmarked include: MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code with MCNP data libraries based on the ENDF/B-VI evaluations; and KWO2, a coupled KENO-V.a and ORIGEN2.1 code with an ENDF/B-V.2 based 238-group library. Calculations made for the ENHS benchmark have shown that the differences between the results obtained using different code systems and cross-section libraries are significant and should be taken into account in assessing the quality of nuclear data libraries. (author)

  18. Integrated Validation System for a Thermal-hydraulic System Code, TASS/SMR-S

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee-Kyung; Kim, Hyungjun; Kim, Soo Hyoung; Hwang, Young-Dong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Hyeon-Soo [Chungnam National University, Daejeon (Korea, Republic of)

    2015-10-15

    Development, including enhancement and modification, of a thermal-hydraulic system computer code is indispensable for a new reactor such as SMART. Usually, thermal-hydraulic system code validation is achieved by comparison with the results of corresponding physical effect tests. In the reactor safety field, a similar concept, referred to as separate effect tests, has been used for a long time. However, there are very many test data to compare against, because a large number of separate effect tests and integral effect tests are required for a code validation, and it is not easy for a code developer to validate a computer code whenever a code modification occurs. IVS automatically produces graphs that compare the code calculation results with the corresponding test results. IVS was developed for the validation of the TASS/SMR-S code: the code validation is achieved by comparing code calculation results with the corresponding test results, and the comparison is presented as graphs for convenience. IVS is useful before the release of a new code version, since the code developer can validate code results easily using IVS. Even during code development, IVS can be used to validate code modifications, so the code developer can gain confidence in a modification easily and quickly and is freed from tedious and lengthy validation work. The popular software introduced in IVS provides good usability and portability.
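
    The automation described amounts to, for every validation case, reading the calculated and measured time histories, overlaying them, and saving the figure. The sketch below shows that pattern in generic form; the case names, file layout and column names are assumptions for illustration and do not reflect the actual IVS implementation.

      # Generic sketch of automated calculation-vs-test comparison plots; the case
      # names, file names and column labels are placeholders, not actual IVS paths.
      import pandas as pd
      import matplotlib.pyplot as plt

      cases = ["natural_circulation", "loss_of_feedwater"]     # hypothetical cases

      for case in cases:
          calc = pd.read_csv(f"{case}_calc.csv")               # code results
          test = pd.read_csv(f"{case}_test.csv")               # corresponding test data
          fig, ax = plt.subplots()
          ax.plot(calc["time"], calc["value"], label="calculation")
          ax.plot(test["time"], test["value"], "o", ms=3, label="experiment")
          ax.set_xlabel("time (s)"); ax.set_ylabel("parameter"); ax.set_title(case)
          ax.legend()
          fig.savefig(f"{case}_comparison.png", dpi=150)
          plt.close(fig)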

  19. Concept of APDL, the atomic process description language

    International Nuclear Information System (INIS)

    Sasaki, Akira

    2004-01-01

    The concept of APDL, the Atomic Process Description Language, which provides simple and complete description of atomic model is presented. The syntax to describe electron orbital and configuration is defined for the use in the atomic structure, kinetics and spectral synthesis simulation codes. (author)

  20. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  1. Lattices applied to coding for reliable and secure communications

    CERN Document Server

    Costa, Sueli I R; Campello, Antonio; Belfiore, Jean-Claude; Viterbo, Emanuele

    2017-01-01

    This book provides a first course on lattices – mathematical objects pertaining to the realm of discrete geometry, which are of interest to mathematicians for their structure and, at the same time, are used by electrical and computer engineers working on coding theory and cryptography. The book presents both fundamental concepts and a wealth of applications, including coding and transmission over Gaussian channels, techniques for obtaining lattices from finite prime fields and quadratic fields, constructions of spherical codes, and hard lattice problems used in cryptography. The topics selected are covered in a level of detail not usually found in reference books. As the range of applications of lattices continues to grow, this work will appeal to mathematicians, electrical and computer engineers, and graduate or advanced undergraduate students in these fields.

  2. SAFIRE: A systems analysis code for ICF [inertial confinement fusion] reactor economics

    International Nuclear Information System (INIS)

    McCarville, T.J.; Meier, W.R.; Carson, C.F.; Glasgow, B.B.

    1987-01-01

    The SAFIRE (Systems Analysis for ICF Reactor Economics) code incorporates analytical models for scaling the cost and performance of several inertial confinement fusion reactor concepts for electric power. The code allows us to vary design parameters (e.g., driver energy, chamber pulse rate, net electric power) and evaluate the resulting change in capital cost of power plant and the busbar cost of electricity. The SAFIRE code can be used to identify the most attractive operating space and to identify those design parameters with the greatest leverage for improving the economics of inertial confinement fusion electric power plants
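
    The busbar cost of electricity that such a systems code reports is, in its simplest form, the annualized capital charge plus operating and fuel costs divided by the net annual generation. The sketch below evaluates that relation with assumed input values; it illustrates the type of figure of merit SAFIRE scans over design parameters, not the SAFIRE cost models themselves.

      # Simplified busbar cost-of-electricity relation of the kind a systems code
      # evaluates while scanning design parameters (all input values are assumed).
      def busbar_cost(capital, fixed_charge_rate, annual_om, annual_fuel,
                      net_mwe, capacity_factor):
          annual_kwh = net_mwe * 1.0e3 * 8760.0 * capacity_factor
          annual_cost = capital * fixed_charge_rate + annual_om + annual_fuel
          return 100.0 * annual_cost / annual_kwh          # cents per kWh

      coe = busbar_cost(capital=3.0e9, fixed_charge_rate=0.10,
                        annual_om=8.0e7, annual_fuel=3.0e7,
                        net_mwe=1000.0, capacity_factor=0.75)
      print(f"busbar cost of electricity: {coe:.1f} cents/kWh")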

  3. The metaethics of nursing codes of ethics and conduct.

    Science.gov (United States)

    Snelling, Paul C

    2016-10-01

    Nursing codes of ethics and conduct are features of professional practice across the world, and in the UK, the regulator has recently consulted on and published a new code. Initially part of a professionalising agenda, nursing codes have recently come to represent a managerialist and disciplinary agenda and nursing can no longer be regarded as a self-regulating profession. This paper argues that codes of ethics and codes of conduct are significantly different in form and function, similar to the difference between ethics and law in everyday life. Some codes successfully integrate these two functions within the same document, while others, principally the UK Code, conflate them, resulting in an ambiguous document unable to fulfil its functions effectively. The paper analyses the differences between ethical-codes and conduct-codes by discussing titles, authorship, level, scope for disagreement, consequences of transgression, language and finally, and possibly most importantly, agent-centeredness. It is argued that conduct-codes cannot require nurses to be compassionate because compassion involves an emotional response. The concept of kindness provides a plausible alternative for conduct-codes as it is possible to understand it solely in terms of acts. But if kindness is required in conduct-codes, investigation and possible censure follows from its absence. Using examples it is argued that there are at least five possible accounts of the absence of kindness. As well as being potentially problematic for disciplinary panels, difficulty in understanding the features of blameworthy absence of kindness may challenge UK nurses who, following a recently introduced revalidation procedure, are required to reflect on their practice in relation to The Code. It is concluded that closer attention to metaethical concerns by code writers will better support the functions of their issuing organisations. © 2016 John Wiley & Sons Ltd.

  4. Two modified versions of the speciation code PHREEQE for modelling macromolecule-proton/cation interaction

    International Nuclear Information System (INIS)

    Falck, W.E.

    1991-01-01

    There is a growing need to consider the influence of organic macromolecules on the speciation of ions in natural waters. It is recognized that a simple discrete ligand approach to the binding of protons/cations to organic macromolecules is not appropriate to represent heterogeneities of binding site distributions. A more realistic approach has been incorporated into the speciation code PHREEQE which retains the discrete ligand approach but modifies the binding intensities using an electrostatic (surface complexation) model. To allow for different conformations of natural organic material two alternative concepts have been incorporated: it is assumed that (a) the organic molecules form rigid, impenetrable spheres, and (b) the organic molecules form flat surfaces. The former concept will be more appropriate for molecules in the smaller size range, while the latter will be more representative for larger size molecules or organic surface coatings. The theoretical concept is discussed and the relevant changes to the standard PHREEQE code are explained. The modified codes are called PHREEQEO-RS and PHREEQEO-FS for the rigid-sphere and flat-surface models respectively. Improved output facilities for data transfer to other computers, e.g. the Macintosh, are introduced. Examples where the model is tested against literature data are shown and practical problems are discussed. Appendices contain listings of the modified subroutines GAMMA and PTOT, an example input file and an example command procedure to run the codes on VAX computers
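
    For orientation, the electrostatic idea can be sketched as follows: an intrinsic binding constant is corrected by the Boltzmann factor exp(-zF*psi/RT), with the surface potential psi taken from either a flat-surface (Gouy-Chapman) or a rigid-sphere (linearised double-layer) expression. The Python snippet below is a textbook-level sketch with assumed parameter values, not the actual PHREEQEO-RS/PHREEQEO-FS implementation:

      # Illustrative electrostatic correction of an intrinsic proton-binding constant,
      # in the spirit of the rigid-sphere / flat-surface options described above.  The
      # formulas are textbook simplifications (Gouy-Chapman for a flat surface, a
      # linearised double layer for a rigid sphere), not the modified PHREEQE code.
      import math

      F = 96485.0              # Faraday constant, C/mol
      R = 8.314                # gas constant, J/(mol K)
      EPS = 78.5 * 8.854e-12   # permittivity of water, F/m

      def kappa(ionic_strength_molar, temp_k=298.15):
          """Inverse Debye length (1/m) for a 1:1 electrolyte."""
          return math.sqrt(2.0 * F**2 * ionic_strength_molar * 1e3 / (EPS * R * temp_k))

      def psi_flat(sigma, ionic_strength, temp_k=298.15):
          """Gouy-Chapman potential (V) of a flat surface with charge density sigma (C/m2)."""
          c = math.sqrt(8.0 * EPS * R * temp_k * ionic_strength * 1e3)
          return (2.0 * R * temp_k / F) * math.asinh(sigma / c)

      def psi_sphere(sigma, radius_m, ionic_strength, temp_k=298.15):
          """Linearised potential (V) at the surface of a rigid, impenetrable charged sphere."""
          k = kappa(ionic_strength, temp_k)
          return sigma * radius_m / (EPS * (1.0 + k * radius_m))

      def log_k_apparent(log_k_intrinsic, psi, charge=1, temp_k=298.15):
          """Apparent binding constant after the Boltzmann correction exp(-zF*psi/RT)."""
          return log_k_intrinsic - charge * F * psi / (math.log(10.0) * R * temp_k)

      sigma = -0.05            # C/m2, assumed surface charge density
      for label, psi in (("flat surface", psi_flat(sigma, 0.01)),
                         ("rigid sphere", psi_sphere(sigma, 1e-9, 0.01))):
          print("%s: psi = %.3f V, apparent logK = %.2f" % (label, psi, log_k_apparent(4.0, psi)))

    The rigid-sphere potential falls off with decreasing radius and increasing ionic strength, which is why the two conformation models yield different apparent binding constants for the same intrinsic chemistry.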

  5. Contribution to the study of 233U production with MOX-ThPu fuel in PWR reactor. Transition scenarios towards Th/233U iso-generating concepts in thermal spectrum. Development of the MURE fuel evolution code; Contribution a l'etude de la production d'233U en combustible MOX-ThPu en reacteur a eau sous pression. Scenarios de transition vers des concepts isogenerateurs Th/233U en spectre thermique. Developpement du code MURE d'evolution du combustible

    Energy Technology Data Exchange (ETDEWEB)

    Michel-Sendis, F

    2006-12-15

    If nuclear power is to provide a significant fraction of the growing world energy demand, only through the breeding concept can the development of sustainable nuclear energy become a reality. The study of such a transition, from present-day nuclear technologies to future breeding concepts, is therefore pertinent. Among these future concepts, those using the thorium cycle Th/U-233 in a thermal neutron spectrum are of particular interest; molten-salt type thermal reactors would allow for breeding while requiring comparatively low initial inventories of U-233. The upstream production of U-233 can be obtained through the use of thorium-plutonium mixed oxide fuel in present-day light water reactors. This work presents, firstly, the development of the MURE evolution code system, a C++ object-oriented code that allows the study, through Monte Carlo (M.C.) simulation, of nuclear reactors and the evolution of their fuel under neutron irradiation. The M.C. methods are well suited for the study of any reactor, whether it be an existing reactor using a new kind of fuel or an entirely new concept; the simulation is only dependent on nuclear data. Exact and complex geometries can be simulated and continuous-energy particle transport is performed. MURE interfaces with MCNP, the well-known and validated transport code, and allows, among other functionalities, the simulation of constant-power and constant-reactivity evolutions. Secondly, the study of MOX-ThPu fuel in a conventional light water reactor (REP) is presented; it explores different plutonium concentrations and isotopic qualities in order to evaluate their safety characteristics. Simulation of their evolution allows us to quantify the production of U-233 at the end of burnup. Last, different French scenarios validating a possible transition towards a fleet of thermal Th/U-233 breeders are presented. In these scenarios, U-233 is produced in ThPu-MOX-fuelled light water reactors. (author)

  6. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A. R. Calderbank, P. W. Shor and A. M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  7. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consist of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.
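
    A minimal sketch of the two-step lookup (in Python rather than the original FoxBASE dialect; all dictionary entries and names are invented placeholders, only the composite code format follows the example above):

      # Toy two-step ACR-style coding: find an organ code, open the pathology file
      # selected by the organ code's first digit, find a pathology code, and join the
      # two as 'organ.pathology'.  Dictionary contents are invented placeholders.
      ORGAN_CODES = {"131": "example organ"}
      PATHOLOGY_FILES = {"1": {"3661": "example pathology"}}   # keyed by first digit of organ code

      def acr_code(organ_query, pathology_query):
          organ = next((code for code, name in ORGAN_CODES.items()
                        if organ_query == code or organ_query in name), None)
          if organ is None:
              raise KeyError("unknown organ: %r" % organ_query)
          pathology_file = PATHOLOGY_FILES[organ[0]]
          pathology = next((code for code, name in pathology_file.items()
                            if pathology_query == code or pathology_query in name), None)
          if pathology is None:
              raise KeyError("unknown pathology: %r" % pathology_query)
          return "%s.%s" % (organ, pathology)

      print(acr_code("example organ", "example pathology"))    # -> 131.3661

    Because encoding and decoding share the same dictionaries, the reverse lookup (from a stored code back to names) is the same traversal run in the opposite direction, which mirrors the reuse of a single routine described above.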

  8. COUPLED SIMULATION OF GAS COOLED FAST REACTOR FUEL ASSEMBLY WITH NESTLE CODE SYSTEM

    Directory of Open Access Journals (Sweden)

    Filip Osusky

    2018-05-01

    The paper is focused on a coupled calculation of the Gas Cooled Fast Reactor. The proper modelling of coupled neutronics and thermal-hydraulics is the cornerstone of future safety assessments of the control and emergency systems. Nowadays, system and channel thermal-hydraulic codes are accepted by the national regulatory authorities in the European Union for licensing purposes; therefore, the NESTLE code was used for the simulation. The NESTLE code couples a multigroup neutron diffusion solver with a thermal-hydraulic sub-channel solver. In the paper, the validation of the NESTLE code 5.2.1 installation is presented. The homogeneous parametric cross-section library of the fuel assembly for the NESTLE simulation is prepared with the TRITON sequence of the SCALE code package. The case simulated in the NESTLE code is one fuel assembly of the GFR2400 concept with a reflective boundary condition in the radial direction and a zero-flux boundary condition in the axial direction. The results of the coupled calculation are presented and are consistent with the GFR2400 study of the GoFastR project.
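
    The essence of such a coupling can be illustrated by a toy fixed-point iteration between a power update with Doppler feedback and a one-parameter fuel-temperature model; all coefficients below are invented and this is not the NESTLE scheme:

      # Toy coupled neutronics / thermal-hydraulics iteration: power depends on fuel
      # temperature through a Doppler feedback coefficient, and fuel temperature
      # depends on power through a lumped thermal resistance.  Invented coefficients.
      def coupled_power(p_guess=1.0, rho_ext=0.002, alpha_doppler=-2.0e-5, beta=0.0065,
                        t_coolant=650.0, r_thermal=300.0, t_ref=900.0, tol=1e-8):
          p = p_guess
          for _ in range(200):
              t_fuel = t_coolant + r_thermal * p                  # thermal-hydraulic update
              rho = rho_ext + alpha_doppler * (t_fuel - t_ref)    # reactivity with feedback
              p_new = p * beta / (beta - rho)                     # crude quasi-static power update
              if abs(p_new - p) < tol:
                  return p_new, t_fuel
              p = p_new
          raise RuntimeError("coupling iteration did not converge")

      power, t_fuel = coupled_power()
      print("converged relative power %.3f, fuel temperature %.0f K" % (power, t_fuel))

    In a full coupled code the analogous loop runs node by node over the assembly, with the diffusion solution supplying the power distribution and the sub-channel solution returning the temperature and density fields used to re-evaluate the cross sections.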

  9. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied.

  10. Inclusion of pressure and flow in the KITES MHD equilibrium code

    International Nuclear Information System (INIS)

    Raburn, Daniel; Fukuyama, Atsushi

    2013-01-01

    One of the simplest self-consistent models of a plasma is single-fluid magnetohydrodynamic (MHD) equilibrium with no bulk fluid flow under axisymmetry. However, both fluid flow and non-axisymmetric effects can significantly impact plasma equilibrium and confinement properties: in particular, fluid flow can produce profile pedestals, and non-axisymmetric effects can produce islands and stochastic regions. There exist a number of computational codes which are capable of calculating equilibria with arbitrary flow or with non-axisymmetric effects. Previously, a concept for a code to calculate MHD equilibria with flow in non-axisymmetric systems was presented, called the KITES (Kyoto ITerative Equilibrium Solver) code. Since then, many of the computational modules for the KITES code have been completed, and the work-in-progress KITES code has been used to calculate non-axisymmetric force-free equilibria. Additional computational modules are required to allow the KITES code to calculate equilibria with pressure and flow. Here, the authors report on the approaches used in developing these modules and provide a sample calculation with pressure. (author)

  11. Sensitivity Study of Regional TDC in MATRA-S code Using PSBT Benchmark Exercise

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Cha, Jeong Hun; Seo, Kyong Won; Kwon, Hyuk; Hwang, Dae Hyun

    2012-01-01

    In a sub-channel analysis code, the inter-channel exchanges between adjacent sub-channels are modelled as diversion cross flow, turbulent mixing and so on. The turbulent mixing in the MATRA-S code is represented by the TDC (β: thermal diffusion coefficient). The TDC differs according to the bundle, grid type, mixing vane, and so on. Generally, a thermal mixing test is conducted to optimize the TDC. In the OECD/NRC PSBT benchmark, the thermal mixing test was conducted and the optimized TDC was analyzed using the MATRA-S code. It was shown that the exit temperature distribution of the MATRA-S code differed from the experimental result even though the optimized TDC was applied to the code. In this study, the concept of a regional TDC is introduced and a sensitivity analysis of the regional TDC is presented.
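
    To make the role of the TDC concrete, the toy two-subchannel model below scans the exit temperature difference over a range of β values; the geometry, power split and properties are invented and this is not the MATRA-S model itself:

      # Toy two-subchannel mixing model: the turbulent interchange between the channels
      # is taken proportional to beta (the TDC), and the exit temperature difference is
      # scanned over several beta values.  All parameters are invented placeholders.
      def exit_temperatures(beta, n_axial=100, length=3.0, gap=0.003,
                            mass_flux=3000.0, area=1.0e-4, cp=5.0e3,
                            q_lin=(30.0e3, 10.0e3), t_inlet=290.0):
          dz = length / n_axial
          w = mass_flux * area                       # channel mass flow rate, kg/s
          t = [t_inlet, t_inlet]
          for _ in range(n_axial):
              w_mix = beta * mass_flux * gap * dz    # turbulent interchange over one step, kg/s
              dt = t[0] - t[1]
              t[0] += (q_lin[0] * dz - w_mix * cp * dt) / (w * cp)
              t[1] += (q_lin[1] * dz + w_mix * cp * dt) / (w * cp)
          return t

      for beta in (0.0, 0.01, 0.02, 0.05):
          hot, cold = exit_temperatures(beta)
          print("beta = %.2f  exit dT = %.1f K" % (beta, hot - cold))

    A regional TDC would simply replace the single β by a value that depends on where the gap sits in the bundle (for example, interior versus peripheral gaps).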

  12. Representation of ophthalmology concepts by electronic systems: adequacy of controlled medical terminologies.

    Science.gov (United States)

    Chiang, Michael F; Casper, Daniel S; Cimino, James J; Starren, Justin

    2005-02-01

    To assess the adequacy of 5 controlled medical terminologies (International Classification of Diseases 9, Clinical Modification [ICD9-CM]; Current Procedural Terminology 4 [CPT-4]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; Logical Observation Identifiers, Names, and Codes [LOINC]; Medical Entities Dictionary [MED]) for representing concepts in ophthalmology. Noncomparative case series. Twenty complete ophthalmology case presentations were sequentially selected from a publicly available ophthalmology journal. Each of the 20 cases was parsed into discrete concepts, and each concept was classified along 2 axes: (1) diagnosis, finding, or procedure and (2) ophthalmic or medical concept. Electronic or paper browsers were used to assign a code for every concept in each of the 5 terminologies. Adequacy of assignment for each concept was scored on a 3-point scale. Findings from all 20 case presentations were combined and compared based on a coverage score, which was the average score for all concepts in that terminology. Adequacy of assignment for concepts in each terminology, based on a 3-point Likert scale (0, no match; 1, partial match; 2, complete match). Cases were parsed into 1603 concepts. SNOMED-CT had the highest mean overall coverage score (1.625+/-0.667), followed by MED (0.974+/-0.764), LOINC (0.781+/-0.929), ICD9-CM (0.280+/-0.619), and CPT-4 (0.082+/-0.337). SNOMED-CT also had higher coverage scores than any of the other terminologies for concepts in the diagnosis, finding, and procedure categories. Average coverage scores for ophthalmic concepts were lower than those for medical concepts. Controlled terminologies are required for electronic representation of ophthalmology data. SNOMED-CT had significantly higher content coverage than any other terminology in this study.
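
    The coverage score itself is simply the mean of the per-concept match scores on the 0/1/2 scale; a minimal sketch with invented example concepts and scores:

      # Coverage score as described above: each concept gets 0 (no match), 1 (partial
      # match) or 2 (complete match) in a terminology, and the terminology's score is
      # the mean over all concepts.  The example concepts and scores are invented.
      from statistics import mean

      def coverage_score(match_scores):
          assert all(s in (0, 1, 2) for s in match_scores)
          return mean(match_scores)

      concepts = ["proliferative diabetic retinopathy", "panretinal photocoagulation",
                  "intraocular pressure 14 mmHg", "visual acuity 20/40"]
      scores_by_terminology = {            # one invented score per concept above
          "SNOMED-CT": [2, 2, 1, 2],
          "ICD9-CM":   [1, 0, 0, 0],
      }
      for name, scores in scores_by_terminology.items():
          print("%-9s coverage = %.3f" % (name, coverage_score(scores)))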

  13. Evaluation of the RELAP4/MOD6 thermal-hydraulic code

    International Nuclear Information System (INIS)

    Haigh, W.S.; Margolis, S.G.; Rice, R.E.

    1978-01-01

    The NRC RELAP4/MOD6 computer code was recently released to the public for use in thermal-hydraulic analysis. This code has a unique new capability permitting analysis of both the blowdown and reflood portions of a postulated pressurized water reactor (PWR) loss-of-coolant accident (LOCA). A principal code evaluation objective is to assess the accuracy of the code for computing LOCA behavior over a wide range of system sizes and scaling concepts. The scales of interest include all LOCA experiments and will ultimately encompass full-sized PWR systems for which no experiments or data are available. Quantitative assessment of the accuracy of the code when it is applied to large PWR systems is still in the future. With RELAP4/MOD6, however, a technique has been demonstrated for using results derived from small-scale blowdown and reflood experiments to predict the accuracy of calculations for similar experiments of significantly different scale or component size. This demonstration is considered a first step in establishing confidence levels for the accuracy of calculations of a postulated LOCA

  14. TRANSURANUS: A fuel rod analysis code ready for use

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O'Carroll, C; Van de Laar, J [Commission of the European Communities, Karlsruhe (Germany). European Inst. for Transuranium Elements]; Ott, C [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]

    1994-12-31

    The basic concepts of fuel rod performance codes are discussed. The TRANSURANUS code developed at the Institute for Transuranium Elements, Karlsruhe (Germany), is presented. It is a quasi two-dimensional (1½-D) code designed for the treatment of a whole fuel rod for any type of reactor and any situation. The fuel rods found in the majority of test or power reactors can be analyzed for very different situations (normal, off-normal and accidental). The time scale of the problems to be treated may range from milliseconds to years. The TRANSURANUS code consists of a clearly defined mechanical/mathematical framework into which physical models can easily be incorporated. This framework has been extensively tested and the programming very clearly reflects this structure. The code is well structured and easy to understand. It has a comprehensive material data bank for different fuels, claddings, coolants and their properties. The code can be employed in a deterministic and a statistical version. It is written in standard FORTRAN 77. The code system includes: 2 preprocessor programs (MAKROH and AXORDER) for setting up new data cases; the post-processor URPLOT for plotting all important quantities as a function of the radius, the axial coordinate or the time; and the post-processor URSTART for evaluating statistical analyses. The TRANSURANUS code exhibits short running times. A new Windows-based interactive interface is under development. The code is now in use in various European institutions and is available to all interested parties. 7 figs., 15 refs.

  15. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2 s, 6 s and 18 s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5 s, the geometric mean of 2 s and 6 s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  16. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady state or transient behavior; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flowrate distribution between parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, containing a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  17. An improved thermal model for the computer code NAIAD

    International Nuclear Information System (INIS)

    Rainbow, M.T.

    1982-12-01

    An improved thermal model, based on the concept of heat slabs, has been incorporated as an option into the thermal hydraulic computer code NAIAD. The heat slabs are one-dimensional thermal conduction models with temperature independent thermal properties which may be internal and/or external to the fluid. Thermal energy may be added to or removed from the fluid via heat slabs and passed across the external boundary of external heat slabs at a rate which is a linear function of the external surface temperatures. The code input for the new option has been restructured to simplify data preparation. A full description of current input requirements is presented
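
    A minimal sketch of the heat-slab idea (explicit finite differences, constant properties, convective coupling to the fluid on the inner face and a linear heat-loss law on the external face; every number below is illustrative and this is not NAIAD's numerics):

      # Minimal one-dimensional heat slab: constant properties, convective coupling to
      # the fluid on the inner face and a heat loss that is a linear function of the
      # external surface temperature on the outer face.  Explicit time stepping.
      def step_slab(temps, t_fluid, dt, dx, alpha, k, h_fluid, h_ext, t_ambient):
          """Advance slab node temperatures one explicit step; returns (temps, q_to_fluid)."""
          new = temps[:]
          r = alpha * dt / dx**2
          for i in range(1, len(temps) - 1):                  # interior conduction nodes
              new[i] = temps[i] + r * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
          q_fluid = h_fluid * (temps[0] - t_fluid)            # W/m2 passed to the fluid
          new[0] = temps[0] + r * (temps[1] - temps[0]) - q_fluid * dt * alpha / (k * dx)
          q_ext = h_ext * (temps[-1] - t_ambient)             # linear external heat loss
          new[-1] = temps[-1] + r * (temps[-2] - temps[-1]) - q_ext * dt * alpha / (k * dx)
          return new, q_fluid

      temps = [600.0] * 11                                    # initial slab temperatures, K
      for _ in range(2000):
          temps, q = step_slab(temps, t_fluid=550.0, dt=0.01, dx=0.002,
                               alpha=4e-6, k=16.0, h_fluid=5000.0, h_ext=10.0, t_ambient=300.0)
      print("inner face %.1f K, outer face %.1f K, heat flux to fluid %.0f W/m2"
            % (temps[0], temps[-1], q))

    The "rate which is a linear function of the external surface temperatures" in the text corresponds to the h_ext * (T_surface - T_ambient) term applied at the outer node.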

  18. High-performance speech recognition using consistency modeling

    Science.gov (United States)

    Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth

    1994-12-01

    The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems when there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.

  19. LACEwING: A New Moving Group Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Riedel, Adric R. [Department of Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Blunt, Sarah C.; Faherty, Jacqueline K. [Department of Astrophysics, American Museum of Natural History, New York, NY 10024 (United States); Lambrides, Erini L. [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Rice, Emily L. [Department of Engineering Science and Physics, The College of Staten Island, Staten Island, NY 10314 (United States); Cruz, Kelle L., E-mail: arr@astro.caltech.edu [Department of Physics and Astronomy, Hunter College, New York, NY 10065 (United States)

    2017-03-01

    We present a new nearby young moving group (NYMG) kinematic membership analysis code, LocAting Constituent mEmbers In Nearby Groups (LACEwING), a new Catalog of Suspected Nearby Young Stars, a new list of bona fide members of moving groups, and a kinematic traceback code. LACEwING is a convergence-style algorithm with carefully vetted membership statistics based on a large numerical simulation of the Solar Neighborhood. Given spatial and kinematic information on stars, LACEwING calculates membership probabilities in 13 NYMGs and three open clusters within 100 pc. In addition to describing the inputs, methods, and products of the code, we provide comparisons of LACEwING to other popular kinematic moving group membership identification codes. As a proof of concept, we use LACEwING to reconsider the membership of 930 stellar systems in the Solar Neighborhood (within 100 pc) that have reported measurable lithium equivalent widths. We quantify the evidence in support of a population of young stars not attached to any NYMGs, which is a possible sign of new as-yet-undiscovered groups or of a field population of young stars.

  20. Development of Safety Analysis Codes and Experimental Validation for a Very High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh, PhD; Cliff Davis; Richard Moore

    2004-11-01

    The very high temperature gas-cooled reactors (VHTGRs) are those concepts that have average coolant temperatures above 900 degrees C or operational fuel temperatures above 1250 degrees C. These concepts provide the potential for increased energy conversion efficiency and for high-temperature process heat applications in addition to power generation and nuclear hydrogen generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as desalination and cogeneration, the VHTGR's higher temperatures are suitable for particular applications such as thermochemical hydrogen production. However, the high temperature operation can be detrimental to safety following a loss-of-coolant accident (LOCA) initiated by pipe breaks caused by seismic or other events. Following the loss of coolant through the break and coolant depressurization, air from the containment will enter the core by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structures and fuel. The oxidation will release heat and accelerate the heatup of the reactor core. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. The Idaho National Engineering and Environmental Laboratory (INEEL) has investigated this event for the past three years for the HTGR. However, neither the computer codes used nor, in fact, any of the world's computer codes have been sufficiently developed and validated to reliably predict this event. New code development, improvement of the existing codes, and experimental validation are imperative to narrow the uncertainty in the predictions of this type of accident. The objectives of this Korean/United States collaboration are to develop advanced computational methods for VHTGR safety analysis codes and to validate these computer codes.

  1. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  2. The global public good concept: a means of promoting good veterinary governance.

    Science.gov (United States)

    Eloit, M

    2012-08-01

    At the outset, the concept of a 'public good' was associated with economic policies. However, it has now evolved not only from a national to a global concept (global public good), but also from a concept applying solely to the production of goods to one encompassing societal issues (education, environment, etc.) and fundamental rights, including the right to health and food. Through their actions, Veterinary Services, as defined by the Terrestrial Animal Health Code (Terrestrial Code) of the World Organisation for Animal Health (OIE), help to improve animal health and reduce production losses. In this way they contribute directly and indirectly to food security and to safeguarding human health and economic resources. The organisation and operating procedures of Veterinary Services are therefore key to the efficient governance required to achieve these objectives. The OIE is a major player in global cooperation and governance in the fields of animal and public health through the implementation of its strategic standardisation mission and other programmes for the benefit of Veterinary Services and OIE Member Countries. Thus, the actions of Veterinary Services and the OIE deserve to be recognised as a global public good, backed by public investment to ensure that all Veterinary Services are in a position to apply the principles of good governance and to comply with the international standards for the quality of Veterinary Services set out in the OIE Terrestrial Code (Section 3 on Quality of Veterinary Services) and Aquatic Animal Health Code (Section 3 on Quality of Aquatic Animal Health Services).

  3. Thermal-hydraulic analysis code development and application to passive safety reactor at JAERI

    International Nuclear Information System (INIS)

    Araya, F.

    1995-01-01

    After a brief overview of the safety assessment process, the author describes the LOCA analysis code system developed in JAERI. It comprises audit calculation codes (WREM and WREM-J2, the Japanese own code) and BE codes (2D/3D, ICAP, ROSA). The codes are applied to the development of the Japanese passive safety reactor concept JPSR. Special attention is paid to the passive heat removal system and to phenomena considered to occur under a loss-of-heat-sink event. Examples of LOCA analyses based on the operation of JPSR are given for the cases of heat removal by the upper RHR and heat removal from the core to the atmosphere. Experiments on the multi-dimensional flow field in the RPV and on steam condensation in a water pool are used for understanding the phenomena in passive safety reactors. The report is in poster form only. 1 tab., 13 figs

  4. THE LEGAL STATUS OF PROFESSIONALS IN THE CONTEXT OF CHANGES BROUGHT BY THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    OANA-CARMEN RĂVAŞ

    2014-12-01

    Adoption of the New Civil Code (NCC) meant a radical "turning point" in the conception of the subjects participating in legal relations who, according to the Commercial Code (now repealed in almost all its provisions), were usually traders. Currently, with the unification of private law according to the monistic conception embraced by the NCC, there are a series of difficulties in the conceptual framework of the "professional", the "company" and, especially, the professional trader. Professional traders who are natural persons are the individual, the authorized individual and the family business. The legal status of these three categories of individuals falling into the category of professional traders is regulated by Ordinance no. 44/2008, as amended.

  6. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  7. Challenges in assessing college students' conception of duality: the case of infinity

    Science.gov (United States)

    Babarinsa-Ochiedike, Grace Olutayo

    Interpreting students' views of infinity poses a challenge for researchers due to the dynamic nature of the conception. There is diversity and variation among students' process-object perceptions. The fluctuations between students' views, however, reveal an undeveloped duality conception. This study examined college students' conception of duality in understanding and representing infinity with the intent to design strategies that could guide researchers in categorizing students' views of infinity into different levels. Data for the study were collected from N=238 college students enrolled in Calculus sequence courses (Pre-Calculus, Calculus I through Calculus III) at one of the southwestern universities in the U.S. using self-report questionnaires and semi-structured individual task-based interviews. Data were triangulated using multiple measures analyzed by three independent experts using self-designed coding sheets to assess students' externalization of the duality conception of infinity. Results of this study reveal that college students' experiences in traditional Calculus sequence courses are not supportive of the development of duality conception. On the contrary, they strengthen the singularity perspective on fundamental ideas of mathematics such as infinity. The study also found that coding and assessing college students' conception of duality is a challenging and complex process due to the dynamic nature of the conception, which is task-dependent and context-dependent. The practical significance of the study is that it helps to recognize misconceptions and start addressing them so students will have a more comprehensive view of fundamental mathematical ideas as they progress through the Calculus coursework sequence. The developed duality concept development framework, called Action-Process-Object-Duality (APOD) and adapted from the APOS theory, could guide educators and researchers as they engage in assessing students' conception of duality. The results of this study

  8. Concepts of happiness across time and cultures.

    Science.gov (United States)

    Oishi, Shigehiro; Graham, Jesse; Kesebir, Selin; Galinha, Iolanda Costa

    2013-05-01

    We explored cultural and historical variations in concepts of happiness. First, we analyzed the definitions of happiness in dictionaries from 30 nations to understand cultural similarities and differences in happiness concepts. Second, we analyzed the definition of happiness in Webster's dictionaries from 1850 to the present day to understand historical changes in American English. Third, we coded the State of the Union addresses given by U.S. presidents from 1790 to 2010. Finally, we investigated the appearance of the phrases happy nation versus happy person in Google's Ngram Viewer from 1800 to 2008. Across cultures and time, happiness was most frequently defined as good luck and favorable external conditions. However, in American English, this definition was replaced by definitions focused on favorable internal feeling states. Our findings highlight the value of a historical perspective in the study of psychological concepts.

  9. An Analysis of the Relationship between IFAC Code of Ethics and CPI

    Directory of Open Access Journals (Sweden)

    Ayşe İrem Keskin

    2015-11-01

    A code of ethics has become a significant concept in the business world, which is why occupational organizations have developed their own codes of ethics over time. In this study, a compatibility classification of the accounting code of ethics issued by the IFAC (the International Federation of Accountants) is first carried out on the basis of the action plans assessing the levels of usage by the 175 IFAC national accounting organizations. The classification shows that 60.6% of the member organizations apply the IFAC code in general, while the remaining 39.4% do not apply the code at all. With this classification, the hypothesis that “The national accounting organizations in highly corrupt countries would be less likely to adopt the IFAC ethic code than those in very clean countries” is tested using the “Corruption Perception Index-CPI” data. The findings are found to support this hypothesis.

  10. Working research codes into fluid dynamics education: a science gateway approach

    Science.gov (United States)

    Mason, Lachlan; Hetherington, James; O'Reilly, Martin; Yong, May; Jersakova, Radka; Grieve, Stuart; Perez-Suarez, David; Klapaukh, Roman; Craster, Richard V.; Matar, Omar K.

    2017-11-01

    Research codes are effective for illustrating complex concepts in educational fluid dynamics courses: compared to textbook examples, an interactive three-dimensional visualisation can bring a problem to life! Various barriers, however, prevent the adoption of research codes in teaching: codes are typically created for highly-specific 'once-off' calculations and, as such, have no user interface and a steep learning curve. Moreover, a code may require access to high-performance computing resources that are not readily available in the classroom. This project allows academics to rapidly work research codes into their teaching via a minimalist 'science gateway' framework. The gateway is a simple, yet flexible, web interface allowing students to construct and run simulations, as well as view and share their output. Behind the scenes, the common operations of job configuration, submission, monitoring and post-processing are customisable at the level of shell scripting. In this talk, we demonstrate the creation of an example teaching gateway connected to the Code BLUE fluid dynamics software. Student simulations can be run via a third-party cloud computing provider or a local high-performance cluster. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).

  11. Some aspects of grading Java code submissions in MOOCs

    Directory of Open Access Journals (Sweden)

    Sándor Király

    2017-07-01

    Recently, massive open online courses (MOOCs) have been offering a new online approach in the field of distance learning and online education. A typical MOOC course consists of video lectures, reading material and easily accessible tests for students. For a computer programming course, it is important to provide interactive, dynamic, online coding exercises and more complex programming assignments for learners. It is expedient for the students to receive prompt feedback on their coding submissions. Although automated programme evaluation subsystems capable of assessing source programme files exist in learning management systems, MOOC systems lack a grader responsible for evaluating students' assignments, with the result that course staff would be required to assess thousands of programmes submitted by the participants of the course without the benefit of an automatic grader. This paper presents a new concept for grading students' programming submissions and improved techniques based on the Java unit testing framework that enable automatic grading of code chunks. Some examples are also given, such as the creation of unique exercises by dynamically generating the parameters of the assignment in a MOOC programming course, combined with a form of coding-style recognition to teach coding standards.
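
    The unit-test-based grading idea can be sketched in a language-neutral way; the snippet below uses Python's unittest module rather than the Java framework discussed in the paper, and the assignment, the seeding scheme and the scoring rule are all invented for illustration:

      # Unit-test-based grading with per-student parameter generation.  The assignment
      # ("implement scale(values, factor)"), the seeding scheme and the percentage
      # scoring rule are invented; only the overall pattern mirrors the idea above.
      import random
      import unittest

      def student_submission(values, factor):       # stands in for the submitted code chunk
          return [v * factor for v in values]

      def make_test_case(student_id):
          rng = random.Random(student_id)            # unique parameters derived from the student id
          values = [rng.randint(-10, 10) for _ in range(5)]
          factor = rng.randint(2, 9)

          class GradingTest(unittest.TestCase):
              def test_scaling(self):
                  self.assertEqual(student_submission(values, factor),
                                   [v * factor for v in values])

              def test_empty_input(self):
                  self.assertEqual(student_submission([], factor), [])

          return GradingTest

      def grade(student_id):
          suite = unittest.defaultTestLoader.loadTestsFromTestCase(make_test_case(student_id))
          result = unittest.TestResult()
          suite.run(result)
          passed = result.testsRun - len(result.failures) - len(result.errors)
          return 100.0 * passed / result.testsRun    # simple percentage score

      print("score for student 42:", grade(42))

    Deriving the test parameters from the student identifier is one simple way to give every participant a unique exercise while keeping a single reference solution.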

  12. Windows user-friendly code package development for operation of research reactors

    International Nuclear Information System (INIS)

    Hoang Anh Tuan

    1998-01-01

    The content of the project was to develop: (1) an MS Windows interface to spectral codes like THERMOS, PEACO-COLLIS, GRACE and a burn-up code; (2) an MS Windows C-language burn-up diffusion hexagonal lattice code. The overall scope of the project was to develop a PC-based MS Windows code package for the operation of the Dalat research reactor. Various problems relating to neutronic physics, like thermalization, resonance treatment, fast spectral treatment, the change of isotopic concentrations during burn-up as well as the burn-up distribution in the reactor core, are considered in parallel with the application of informatics techniques. The development process is guided by the concept of a user-friendly interface between end-users and the code package. High-level input features through a system of icons, menus and dialog boxes conforming to the Common User Access (CUA) convention, together with sophisticated graphical output in the MS Windows environment, are used. The user-computer interface is also enhanced by using both keyboard and mouse, which creates a very natural manner of interaction for the end-user. (author)

  13. Erasmus MC at CLEF eHealth 2016: Concept recognition and coding in French texts

    NARCIS (Netherlands)

    E.M. Van Mulligen (Erik M.); Z. Afzal (Zubair); S.A. Akhondi (Saber); D. Vo (Dang); J.A. Kors (Jan)

    2016-01-01

    We participated in task 2 of the CLEF eHealth 2016 challenge. Two subtasks were addressed: entity recognition and normalization in a corpus of French drug labels and Medline titles, and ICD-10 coding of French death certificates. For both subtasks we used a dictionary-based approach.
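
    The core of a dictionary-based approach can be sketched as a greedy longest-match lookup of token spans against a term-to-code dictionary; the dictionary below is an invented toy example, not the terminology resources actually used for the task:

      # Minimal dictionary-based entity recognition and normalisation: greedy
      # longest-match lookup of lowercased token spans against a dictionary that maps
      # surface forms to concept codes.  The entries and codes are illustrative only.
      TERM_DICT = {
          "insuffisance cardiaque": "I50",
          "infarctus du myocarde": "I21",
          "paracetamol": "N02BE01",
      }
      MAX_SPAN = max(len(t.split()) for t in TERM_DICT)

      def annotate(text):
          tokens = text.lower().split()
          matches, i = [], 0
          while i < len(tokens):
              for span in range(min(MAX_SPAN, len(tokens) - i), 0, -1):   # longest match first
                  phrase = " ".join(tokens[i:i + span])
                  if phrase in TERM_DICT:
                      matches.append((phrase, TERM_DICT[phrase]))
                      i += span
                      break
              else:
                  i += 1
          return matches

      print(annotate("Deces par insuffisance cardiaque aigue"))

    A full system would typically add normalisation steps (accent stripping, stemming, synonym expansion) before the lookup, but the matching skeleton stays the same.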

  14. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes.

    Science.gov (United States)

    Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications such as for hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many other applications. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on expertise of several subject matter experts making the terminology curation process open to geographic and language bias. In addition, these terminologies also provide no quantifiable evidence on how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume within large amounts of clinical notes. Our evaluation shows that we are able to use a data driven approach to discovering highly related concepts for various search terms including medications, symptoms and diseases.
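
    One simple data-driven instantiation of this idea (not the authors' exact graphical method) is to score candidate neighbours of a query concept by pointwise mutual information over note-level co-occurrence counts; the toy "notes" below are invented:

      # Mining related concepts from note-level co-occurrence: candidate neighbours of
      # a query concept are ranked by pointwise mutual information (PMI).  The notes
      # are a tiny invented example, only in the spirit of the approach above.
      import math
      from collections import Counter
      from itertools import combinations

      notes = [
          {"metformin", "type 2 diabetes", "hba1c"},
          {"type 2 diabetes", "metformin", "neuropathy"},
          {"hypertension", "lisinopril"},
          {"type 2 diabetes", "insulin", "hba1c"},
      ]
      concept_counts = Counter(c for note in notes for c in note)
      pair_counts = Counter(frozenset(p) for note in notes for p in combinations(sorted(note), 2))
      n = len(notes)

      def related(query, top=3):
          scores = {}
          for other in concept_counts:
              joint = pair_counts[frozenset((query, other))]
              if other != query and joint:
                  pmi = math.log((joint / n) /
                                 ((concept_counts[query] / n) * (concept_counts[other] / n)))
                  scores[other] = pmi
          return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top]

      print(related("type 2 diabetes"))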

  15. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes

    Directory of Open Access Journals (Sweden)

    Kavita Ganesan

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications such as for hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many other applications. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on expertise of several subject matter experts making the terminology curation process open to geographic and language bias. In addition, these terminologies also provide no quantifiable evidence on how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume within large amounts of clinical notes. Our evaluation shows that we are able to use a data driven approach to discovering highly related concepts for various search terms including medications, symptoms and diseases.

  16. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream.

    Science.gov (United States)

    Martin, Chris B; Douglas, Danielle; Newsome, Rachel N; Man, Louisa Ly; Barense, Morgan D

    2018-02-02

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. © 2018, Martin et al.
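
    The comparison logic can be sketched as a representational similarity analysis: the pattern-similarity matrix of a region is correlated with behaviour-based visual and conceptual model matrices. The snippet below uses random placeholder data rather than the study's stimuli or fMRI responses:

      # Representational-similarity sketch: correlate a region's neural pattern-similarity
      # matrix with behaviour-based visual and conceptual similarity models.  All data
      # here are random placeholders standing in for object-evoked responses.
      import numpy as np

      rng = np.random.default_rng(0)
      n_objects, n_voxels = 20, 100
      neural_patterns = rng.normal(size=(n_objects, n_voxels))    # object x voxel responses
      visual_model = rng.uniform(size=(n_objects, n_objects))     # behaviour-based similarity models
      conceptual_model = rng.uniform(size=(n_objects, n_objects))

      def lower_triangle(m):
          return m[np.tril_indices_from(m, k=-1)]

      neural_rsm = np.corrcoef(neural_patterns)                   # neural pattern-similarity matrix
      for name, model in (("visual", visual_model), ("conceptual", conceptual_model)):
          model = (model + model.T) / 2                           # symmetrise the toy model
          r = np.corrcoef(lower_triangle(neural_rsm), lower_triangle(model))[0, 1]
          print("correlation with %s model: r = %.3f" % (name, r))

    In this simplified logic, a region whose similarity structure tracks both models is a candidate for integrative coding, while one tracking only a single model is a candidate for distinctive coding.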

  17. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    Science.gov (United States)

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853

  18. The materials programme for the high-temperature gas-cooled reactor in the Federal Republic of Germany: Status of the development of high-temperature materials, integrity concept, and design codes

    International Nuclear Information System (INIS)

    Nickel, H.; Bodmann, E.; Seehafer, H.J.

    1990-01-01

    During the last 15 years, the research and development of materials for high temperature gas-cooled reactor (HTGR) applications in the Federal Republic of Germany have been concentrated on the qualification of high-temperature structural alloys. Such materials are required for heat exchanger components of advanced HTGRs supplying nuclear process heat in the temperature range between 750 deg. and 950 deg. C. The suitability of the candidate alloys for service in the HTGR has been established, and continuing research is aimed at verification of the integrity of components over the envisaged service lifetimes. The special features of the HTGR which provide a high degree of safety are the use of ceramics for the core construction and the low power density of the core. The reactor integrity concept which has been developed is based on these two characteristics. Previously, technical guidelines and design codes for nuclear plants were tailored exclusively to light water reactor systems. An extensive research project was therefore initiated which led to the formulation of the basic principles on which a high temperature design code can be based. (author)

  19. Concept Testing of a Simple Floating Offshore Vertical Axis Wind Turbine

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    2013-01-01

    The wind energy community is researching new concepts for deeper-sea offshore wind turbines. One such concept is the DeepWind concept, which is being assessed in an EU-FP7 project, also called DeepWind. Objectives of this project are to assess large wind turbines (5-20 MW) based on the concept. One task in the project is to test a 1 kW concept rotor (not a scaled-down MW-size rotor), partly under field conditions in a fjord in Denmark and partly in a water tank under controlled conditions in the Netherlands. The objective of testing the 1 kW concept turbine is to verify its dynamical behaviour under varying wind and wave conditions, and to compare such behaviour with computer code calculations. The concept turbine was designed and constructed by the project task partners, and all parts were assembled and installed at sea in the Roskilde fjord right next to the DTU Risø campus. The turbine is under ...

  20. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation is described, based on VEditor, a free-license program; the work presented in this paper supplements and extends it. The introduction briefly characterizes the tools available on the market which aid the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the written plug-in are presented, such as the programming-extension concept and the results of the activities of the formatter, refactorer, code hider, and other new additions to the VEditor program.

  1. CONCEPT-5 user's manual. [Power plant costs

    Energy Technology Data Exchange (ETDEWEB)

    Hudson, C.R. II

    1979-01-01

    The CONCEPT computer code package was developed to provide conceptual capital cost estimates for nuclear-fueled and fossil-fired power plants. Cost estimates can be made as a function of plant type, size, location, and date of initial operation. The output includes a detailed breakdown of the estimate into direct and indirect costs similar to the accounting system described in document NUS-531. Cost models are currently provided in CONCEPT 5 for single- and multi-unit pressurized-water reactors, boiling-water reactors, and coal-fired plants with and without flue gas desulfurization equipment.

  2. Benchmarking Reactor Systems Studies by Comparison of EU and Japanese System Code Results for Different DEMO Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Kemp, R.; Ward, D.J., E-mail: richard.kemp@ccfe.ac.uk [EURATOM/CCFE Association, Culham Centre for Fusion Energy, Abingdon (United Kingdom); Nakamura, M.; Tobita, K. [Japan Atomic Energy Agency, Rokkasho (Japan); Federici, G. [EFDA Garching, Max Plank Institut fur Plasmaphysik, Garching (Germany)

    2012-09-15

    Full text: Recent systems studies work within the Broader Approach framework has focussed on benchmarking the EU systems code PROCESS against the Japanese code TPC for conceptual DEMO designs. This paper describes benchmarking work for a conservative, pulsed DEMO and an advanced, steady-state, high-bootstrap-fraction DEMO. The former machine is an R0 = 10 m, a = 2.5 m, βN < 2.0 device with no enhancement in energy confinement over IPB98. The latter machine is smaller (R0 = 8 m, a = 2.7 m), with βN = 3.0, enhanced confinement, and a high bootstrap fraction fBS = 0.8. These options were chosen to test the codes across a wide range of parameter space. While generally in good agreement, some of the code outputs differ. In particular, differences have been identified in the impurity radiation models and flux swing calculations. The global effects of these differences are described and approaches to identifying the best models, including future experiments, are discussed. Results of varying some of the assumptions underlying the modelling are also presented, demonstrating the sensitivity of the solutions to technological limitations and providing guidance for where further research could be focussed. (author)

  3. Image content authentication based on channel coding

    Science.gov (United States)

    Zhang, Fan; Xu, Lei

    2008-03-01

    Content authentication determines whether an image has been tampered with or not and, if necessary, locates malicious alterations made to the image. Authentication of a still image or a video is motivated by the recipient's interest, and its principle is that a receiver must be able to identify the source of the document reliably. Several techniques and concepts based on data hiding or steganography have been designed as a means for image authentication. This paper presents a color image authentication algorithm based on convolutional coding. The high bits of the color digital image are encoded with convolutional codes for tamper detection and localization. The authentication messages are hidden in the low bits of the image in order to keep the authentication invisible. All communications channels are subject to errors introduced because of additive Gaussian noise in their environment. Data perturbations cannot be eliminated, but their effect can be minimized by the use of Forward Error Correction (FEC) techniques in the transmitted data stream and decoders in the receiving system that detect and correct bits in error. The message of each pixel is convolutionally encoded with the encoder. After parity checking and block interleaving, the redundant bits are embedded in the image offset. Tampering can be detected and restored without accessing the original image.
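
    A simplified sketch of such a scheme (the generator polynomials, the bit layout and the tiny "image" below are illustrative choices, not the paper's exact design): the high nibble of each pixel is passed through a rate-1/2 convolutional encoder and one of the two parity streams is embedded in the pixel's low nibble, so that a later change to the high bits can be detected.

      # Toy convolutional-code authentication: encode each pixel's high nibble with a
      # rate-1/2 convolutional encoder (generators 7/5 octal) and embed one parity
      # stream in the low nibble.  Bit layout and the tiny image are illustrative.
      def conv_encode(bits, g1=0b111, g2=0b101):
          """Rate-1/2 convolutional encoder, constraint length 3; interleaved c1,c2 output."""
          state, out = 0, []
          for b in bits:
              state = ((state << 1) | b) & 0b111
              out.append(bin(state & g1).count("1") % 2)
              out.append(bin(state & g2).count("1") % 2)
          return out

      def high_nibble_bits(p):
          return [(p >> k) & 1 for k in range(7, 3, -1)]

      def parity_nibble(p):
          c1 = conv_encode(high_nibble_bits(p))[0::2]     # keep the 4 bits of the g1 stream
          return (c1[0] << 3) | (c1[1] << 2) | (c1[2] << 1) | c1[3]

      def protect(pixels):
          """Embed each pixel's parity nibble in its own low nibble."""
          return [(p & 0xF0) | parity_nibble(p) for p in pixels]

      def verify(pixels):
          """Flag pixels whose embedded parity no longer matches their high nibble."""
          return [(p & 0x0F) != parity_nibble(p) for p in pixels]

      image = [120, 200, 33, 250]
      marked = protect(image)
      tampered = marked[:]
      tampered[2] ^= 0b01000000                   # flip a high bit to simulate tampering
      print("tamper flags:", verify(tampered))    # -> [False, False, True, False]

    Restoration without the original image, as mentioned above, requires embedding enough redundancy (and interleaving it across blocks) that a decoder can correct the altered bits rather than merely flag them; the detection-only sketch here omits that step.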

  4. Identifying difficult concepts in introductory programming

    OpenAIRE

    Humar, Klaudija

    2014-01-01

    In this diploma thesis we try to find the answer to why students experience difficulties in introductory programming. We ask ourselves what causes most problems when trying to understand concepts in introductory programming, generating code and designing algorithms. In the first section we introduce the programming language Python as the first programming language taught to students. We compare it with the programming language Pascal and stress the advantages of Python that seem important ...

  5. An Analysis of the Global Code of Ethics for Tourism in the Context of Corporate Social Responsibility

    OpenAIRE

    Buzar Stipe

    2015-01-01

    The author analyzes the Global Code of Ethics for Tourism in the context of corporate social responsibility and the need for discussing this topic in ethical codes within the business and tourism sector. The text first offers an overview of the fundamental ethical concepts in business ethics and corporate social responsibility and briefly conceptualizes the relationship between these two fields. At the end, the author analyzes the content of the Global Code of Ethics for Tourism with emphasis...

  6. THEORETICAL AND PRACTICAL APPROACHES REGARDING THE ADOPTION OF CORPORATE GOVERNANCE CODES

    Directory of Open Access Journals (Sweden)

    Sorin Nicolae Borlea

    2013-09-01

    Full Text Available In the European Union, the concept of corporate governance began to emerge more clearly after 1997, when most countries voluntarily adopted corporate governance codes. The impulse for adopting these codes came from the financial scandals surrounding the failure of British companies listed on the stock exchange. Numerous scandals involving big companies such as Enron, WorldCom, Parmalat, Xerox, Merrill Lynch and Andersen led to a loss of investors’ confidence. These crises alarmed governments, supervisory authorities, companies, investors and even the general public, exposed the fragility of the corporate governance system and highlighted the need to rethink its structures. The process of adapting corporate governance provisions to ensure transparency, responsibility and fair treatment of shareholders resulted in the development of the Corporate Governance Principles by the Organization for Economic Cooperation and Development (OECD). To support these principles, the common elements of existing codes, among the most effective practice models of governance, were identified. Once the benefits of corporate governance practices had been understood and assimilated by developed countries, developing countries (Romania among them) began to adopt "the best practices" in corporate governance, especially because this need is acutely felt in the changes required by the transition to a market economy. Our article describes the origins of corporate governance and the concept and evolution of corporate governance codes at the international, European and Romanian levels.

  7. Deontological aspects of the nursing profession: understanding the code of ethics

    Directory of Open Access Journals (Sweden)

    Terezinha Nunes da Silva

    Full Text Available ABSTRACT Objective: to investigate nursing professionals' understanding concerning the Code of Ethics; to assess the relevance of the Code of Ethics of the nursing profession and its use in practice; to identify how problem-solving is performed when facing ethical dilemmas in professional practice. Method: exploratory descriptive study, conducted with 34 (thirty-four nursing professionals from a teaching hospital in João Pessoa, PB - Brazil. Results: four thematic categories emerged: conception of professional ethics in nursing practice; interpretations of ethics in the practice of care; use of the Code of Ethics in the professional practice; strategies for solving ethical issues in the professional practice. Final considerations: some of the nursing professionals comprehend the meaning coherently; others have a limited comprehension, based on jargon. Therefore, a deeper understanding of the text contained in this code is necessary so that it can be applied into practice, aiming to provide a quality care that is, above all, ethical and legal.

  8. Deontological aspects of the nursing profession: understanding the code of ethics.

    Science.gov (United States)

    Silva, Terezinha Nunes da; Freire, Maria Eliane Moreira; Vasconcelos, Monica Ferreira de; Silva Junior, Sergio Vital da; Silva, Wilton José de Carvalho; Araújo, Patrícia da Silva; Eloy, Allan Victor Assis

    2018-01-01

    Objective: to investigate nursing professionals' understanding concerning the Code of Ethics; to assess the relevance of the Code of Ethics of the nursing profession and its use in practice; to identify how problem-solving is performed when facing ethical dilemmas in professional practice. Method: exploratory descriptive study, conducted with 34 (thirty-four) nursing professionals from a teaching hospital in João Pessoa, PB - Brazil. Results: four thematic categories emerged: conception of professional ethics in nursing practice; interpretations of ethics in the practice of care; use of the Code of Ethics in the professional practice; strategies for solving ethical issues in the professional practice. Final considerations: some of the nursing professionals comprehend the meaning coherently; others have a limited comprehension, based on jargon. Therefore, a deeper understanding of the text contained in this code is necessary so that it can be applied into practice, aiming to provide a quality care that is, above all, ethical and legal.

  9. System code improvements for modelling passive safety systems and their validation

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, Sebastian; Cron, Daniel von der; Schaffrath, Andreas [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    GRS has been developing the system code ATHLET over many years. Because ATHLET, among other codes, is widely used in nuclear licensing and supervisory procedures, it has to represent the current state of science and technology. New reactor concepts such as Generation III+ and IV reactors and SMRs make intensive use of passive safety systems. The simulation of passive safety systems with the GRS system code ATHLET is still a big challenge because of non-defined operation points and self-setting operation conditions. Additionally, the driving forces of passive safety systems are smaller, and uncertainties of parameters have a larger impact than for active systems. This paper addresses the code validation and qualification work of ATHLET using the example of slightly inclined horizontal heat exchangers, which are used, e.g., as emergency condensers (as in the KERENA and the CAREM) or as heat exchangers in the passive auxiliary feed water systems (PAFS) of the APR+.

  10. Linguistic coding deficits in foreign language learners.

    Science.gov (United States)

    Sparks, R; Ganschow, L; Pohlman, J

    1989-01-01

    As increasing numbers of colleges and universities require a foreign language for graduation in at least one of their degree programs, reports of students with difficulties in learning a second language are multiplying. Until recently, little research has been conducted to identify the nature of this problem. Recent attempts by the authors have focused upon subtle but ongoing language difficulties in these individuals as the source of their struggle to learn a foreign language. The present paper attempts to expand upon this concept by outlining a theoretical framework based upon a linguistic coding model that hypothesizes deficits in the processing of phonological, syntactic, and/or semantic information. Traditional psychoeducational assessment batteries of standardized intelligence and achievement tests generally are not sensitive to these linguistic coding deficits unless closely analyzed or, more often, used in conjunction with a more comprehensive language assessment battery. Students who have been waived from a foreign language requirement and their proposed type(s) of linguistic coding deficits are profiled. Tentative conclusions about the nature of these foreign language learning deficits are presented along with specific suggestions for tests to be used in psychoeducational evaluations.

  11. Opacity calculations for extreme physical systems: code RACHEL

    Science.gov (United States)

    Drska, Ladislav; Sinor, Milan

    1996-08-01

    Computer simulations of physical systems under extreme conditions (high density, temperature, etc.) require the availability of extensive sets of atomic data. This paper presents basic information on a self-consistent approach to calculations of radiative opacity, one of the key characteristics of such systems. After a short explanation of general concepts of the atomic physics of extreme systems, the structure of the opacity code RACHEL is discussed and some of its applications are presented.

  12. Steady-state and transient simulations of gas cooled reactor with the computer code CATHARE

    International Nuclear Information System (INIS)

    Tauveron, N.; Saez, M.; Marchand, M.; Chataing, T.; Geffraye, G.; Cherel, J. M.

    2003-01-01

    This work concerns the design and safety analysis of Gas Cooled Reactors. The CATHARE code is used to test the design and safety of two different concepts, a High Temperature Gas Reactor concept (HTGR) and a Gas Fast Reactor concept (GFR). Relative to the HTGR concept, three transient simulations are performed and described in this paper: loss of electrical load without turbomachine trip, 10 inch cold duct break, 10 inch cold duct break combined with a tube rupture of a cooling exchanger. A second step consists in modelling a GFR concept. A nominal steady state situation at a power of 600 MW is obtained and first transient simulations are carried out to study decay heat removal situations after primary loop depressurisation

  13. Just sustainability? Sustainability and social justice in professional codes of ethics for engineers.

    Science.gov (United States)

    Brauer, Cletus S

    2013-09-01

    Should environmental, social, and economic sustainability be of primary concern to engineers? Should social justice be among these concerns? Although the deterioration of our natural environment and the increase in social injustices are among today's most pressing and important issues, engineering codes of ethics and their paramountcy clause, which contains those values most important to engineering and to what it means to be an engineer, do not yet put either concept on a par with the safety, health, and welfare of the public. This paper addresses a recent proposal by Michelfelder and Jones (2011) to include sustainability in the paramountcy clause as a way of rectifying the current disregard for social justice issues in the engineering codes. That proposal builds on a certain notion of sustainability that includes social justice as one of its dimensions and claims that social justice is a necessary condition for sustainability, not vice versa. The relationship between these concepts is discussed, and the original proposal is rejected. Drawing on insights developed throughout the paper, some suggestions are made as to how one should address the different requirements that theory and practice demand of the value taxonomy of professional codes of ethics.

  14. Computationally Efficient Amplitude Modulated Sinusoidal Audio Coding using Frequency-Domain Linear Prediction

    DEFF Research Database (Denmark)

    Christensen, M. G.; Jensen, Søren Holdt

    2006-01-01

    A method for amplitude modulated sinusoidal audio coding is presented that has low complexity and low delay. This is based on a subband processing system, where, in each subband, the signal is modeled as an amplitude modulated sum of sinusoids. The envelopes are estimated using frequency-domain linear prediction and the prediction coefficients are quantized. As a proof of concept, we evaluate different configurations in a subjective listening test, and this shows that the proposed method offers significant improvements in sinusoidal coding. Furthermore, the properties of the frequency...

  15. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low ...
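
    The record above relies on syndrome-based distributed source coding: the encoder transmits only a syndrome of X, and the decoder combines it with the correlated side information Y. As a hedged, minimal sketch of that general idea - using a toy (7,4) Hamming code rather than the rate-adaptive BCH codes and feedback protocol of the paper - the decoder below corrects the (assumed single) disagreement between X and Y from the syndrome alone:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (a stand-in for the BCH codes
# in the paper; chosen only to keep the sketch short).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(v):
    return tuple(H.dot(v) % 2)

# Precompute the error pattern for every single-bit syndrome.
patterns = {}
for i in range(7):
    e = np.zeros(7, dtype=int)
    e[i] = 1
    patterns[syndrome(e)] = e

def dsc_decode(s_x, y):
    """Recover x from its syndrome s_x and side information y (x and y are
    assumed to differ in at most one position -- the high-correlation case)."""
    s_diff = tuple((np.array(s_x) + H.dot(y)) % 2)   # syndrome of the error x XOR y
    e = patterns.get(s_diff, np.zeros(7, dtype=int))
    return (y + e) % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])     # source block held by the encoder
y = x.copy(); y[4] ^= 1                 # side information differing in one bit
s_x = syndrome(x)                       # only 3 syndrome bits are transmitted
assert np.array_equal(dsc_decode(s_x, y), x)
```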

  16. Error-Rate Bounds for Coded PPM on a Poisson Channel

    Science.gov (United States)

    Moision, Bruce; Hamkins, Jon

    2009-01-01

    Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
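
    The record above concerns coded pulse-position modulation (PPM) on a Poisson channel. As a hedged illustration of the channel and modulation model only - not the bound derivation or the accumulate-PPM code itself - the sketch below simulates uncoded M-ary PPM with maximum-likelihood slot detection (the PPM order and the signal and background count rates are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
M = 16            # PPM order: one pulsed slot out of M per symbol (assumption)
n_signal = 5.0    # mean signal photons in the pulsed slot (assumption)
n_bkg = 0.2       # mean background photons per slot (assumption)
n_symbols = 10_000

# Transmit: choose a pulsed slot uniformly at random for each symbol.
tx = rng.integers(0, M, n_symbols)

# Poisson channel: background counts in every slot, extra signal counts in the pulsed slot.
counts = rng.poisson(n_bkg, size=(n_symbols, M))
counts[np.arange(n_symbols), tx] += rng.poisson(n_signal, n_symbols)

# ML detection for Poisson statistics reduces to picking the slot with the most photons.
rx = counts.argmax(axis=1)
print("uncoded symbol error rate:", np.mean(rx != tx))
```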

  17. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  18. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
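
    The abstract above works with the directed graph associated with a trinucleotide code, in which circularity corresponds to the absence of cycles. As a hedged sketch of that construction (following the graph approach of Fimmel et al. cited in the record, with the edge rules as I understand them: for each trinucleotide b1b2b3, edges b1 -> b2b3 and b1b2 -> b3), the code below builds the graph for a small illustrative set and tests it for cycles:

```python
# Sketch: build the directed graph associated with a trinucleotide code and
# test circularity as acyclicity (edge rules as stated in the lead-in; the
# example set is illustrative, not the 20-trinucleotide code X of the paper).
def build_graph(code):
    edges = {}
    for t in code:
        b1, b2, b3 = t[0], t[1], t[2]
        edges.setdefault(b1, set()).add(b2 + b3)      # nucleotide -> dinucleotide
        edges.setdefault(b1 + b2, set()).add(b3)      # dinucleotide -> nucleotide
    return edges

def has_cycle(edges):
    WHITE, GREY, BLACK = 0, 1, 2
    vertices = set(edges) | {w for ws in edges.values() for w in ws}
    color = {v: WHITE for v in vertices}

    def dfs(v):
        color[v] = GREY
        for w in edges.get(v, ()):
            if color[w] == GREY or (color[w] == WHITE and dfs(w)):
                return True
        color[v] = BLACK
        return False

    return any(color[v] == WHITE and dfs(v) for v in list(color))

example = {"AAC", "GGT", "ACG"}          # illustrative trinucleotide set
print("acyclic graph (circularity criterion met):", not has_cycle(build_graph(example)))
```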

  19. User's manual for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    International Nuclear Information System (INIS)

    Dougan, K.D.; Wojciechowski, L.C.

    1995-06-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is represented by the CC3 (Canadian Concept, generation 3) models comprising a design-specific vault, a site-specific geosphere and a biosphere typical of the Canadian Shield. (author). 23 refs., 7 tabs., 21 figs

  20. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and a practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also to suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed by simple algebraic means. Four sets of DEU code families, based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even), are constructed. This gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point to multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans with high data rates.

  1. An Analysis of the Global Code of Ethics for Tourism in the Context of Corporate Social Responsibility

    Directory of Open Access Journals (Sweden)

    Buzar Stipe

    2015-12-01

    Full Text Available The author analyzes the Global Code of Ethics for Tourism in the context of corporate social responsibility and the need for discussing this topic in ethical codes within the business and tourism sector. The text first offers an overview of the fundamental ethical concepts in business ethics and corporate social responsibility and briefly conceptualizes the relationship between these two fields. At the end, the author analyzes the content of the Global Code of Ethics for Tourism with emphasis on the elements pertaining to corporate social responsibility, after which he offers a critical opinion about the contribution of the aforementioned code.

  2. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list decoding algorithm for matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units.

  3. Beacon- and Schema-Based Method for Recognizing Algorithms from Students' Source Code

    Science.gov (United States)

    Taherkhani, Ahmad; Malmi, Lauri

    2013-01-01

    In this paper, we present a method for recognizing algorithms from students' programming submissions coded in Java. The method is based on the concepts of "programming schemas" and "beacons". Schemas are high-level programming knowledge with detailed knowledge abstracted out, and beacons are statements that imply specific…

  4. Development Perspective of Regulatory Audit Code System for SFR Nuclear Safety Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Moo Hoon; Lee, Gil Soo; Shin, An Dong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-05-15

    The sodium-cooled fast reactor (SFR) in Korea is based on the KALIMER-600 concept developed by KAERI. Based on the 'Long-term R and D Plan for Future Reactor Systems', which was approved by the Korea Atomic Energy Commission in 2008, the KAERI designer is scheduled to apply for design certification of the prototype SFR in 2017. In order to establish the regulatory infrastructure for licensing a prototype SFR, KINS has been developing regulatory requirements for the demonstration SFR since 2010 and is scheduled to develop regulatory audit code systems for the core, fuel, and systems, etc. beginning in 2012. In this study, the domestic code systems used for core design and safety evaluation of PWRs and the nuclear physics and code systems for SFRs were briefly reviewed, and the development perspective of a regulatory audit code system for SFR nuclear safety evaluation was derived

  5. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
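
    The notion the record above generalizes is unique decipherability (UD). As background, and as a hedged minimal sketch rather than the paper's coding-partition algorithm, the classical Sardinas-Patterson test below decides whether a finite code is UD:

```python
# Sardinas-Patterson test for unique decipherability (background for the
# record above; this is the classical UD test, not the coding-partition
# algorithm of the paper).
def is_uniquely_decipherable(code):
    code = set(code)

    def dangling(ws):
        """Suffixes left over when a word of one set is a prefix of the other."""
        out = set()
        for u in code:
            for v in ws:
                if u != v or ws is not code:
                    if v.startswith(u):
                        out.add(v[len(u):])
                    if u.startswith(v):
                        out.add(u[len(v):])
        return out - {""} if ws is code else out

    seen = set()
    current = dangling(code)           # first set of dangling suffixes
    while current:
        if "" in current or current & code:
            return False               # a full codeword can be re-parsed: not UD
        if frozenset(current) in seen:
            return True                # suffix sets repeat without conflict: UD
        seen.add(frozenset(current))
        current = dangling(current)
    return True

print(is_uniquely_decipherable({"0", "01", "11"}))   # True: a UD code
print(is_uniquely_decipherable({"0", "01", "10"}))   # False: "010" has two parsings
```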

  6. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.

  7. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    The VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; in other words, tritium is a crucial safety issue in this fission reactor system, and a tool for analyzing its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using a chemical process code called gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications and the adoption of a distributed permeation model. Due to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, since it is able to solve distributed models. BOTANIC was successfully developed and verified: the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification

  8. Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains

    Science.gov (United States)

    Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao

    2017-11-01

    A new concept of gyrator transform (GT) encryption scheme is proposed in this paper. We present a novel optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in GT domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are thus integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The method of applying QR codes and fingerprints in GT domains possesses much potential for future information security applications.

  9. The theta/gamma discrete phase code occurring during the hippocampal phase precession may be a more general brain coding scheme.

    Science.gov (United States)

    Lisman, John

    2005-01-01

    In the hippocampus, oscillations in the theta and gamma frequency range occur together and interact in several ways, indicating that they are part of a common functional system. It is argued that these oscillations form a coding scheme that is used in the hippocampus to organize the readout from long-term memory of the discrete sequence of upcoming places, as cued by current position. This readout of place cells has been analyzed in several ways. First, plots of the theta phase of spikes vs. position on a track show a systematic progression of phase as rats run through a place field. This is termed the phase precession. Second, two cells with nearby place fields have a systematic difference in phase, as indicated by a cross-correlation having a peak with a temporal offset that is a significant fraction of a theta cycle. Third, several different decoding algorithms demonstrate the information content of theta phase in predicting the animal's position. It appears that small phase differences corresponding to jitter within a gamma cycle do not carry information. This evidence, together with the finding that principal cells fire preferentially at a given gamma phase, supports the concept of theta/gamma coding: a given place is encoded by the spatial pattern of neurons that fire in a given gamma cycle (the exact timing within a gamma cycle being unimportant); sequential places are encoded in sequential gamma subcycles of the theta cycle (i.e., with different discrete theta phase). It appears that this general form of coding is not restricted to readout of information from long-term memory in the hippocampus because similar patterns of theta/gamma oscillations have been observed in multiple brain regions, including regions involved in working memory and sensory integration. It is suggested that dual oscillations serve a general function: the encoding of multiple units of information (items) in a way that preserves their serial order. The relationship of such coding to
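
    As a purely illustrative sketch of the discrete phase-coding idea described above - not a model of hippocampal data - the snippet below labels each spike with its theta cycle and the gamma subcycle within that cycle, so a sequence of items can be read off from the order of gamma slots (the 8 Hz theta and 40 Hz gamma frequencies are assumptions for illustration):

```python
import numpy as np

theta_hz, gamma_hz = 8.0, 40.0               # assumed oscillation frequencies
slots_per_cycle = int(gamma_hz / theta_hz)   # ~5 gamma subcycles per theta cycle

def discrete_phase_code(spike_times_s):
    """Map spike times to (theta cycle index, gamma slot within the cycle)."""
    theta_cycle = np.floor(spike_times_s * theta_hz).astype(int)
    phase = spike_times_s * theta_hz - theta_cycle          # theta phase in [0, 1)
    gamma_slot = np.minimum((phase * slots_per_cycle).astype(int),
                            slots_per_cycle - 1)
    return list(zip(theta_cycle, gamma_slot))

# Five cells firing at successively later theta phases within one theta cycle.
spikes = np.array([0.010, 0.035, 0.060, 0.085, 0.110])
print(discrete_phase_code(spikes))   # items occupy successive gamma slots 0..4
```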

  10. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.

  11. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  12. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated in the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. The seismic hazard at the site is then calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) Outlines of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) Functions of subprograms and analytical models in them, (3) Guidance on input and output data, and (4) Sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
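
    The two-step calculation described in the record - modelling earthquake sources, then summing the annual frequencies with which their ground motions exceed a given level - can be sketched generically as below. This is a hedged, minimal probabilistic seismic hazard sketch with an illustrative attenuation relation and source list, not SHEAT's actual models or data:

```python
import numpy as np
from scipy.stats import norm

# Illustrative point sources: (annual rate, magnitude, distance to site in km).
sources = [(0.02, 6.5, 30.0), (0.005, 7.2, 60.0), (0.1, 5.5, 15.0)]

def ln_median_pga(m, r_km):
    """Toy attenuation relation for median PGA in g (coefficients are assumptions)."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0)

SIGMA_LN = 0.6   # assumed lognormal standard deviation of the attenuation model

def annual_exceedance(a_g):
    """Annual frequency that PGA at the site exceeds a_g, summed over all sources."""
    lam = 0.0
    for rate, m, r in sources:
        p_exceed = 1.0 - norm.cdf((np.log(a_g) - ln_median_pga(m, r)) / SIGMA_LN)
        lam += rate * p_exceed
    return lam

for a in (0.1, 0.2, 0.4):   # hazard curve points (PGA in g)
    print(f"PGA > {a:.1f} g: {annual_exceedance(a):.2e} per year")
```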

  13. Radiological impact assessment in Malaysia using RESRAD computer code

    International Nuclear Information System (INIS)

    Syed Hakimi Sakuma Syed Ahmad; Khairuddin Mohamad Kontol; Razali Hamzah

    1999-01-01

    Radiological Impact Assessment (RIA) can be conducted in Malaysia using the RESRAD computer code developed by Argonne National Laboratory, U.S.A. The code can perform analyses to derive site-specific guidelines for allowable residual concentrations of radionuclides in soil. The concepts of RIA in the context of waste management concerns in Malaysia, some regulatory information, and the status of data collection are presented. Appropriate use scenarios and site-specific parameters are used as much as possible to be realistic, so as to reasonably ensure that individual dose limits and/or constraints will be met. A case study has been conducted to fulfil Atomic Energy Licensing Board (AELB) requirements, under which the operator must carry out a radiological impact assessment for all proposed disposals in order to demonstrate that no member of the public will be exposed to more than 1 mSv/year from all activities. Results obtained from the analyses show that the RESRAD computer code is able to calculate doses, risks, and guideline values. Sensitivity analysis with the computer code shows that the parameters used as input are justified, which improves the confidence of the public and the AELB in the results of the analysis. The computer code can also be used for an initial screening assessment in order to select a suitable disposal site. (Author)

  14. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  15. Codification of LMFBR rules and comparison of codes

    International Nuclear Information System (INIS)

    Faure, O.; Debaene, J.P.

    1993-01-01

    The first part of this report presents the basic RCC-MR (regles de conception et de construction des materiels mecaniques des ilots nucleaires, reacteurs a neutrons rapides) design rules and their purpose. The second part is a qualitative comparison between RCC-MR, Code case N47 (ASME) and ETSDG Guide (MONJU Guide), made on the following topics: negligible creep test, ratcheting, creep fatigue, buckling, piping rules. An outline is given on improvements to RCC-MR rules now in progress

  16. Review of Rateless-Network-Coding-Based Packet Protection in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    A. S. Abdullah

    2015-01-01

    Full Text Available In recent times, there have been many developments in wireless sensor network (WSN) technologies using coding theory. Fast and efficient schemes for protecting data transfer over the WSN are among the open issues in coding theory. This paper reviews the issues related to the application of joint rateless-network coding (RNC) within the WSN in the context of packet protection. RNC is a method in which any node in the network is allowed to encode and decode the transmitted data in order to construct a robust network, improve network throughput, and decrease delays. To the best of our knowledge, there has been no comprehensive discussion of RNC. To begin with, this paper briefly describes the concept of packet protection using network coding and rateless codes. We then discuss the applications of RNC for improving the capability of packet protection, and several works related to this issue are reviewed. Finally, the paper concludes that the RNC-based packet protection scheme is able to improve the packet reception rate and suggests future studies to enhance the capability of RNC protection.
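
    The rateless component of the RNC schemes reviewed above is typically a fountain code such as an LT code, where each output packet is the XOR of a randomly chosen set of source packets and packets can be generated endlessly until the receiver has enough. As a hedged, minimal sketch of that encoding step only (the degree distribution here is a simplified placeholder, not the robust soliton distribution a real deployment would use):

```python
import random

def lt_encode_packet(source_packets, rng=random):
    """Produce one rateless output packet: (indices of XORed sources, payload)."""
    k = len(source_packets)
    # Simplified degree choice: 1 with probability 0.1, otherwise 2..k uniformly
    # (a placeholder for the robust soliton distribution).
    degree = 1 if rng.random() < 0.1 else rng.randint(2, k)
    indices = rng.sample(range(k), degree)
    payload = bytes(source_packets[indices[0]])
    for i in indices[1:]:
        payload = bytes(a ^ b for a, b in zip(payload, source_packets[i]))
    return indices, payload

# Example: four equal-length source packets from a sensor node.
sources = [b"TEMP=21C", b"HUM=40%.", b"BATT=87%", b"SEQ=0042"]
for _ in range(3):
    idx, pkt = lt_encode_packet(sources)
    print(idx, pkt)
```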

  17. Development and application of a system analysis code for liquid fueled molten salt reactors based on RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Chengbin [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Cheng, Maosong, E-mail: mscheng@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Liu, Guimin [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China)

    2016-08-15

    Highlights: • New point kinetics and thermo-hydraulics models as well as a numerical method are added to the RELAP5 code to make it suitable for liquid fueled molten salt reactors. • The extended RELAP5 code is verified against the experimental benchmarks of the MSRE. • Different transient scenarios of the MSBR are simulated to evaluate its performance during the transients. - Abstract: The molten salt reactor (MSR) is one of the six advanced reactor concepts declared by the Generation IV International Forum (GIF) and can be characterized by attractive attributes such as inherent safety, economic efficiency, natural resource protection, sustainable development and nuclear non-proliferation. It is important to perform system safety analyses for MSR nuclear power plants. In this paper, in order to develop a system analysis code suitable for liquid fueled molten salt reactors, the point kinetics and thermo-hydraulic models as well as the numerical method in the thermal–hydraulic transient code Reactor Excursion and Leak Analysis Program (RELAP5), developed at the Idaho National Engineering Laboratory (INEL) for the U.S. Nuclear Regulatory Commission (NRC), are extended and verified against Molten Salt Reactor Experiment (MSRE) experimental benchmarks. Four transient scenarios of the Molten Salt Breeder Reactor (MSBR), including a load demand change, a primary flow transient, a secondary flow transient and a reactivity transient, are then modeled and simulated with the extended RELAP5 code so as to evaluate the performance of the reactor during these anticipated transient events. The results indicate that the extended RELAP5 code is effective and well suited to liquid fueled molten salt reactors, and that the MSBR has strong inherent safety characteristics because of its large negative reactivity coefficient. In the future, the extended RELAP5 code will be used to perform transient safety analysis for a liquid fueled thorium molten salt reactor named TMSR-LF developed by the Center

  18. Development and application of a system analysis code for liquid fueled molten salt reactors based on RELAP5 code

    International Nuclear Information System (INIS)

    Shi, Chengbin; Cheng, Maosong; Liu, Guimin

    2016-01-01

    Highlights: • New point kinetics and thermo-hydraulics models as well as a numerical method are added to the RELAP5 code to make it suitable for liquid fueled molten salt reactors. • The extended RELAP5 code is verified against the experimental benchmarks of the MSRE. • Different transient scenarios of the MSBR are simulated to evaluate its performance during the transients. - Abstract: The molten salt reactor (MSR) is one of the six advanced reactor concepts declared by the Generation IV International Forum (GIF) and can be characterized by attractive attributes such as inherent safety, economic efficiency, natural resource protection, sustainable development and nuclear non-proliferation. It is important to perform system safety analyses for MSR nuclear power plants. In this paper, in order to develop a system analysis code suitable for liquid fueled molten salt reactors, the point kinetics and thermo-hydraulic models as well as the numerical method in the thermal–hydraulic transient code Reactor Excursion and Leak Analysis Program (RELAP5), developed at the Idaho National Engineering Laboratory (INEL) for the U.S. Nuclear Regulatory Commission (NRC), are extended and verified against Molten Salt Reactor Experiment (MSRE) experimental benchmarks. Four transient scenarios of the Molten Salt Breeder Reactor (MSBR), including a load demand change, a primary flow transient, a secondary flow transient and a reactivity transient, are then modeled and simulated with the extended RELAP5 code so as to evaluate the performance of the reactor during these anticipated transient events. The results indicate that the extended RELAP5 code is effective and well suited to liquid fueled molten salt reactors, and that the MSBR has strong inherent safety characteristics because of its large negative reactivity coefficient. In the future, the extended RELAP5 code will be used to perform transient safety analysis for a liquid fueled thorium molten salt reactor named TMSR-LF developed by the Center

  19. Development of Integrated Code for Risk Assessment (INCORIA) for Physical Protection System

    International Nuclear Information System (INIS)

    Jang, Sung Soon; Seo, Hyung Min; Yoo, Ho Sik

    2010-01-01

    A physical protection system (PPS) integrates people, procedures and equipment for the protection of assets or facilities against theft, sabotage or other malevolent human attacks. Among critical facilities, nuclear facilities and nuclear weapon sites require the highest level of PPS. After the September 11, 2001 terrorist attacks, international communities, including the IAEA, have made substantial efforts to protect nuclear material and nuclear facilities. The international trend in nuclear security is to use the concept of risk assessment. This concept was first devised for nuclear safety, where risk combines the frequency of failure and the possible consequence. Nuclear security likewise considers a security risk composed of the frequency of a threat action, the vulnerability, and the consequences; the idea is that more protection should be provided where a credible threat exists and the possible radiological consequence is high. Although several risk assessment methods exist for nuclear security, applying them requires software tools because of the large amount of calculation involved, and it is hard to find tools covering the whole risk assessment process. Several codes exist for parts of the assessment: SAVI is used for PPS vulnerability, and a vital area identification code is used for consequence analysis. We are developing the Integrated Code for Risk Assessment (INCORIA) to apply risk assessment methods to nuclear facilities. INCORIA evaluates PP-KINAC measures and provides a threat scenario generation tool. PP-KINAC is a set of risk assessment measures for physical protection systems developed by Hosik Yoo and is easy to apply. The threat scenario tool generates threat scenarios, which are used as one of the input values for the PP-KINAC measures
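
    A common way to express the security risk the record refers to (frequency of threat action, vulnerability, consequence) is the widely used physical protection risk relation R = P_A x (1 - P_E) x C, where P_A is the likelihood of attack, P_E the probability that the PPS is effective, and C the consequence value. This is the standard textbook formulation, offered here as a hedged illustration rather than the specific PP-KINAC measures:

```python
def pps_risk(p_attack, p_effectiveness, consequence):
    """Standard physical-protection risk relation R = P_A * (1 - P_E) * C
    (illustrative; not the PP-KINAC measures evaluated by INCORIA)."""
    return p_attack * (1.0 - p_effectiveness) * consequence

# Example: likely threat, moderately effective PPS, high-consequence target.
print(pps_risk(p_attack=0.1, p_effectiveness=0.85, consequence=100.0))  # 1.5
```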

  20. Galen-In-Use: using artificial intelligence terminology tools to improve the linguistic coherence of a national coding system for surgical procedures.

    Science.gov (United States)

    Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F

    1998-01-01

    GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centers co-operating within the European Federation of Coding Centre (EFCC), in order to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools named the CLAssification Manager workbench to process French professional medical language rubrics into intermediate dissections and into the Grail reference ontology model representation. From this language-independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians with the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.

  1. Fast algorithm for two-dimensional data table use in hydrodynamic and radiative-transfer codes

    International Nuclear Information System (INIS)

    Slattery, W.L.; Spangenberg, W.H.

    1982-01-01

    A fast algorithm for finding interpolated atomic data in irregular two-dimensional tables with differing materials is described. The algorithm is tested in a hydrodynamic/radiative transfer code and shown to be of comparable speed to interpolation in regularly spaced tables, which require no table search. The concepts presented are expected to have application in any situation with irregular vector lengths. Also, the procedures that were rejected either because they were too slow or because they involved too much assembly coding are described
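
    The record above concerns looking up and interpolating values in irregularly spaced two-dimensional tables. As a hedged, generic sketch of that task - a bisection search over irregular axis grids followed by bilinear interpolation, not the specific vectorized algorithm of the paper - consider:

```python
import bisect

def interp2_irregular(xs, ys, table, x, y):
    """Bilinear interpolation on an irregular rectangular grid.
    xs, ys are strictly increasing axis values; table[i][j] holds the value at (xs[i], ys[j])."""
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)   # bracketing row
    j = min(max(bisect.bisect_right(ys, y) - 1, 0), len(ys) - 2)   # bracketing column
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i][j] + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1] + tx * ty * table[i + 1][j + 1])

# Example: a small temperature/density opacity table with uneven spacing (illustrative values).
temps = [1.0, 2.0, 5.0, 10.0]
dens = [0.1, 0.5, 2.0]
opacity = [[1.0, 1.5, 2.0], [1.2, 1.8, 2.5], [2.0, 3.0, 4.0], [3.5, 5.0, 7.0]]
print(interp2_irregular(temps, dens, opacity, 3.0, 1.0))
```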

  2. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes together with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  3. Evaluation of SPACE code for simulation of inadvertent opening of spray valve in Shin Kori unit 1

    International Nuclear Information System (INIS)

    Kim, Seyun; Youn, Bumsoo

    2013-01-01

    The SPACE code is expected to be applied to the safety analysis of LOCA (Loss of Coolant Accident) and Non-LOCA scenarios. SPACE solves two-fluid, three-field governing equations and is programmed in the C++ computer language using object-oriented concepts. To evaluate its analysis capability for transient phenomena in an actual nuclear power plant, an inadvertent opening of a spray valve during the startup test phase of Shin Kori unit 1 was simulated with the SPACE code

  4. How a modified approach to dental coding can benefit personal and professional development with improved clinical outcomes.

    Science.gov (United States)

    Lam, Raymond; Kruger, Estie; Tennant, Marc

    2014-12-01

    One disadvantage of the remarkable achievements in dentistry is that treatment options have never been more varied or confusing. This has made the concept of Evidence-Based Dentistry more applicable to modern dental practice. Despite the merit of the concept, whereby clinical decisions are guided by scientific evidence, there are problems with establishing the scientific base. This is nowhere more challenging than in modern dentistry, where the gap between rapidly developing products/procedures and their evidence base is widening. Furthermore, the burden of oral disease remains high at the population level. These problems have prompted new approaches to enhancing research. The aim of this paper is to outline how a modified approach to dental coding may benefit clinical and population level research. Using publicly accessible data obtained from the Australian Chronic Disease Dental Scheme and item codes contained within the Australian Schedule of Dental Services and Glossary, a suggested approach to dental informatics is illustrated. A selection of item codes has been expanded with the addition of suffixes, which provide circumstantial information to assist in assessing clinical outcomes such as success rates and prognosis. The use of item codes in administering the CDDS yielded a large database of item codes amenable to dental informatics, which has been shown to enhance research at both the clinical and population level. This is a cost-effective method to supplement existing research methods. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Application of neutron/gamma transport codes for the design of explosive detection systems

    International Nuclear Information System (INIS)

    Elias, E.; Shayer, Z.

    1994-01-01

    Applications of neutron and gamma transport codes to the design of nuclear techniques for detecting concealed explosive materials are discussed. The methodology of integrating radiation transport computations into the development, optimization and analysis phases of these new technologies is described. Transport and Monte Carlo codes are used to prove concepts, guide system integration, reduce the extent of the experimental program and provide insight into the physical problems involved. The paper concentrates on detection techniques based on thermal and fast neutron interactions in the interrogated object. (authors). 6 refs., 1 tab., 5 figs

  6. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism and can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  7. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  8. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  9. The leadership concept in Iranian nursing.

    Science.gov (United States)

    Memarian, R; Ahmadi, F; Vaismoradi, M

    2008-03-01

    Although greater emphasis has been placed on leadership skills in nursing management in the last decade, the concepts are often confused or used erroneously by Iranian nurses. At the same time we have observed that wide variations in nurses' clinical practice appeared to be related to the presence or absence of leadership skills among senior nurses. To begin to identify the concepts used for expressing leadership in nursing within the Iranian cultural context. A qualitative approach was adopted using content analysis of semi-structured interviews carried out with 10 nurse managers from hospitals in Teheran. The data were analysed using the constant comparative method. Fifty-five primary codes were identified from the respondents' experiences and from these three main themes were abstracted for describing the leadership concept. These were 'personality traits', 'being a model', and 'being a spiritual guide for the nursing profession'. Implementing the culture of patient safety and dignity needs leadership. From Iranian nurse managers' perspectives a leader as a spiritual guide should empower nurses spiritually; it means he/she has a vision for nursing; has clear and explicit objectives; and has a commitment to nursing. Nurses who are confident about the underlying concepts of leadership in their culture can help to adapt nursing to an ever-changing healthcare environment.

  10. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared with existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that the EDW code gives much better performance than the Hadamard and MFH codes.

  11. A UML profile for code generation of component based distributed systems

    International Nuclear Information System (INIS)

    Chiozzi, G.; Karban, R.; Andolfato, L.; Tejeda, A.

    2012-01-01

    A consistent and unambiguous implementation of code generation (model-to-text transformation) from UML must rely on a well-defined UML (Unified Modelling Language) profile, customizing UML for a particular application domain. Such a profile must have a solid foundation in a formally correct ontology, formalizing the concepts and their relations in the specific domain, in order to avoid a maze of wildly created stereotypes. The paper describes a generic profile for the code generation of component-based distributed systems for control applications, the process used to distill the ontology and define the profile, and the strategy followed to implement the code generator. The main steps, which take place iteratively, include: defining the terms and relations with an ontology, mapping the ontology to the appropriate UML meta-classes, testing the profile by creating modelling examples, and generating the code. This has allowed us to work on the modelling of the E-ELT (European Extremely Large Telescope) control system and instrumentation without knowing what infrastructure will finally be used.
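
    As a hedged, minimal illustration of the model-to-text idea discussed in this record, the sketch below turns a toy, invented model of stereotyped components into code skeletons with a simple template. The stereotype names, component fields and output form are hypothetical and are not the profile defined in the paper.

```python
from string import Template

# Toy "model": a list of components tagged with made-up stereotypes.
model = [
    {"stereotype": "CommandComponent", "name": "LampController", "commands": ["on", "off"]},
    {"stereotype": "MonitorComponent", "name": "TempSensor", "commands": ["read"]},
]

# Templates play the role of the model-to-text transformation rules.
component_tmpl = Template(
    "class $name:  # generated from <<$stereotype>>\n"
    "$methods"
)
method_tmpl = Template("    def $cmd(self):\n        raise NotImplementedError\n")

for comp in model:
    methods = "".join(method_tmpl.substitute(cmd=c) for c in comp["commands"])
    print(component_tmpl.substitute(name=comp["name"],
                                    stereotype=comp["stereotype"],
                                    methods=methods))
```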

  12. Bilayer Protograph Codes for Half-Duplex Relay Channels

    Science.gov (United States)

    Divsalar, Dariush; VanNguyen, Thuy; Nosratinia, Aria

    2013-01-01

    re-optimization. The main problem of half-duplex relay coding can be reduced to the simultaneous design of two codes at two rates and two SNRs (signal-to-noise ratios), such that one is a subset of the other. This problem can be addressed by forceful optimization, but a clever method of addressing this problem is via the bilayer lengthened (BL) LDPC structure. This method uses a bilayer Tanner graph to make the two codes while using a concept of "parity forwarding" with subsequent successive decoding that removes the need to directly address the issue of uneven SNRs among the symbols of a given codeword. This method is attractive in that it addresses some of the main issues in the design of relay codes, but it does not by itself give rise to highly structured codes with simple encoding, nor does it give rate-compatible codes. The main contribution of this work is to construct a class of codes that simultaneously possess a bilayer parity-forwarding mechanism, while also benefiting from the properties of protograph codes: an easy encoding, a modular design, and rate compatibility.
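
    The following sketch illustrates the generic protograph "lifting" step that structured protograph LDPC codes rely on: each entry of a small base matrix is expanded into a Z x Z circulant permutation block. The base matrix and lifting size are invented for illustration; the bilayer construction of the paper is not reproduced.

```python
import numpy as np

# Lift a protograph base matrix into a full parity-check matrix by replacing
# each nonzero entry with a randomly shifted Z x Z circulant permutation
# matrix (entries > 1 would correspond to parallel edges, summed modulo 2).
def lift_protograph(base, Z, rng=np.random.default_rng(0)):
    m, n = base.shape
    H = np.zeros((m * Z, n * Z), dtype=np.uint8)
    I = np.eye(Z, dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            for _ in range(int(base[i, j])):
                shift = int(rng.integers(Z))
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] ^= np.roll(I, shift, axis=1)
    return H

base = np.array([[1, 1, 1, 0],
                 [0, 1, 1, 1]])        # toy protograph, not the paper's
H = lift_protograph(base, Z=8)
print(H.shape)                          # (16, 32)
```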

  13. Functions of Code-Switching among Iranian Advanced and Elementary Teachers and Students

    Science.gov (United States)

    Momenian, Mohammad; Samar, Reza Ghafar

    2011-01-01

    This paper reports on the findings of a study carried out on advanced and elementary teachers' and students' functions and patterns of code-switching in Iranian English classrooms. This concept has been examined less adequately in L2 (second language) classroom contexts than in natural contexts outside the classroom. Therefore, besides reporting on the…

  14. Optical information encryption based on incoherent superposition with the help of the QR code

    Science.gov (United States)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. This method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted analytically into two phase-only masks by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over previous interference-based methods, such as a higher security level, better robustness against noise attack, a more relaxed working condition, and so on. Numerical simulation results and actual smartphone-collected results are shown to validate our proposal.

  15. Energy-Efficient Cluster Based Routing Protocol in Mobile Ad Hoc Networks Using Network Coding

    Directory of Open Access Journals (Sweden)

    Srinivas Kanakala

    2014-01-01

    Full Text Available In mobile ad hoc networks, all nodes are energy constrained. In such situations, it is important to reduce energy consumption. In this paper, we consider the issues of energy-efficient communication in MANETs using network coding. Network coding is an effective method to improve the performance of wireless networks. The COPE protocol implements the network coding concept to reduce the number of transmissions by mixing packets at intermediate nodes. We incorporate COPE into a cluster-based routing protocol to further reduce the energy consumption. The proposed energy-efficient coding-aware cluster-based routing protocol (ECCRP) scheme applies network coding at cluster heads to reduce the number of transmissions. We also modify the queue management procedure of the COPE protocol to further improve coding opportunities, and we use an energy-efficient scheme while selecting the cluster head, which helps to increase the lifetime of the network. We evaluate the performance of the proposed energy-efficient cluster-based protocol using simulation. Simulation results show that the proposed ECCRP algorithm reduces energy consumption and increases the lifetime of the network.
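
    A minimal sketch of the XOR coding operation that COPE-style schemes perform at an intermediate node is given below; packet contents and names are invented, and cluster-head selection and queue management are not modeled.

```python
# A relay XORs two packets heading in opposite directions and broadcasts the
# result once; each endpoint recovers the packet meant for it by XORing with
# the packet it originally sent.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

pkt_from_A = b"hello-from-A"
pkt_from_B = b"hi-from-B"

coded = xor_bytes(pkt_from_A, pkt_from_B)   # one broadcast instead of two unicasts

# Node A knows its own packet, so it can recover B's (and vice versa).
recovered_at_A = xor_bytes(coded, pkt_from_A).rstrip(b"\x00")
assert recovered_at_A == pkt_from_B
print(recovered_at_A)
```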

  16. User's manual for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    Energy Technology Data Exchange (ETDEWEB)

    Dougan, K D; Wojciechowski, L C

    1995-06-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is represented by the CC3 (Canadian Concept, generation 3) models comprising a design-specific vault, a site-specific geosphere and a biosphere typical of the Canadian Shield. (author). 23 refs., 7 tabs., 21 figs.
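
    The repeated-simulation structure of a SYVAC-style systems variability analysis can be illustrated with the toy Monte Carlo loop below. The model and parameter distributions are invented solely to show the sampling structure; they are not the CC3 vault, geosphere or biosphere models.

```python
import random

# Executive loop: sample uncertain parameters, run a simple system model,
# collect the distribution of the output measure over many simulations.
def one_simulation(rng):
    release_rate = rng.lognormvariate(mu=-14, sigma=1.0)   # hypothetical units
    travel_time = rng.uniform(1e4, 1e6)                    # years, hypothetical
    dilution = rng.uniform(1e5, 1e7)                       # hypothetical
    return release_rate * travel_time / dilution            # arbitrary consequence proxy

rng = random.Random(42)
results = sorted(one_simulation(rng) for _ in range(10_000))
print(f"95th percentile of the consequence proxy: {results[int(0.95 * len(results))]:.3e}")
```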

  17. Cultural Anthropology Study on Historical Narrative and Jade Mythological Concepts in Records of the Great Historian: Annals of the First Emperor of Qin

    Directory of Open Access Journals (Sweden)

    JUAN WU

    2016-10-01

    Full Text Available This paper takes Records of the Great Historian: Annals of the First Emperor of Qin, an essential historical narrative at the dawning of Chinese civilization, as a case to illustrate the causality of historical incidents and the underlying mythological concepts, reveal the underlying mythological concepts that dominate the ritual behaviors and narrative expressions, and highlight the prototype function of mythological concepts in the man’s behavior and ideology construction. Once the prototype of certain cultural community is revealed, the evolvement track of its historical cultural texts and the operative relations between coding and re-coding will be better understood.

  18. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; TrUbnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code, used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of the spectrometer differential nonlinearity to +0.7% in 98% of the measured range. The conversion of the continuous code corresponding to the input signal amplitude into the Grey code exploits the regular alternation of ones and zeroes in each bit of the Grey code as the count of the continuous code changes continuously. The converter is built from elements of the 155 series; the pulse repetition frequency of the continuous code at the converter input is 25 MHz.
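
    A software analogue of the binary-to-Gray conversion performed by the hardware converter is sketched below (the hardware operates on a pulse-count code; here an integer count stands in for it).

```python
# Binary <-> Gray code conversion; gray = n XOR (n >> 1).
def binary_to_gray(n: int) -> int:
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

for count in range(8):
    print(f"{count:03b} -> {binary_to_gray(count):03b}")
# Round-trip check over a 12-bit range, matching the converter's word length.
assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(4096))
```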

  19. Plant Control Concept for the Sodium Cooled Fast Reactor

    International Nuclear Information System (INIS)

    Kim, Eui Kwang; Kim, S. O.

    2010-12-01

    A power plant is designed for incorporation into a utility's grid system and follows the load demand from the nuclear core through the intermediate heat exchanger (IHX) and the steam generator. During load-following transients, various plant parameters must be controlled to protect the reactor core and other components in the plant. The purpose of this report is to review the design considerations for establishing SFR plant control and to design plant control concepts. The governing equations and solution procedure of the computer code that calculates plant temperature conditions during part-load operation were reviewed, four types of plant operation concepts were designed, and the results of the calculations were compared.

  20. Death of a dogma: eukaryotic mRNAs can code for more than one protein.

    Science.gov (United States)

    Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier

    2016-01-08

    mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5' UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3' UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
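
    The point that a single mRNA can carry several open reading frames besides the annotated CDS can be made concrete with a toy ORF scanner such as the sketch below; the sequence is invented and only the standard start/stop codons are considered.

```python
# Scan all three reading frames of an mRNA-like sequence (DNA alphabet used
# for simplicity) and report ORFs of at least `min_codons` codons.
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq: str, min_codons: int = 10):
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOPS and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append((frame, start, i + 3))
                start = None
    return orfs

mrna = "GCCATGGCT" + "GCA" * 30 + "TAAACCATG" + "GGC" * 15 + "TGA"
print(find_orfs(mrna, min_codons=5))   # two ORFs found in the same transcript
```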

  1. Wavelength-encoding/temporal-spreading optical code division multiple-access system with in-fiber chirped moiré gratings.

    Science.gov (United States)

    Chen, L R; Smith, P W; de Sterke, C M

    1999-07-20

    We propose an optical code division multiple-access (OCDMA) system that uses in-fiber chirped moiré gratings (CMG's) for encoding and decoding of broadband pulses. In reflection the wavelength-selective and dispersive nature of CMG's can be used to implement wavelength-encoding/temporal-spreading OCDMA. We give examples of codes designed around the constraints imposed by the encoding devices and present numerical simulations that demonstrate the proposed concept.

  2. A study of concept-based similarity approaches for recommending program examples

    Science.gov (United States)

    Hosseini, Roya; Brusilovsky, Peter

    2017-07-01

    This paper investigates a range of concept-based example recommendation approaches that we developed to provide example-based problem-solving support in the domain of programming. The goal of these approaches is to offer students a set of most relevant remedial examples when they have trouble solving a code comprehension problem where students examine a program code to determine its output or the final value of a variable. In this paper, we use the ideas of semantic-level similarity-based linking developed in the area of intelligent hypertext to generate examples for the given problem. To determine the best-performing approach, we explored two groups of similarity approaches for selecting examples: non-structural approaches focusing on examples that are similar to the problem in terms of concept coverage and structural approaches focusing on examples that are similar to the problem by the structure of the content. We also explored the value of personalized example recommendation based on student's knowledge levels and learning goal of the exercise. The paper presents concept-based similarity approaches that we developed, explains the data collection studies and reports the result of comparative analysis. The results of our analysis showed better ranking performance of the personalized structural variant of cosine similarity approach.
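
    A minimal sketch of the non-structural, concept-coverage flavour of similarity described in this record is given below: problems and examples are represented as bags of concepts and ranked by cosine similarity. The concept names and counts are invented.

```python
import math

# Rank candidate examples by cosine similarity of their concept vectors to
# the concept vector of the problem being solved.
def cosine(a: dict, b: dict) -> float:
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

problem = {"for-loop": 2, "array": 1, "if": 1}
examples = {
    "ex1": {"for-loop": 1, "array": 2},
    "ex2": {"while-loop": 1, "if": 2},
    "ex3": {"for-loop": 2, "if": 1, "array": 1},
}
ranked = sorted(examples, key=lambda e: cosine(problem, examples[e]), reverse=True)
print(ranked)   # most concept-similar example first
```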

  3. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.

  4. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  5. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  6. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  7. Trust, Personal Moral Codes, and the Resource-Advantage Theory of Competition: Explaining Productivity, Economic Growth, and Wealth Creation

    Directory of Open Access Journals (Sweden)

    Shelby D. Hunt

    2012-06-01

    Full Text Available Scholars agree that societal-level moral codes that promote social trust also promote wealth creation. However, what specific kinds of societal-level moral codes promote social trust? Also, by what specific kind of competitive process does social trust promote wealth creation? Because societal-level moral codes are composed of or formed from people's personal moral codes, this article explores a theory of ethics, known as the "Hunt-Vitell" theory of ethics, that illuminates the concept of personal moral codes, and uses the theory to discuss which types of personal moral codes foster trust and distrust in society. This article then uses resource-advantage (R-A) theory, one of the most completely articulated dynamic theories of competition, to show the process by which trust-promoting, societal-level moral codes promote productivity and economic growth. That is, they promote wealth creation.

  8. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  9. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    The TASS 1.0 code has been developed at KAERI for initial and reload non-LOCA safety analysis of the PWRs operating and under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea; this can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems, components and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analyses for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  10. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...
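
    The density knob that tunable sparse network coding exposes can be illustrated with the sketch below, which draws coding coefficients over GF(2) with a given probability of being nonzero and XORs the selected packets. The field choice, packet sizes and tuning policy are simplifying assumptions.

```python
import random

# Draw a sparse GF(2) coefficient vector: lower density means fewer source
# packets are mixed into each coded packet (cheaper decoding, lower goodput).
def sparse_coefficients(generation_size: int, density: float, rng: random.Random):
    coeffs = [1 if rng.random() < density else 0 for _ in range(generation_size)]
    if not any(coeffs):                        # avoid the useless all-zero vector
        coeffs[rng.randrange(generation_size)] = 1
    return coeffs

def encode(packets, coeffs):
    size = len(packets[0])
    coded = bytearray(size)
    for pkt, c in zip(packets, coeffs):
        if c:
            for i in range(size):
                coded[i] ^= pkt[i]
    return bytes(coded)

rng = random.Random(1)
generation = [bytes([rng.randrange(256)]) * 16 for _ in range(8)]   # toy packets
for density in (0.1, 0.5, 1.0):
    c = sparse_coefficients(8, density, rng)
    print(density, c, encode(generation, c).hex()[:8])
```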

  11. Technology concept in the view of Iranian nurses.

    Science.gov (United States)

    Mehraban, Marzieh Adel; Hassanpour, Marzieh; Yazdannik, Ahmadreza; Ajami, Sima

    2013-05-01

    Over the years, the concept of technology has been modified, especially from the viewpoint of the development of scientific knowledge as well as its philosophical and artistic aspects. However, the concept of technology in nursing is still poorly understood. Only small qualitative studies, especially in Iran, have investigated this phenomenon, and they deal only with information technology. The aim of this study is to gain a better understanding of the concept of technology in the view of Iranian nurses. This was a qualitative explorative study carried out with a purposeful sample of 23 nurses (staff nurses, supervisors and chief nurse managers) working in Isfahan hospitals. Unstructured interviews included 13 individual interviews and 2 focus-group interviews. In addition, field notes and memos were used in data collection. The data were then transcribed, and conventional content analysis was used for data coding and classification. The results showed that there are various definitions of technology among nurses. In the view of nurses, technology means using new equipment, computers, information technology, and so on. Data analysis revealed that nurses understand technology through three main concepts: change, equipment and knowledge. A deeper review of the categories showed that the most important concept of technology from the nursing perspective is equipment. Therefore, it is necessary to develop a deeper understanding of the possible concepts of technology among nurses. We suppose that technology concepts must be defined separately in each discipline.

  12. A primer on physical-layer network coding

    CERN Document Server

    Liew, Soung Chang; Zhang, Shengli

    2015-01-01

    The concept of physical-layer network coding (PNC) was proposed in 2006 for application in wireless networks. Since then it has developed into a subfield of communications and networking with a wide following. This book is a primer on PNC. It is the outcome of a set of lecture notes for a course for beginning graduate students at The Chinese University of Hong Kong. The target audience is expected to have some prior background knowledge in communication theory and wireless communications, but not working knowledge at the research level. Indeed, a goal of this book/course is to allow the reader

  13. Validation of the ORIGEN-S code for predicting radionuclide inventories in used CANDU fuel

    International Nuclear Information System (INIS)

    Tait, J.C.; Gauld, I.; Kerr, A.H.

    1995-01-01

    The safety assessment being conducted by AECL Research for the concept of deep geological disposal of used CANDU UO 2 fuel requires the calculation of radionuclide inventories in the fuel to provide source terms for radionuclide release. This report discusses the validation of selected actinide and fission-product inventories calculated using the ORIGEN-S code coupled with the WIMS-AECL lattice code, using data from analytical measurements of radioisotope inventories in Pickering CANDU reactor fuel. The recent processing of new ENDF/B-VI cross-section data has allowed the ORIGEN-S calculations to be performed using the most up-to-date nuclear data available. The results indicate that the code is reliably predicting actinide and the majority of fission-product inventories to within the analytical uncertainty. ((orig.))

  14. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  15. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  16. Crucial steps to life: From chemical reactions to code using agents.

    Science.gov (United States)

    Witzany, Guenther

    2016-02-01

    The concepts of the origin of the genetic code and the definitions of life changed dramatically after the RNA world hypothesis. Main narratives in molecular biology and genetics, such as the "central dogma," "one gene one protein" and "non-coding DNA is junk," have meanwhile been falsified. RNA moved from being a transition intermediate molecule to centre stage. Additionally, the abundance of empirical data concerning non-random genetic change operators, such as the variety of mobile genetic elements, persistent viruses and defectives, does not fit with the dominant narrative of error replication events (mutations) as the main driving force creating genetic novelty and diversity. The reductionistic and mechanistic views on physico-chemical properties of the genetic code are no longer convincing as appropriate descriptions of the abundance of non-random genetic content operators which are active in natural genetic engineering and natural genome editing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. GOTHIC code evaluation of alternative passive containment cooling features

    International Nuclear Information System (INIS)

    Gavrilas, M.; Todreas, E.N.; Driscoll, M.J.

    1996-01-01

    Reliance on passive cooling has become an important objective in containment design. Several reactor concepts have been set forth which are equipped with entirely passively cooled containments. However, the problems that have to be overcome in rejecting the entire heat generated by a severe accident in a high-rating reactor (i.e. one with a rating greater than 1200 MW(e)) have been found to be substantial and without obvious solutions. The GOTHIC code was verified and modified for containment cooling applications; optimal mesh sizes, computational time steps and applicable heat transfer correlations were examined. The effect of the break location on the circulation patterns that develop inside the containment was also evaluated. The GOTHIC code was then employed to assess the effectiveness of several original heat rejection features that make it possible to cool high-rating containments. Two containment concepts were evaluated: one for a 1200 MW(e) new pressure tube light-water reactor, and one for a 1300 MW(e) pressurized-water reactor. The effectiveness of various containment configurations that include specific pressure-limiting features has been predicted. For the best-performing configurations, the worst-case accident scenarios that were examined yielded peak pressures of less than 0.30 MPa for the 1200 MW(e) pressure tube light-water reactor, and less than 0.45 MPa for the 1300 MW(e) pressurized-water reactor. (orig.)

  18. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  19. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.
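
    The specific code difference cited in this record (CUN codons read as leucine in the universal code but as threonine in yeast mitochondria) can be shown with a trivial lookup sketch; only the relevant entries are included.

```python
# Partial codon tables: only the CUN family is listed, since that is the
# difference discussed above; full tables are omitted for brevity.
UNIVERSAL = {c: "Leu" for c in ("CUU", "CUC", "CUA", "CUG")}
YEAST_MITO = {c: "Thr" for c in ("CUU", "CUC", "CUA", "CUG")}

for codon in ("CUU", "CUA"):
    print(codon, "universal:", UNIVERSAL[codon], "| yeast mitochondria:", YEAST_MITO[codon])
```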

  20. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    Science.gov (United States)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

    The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons. To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions using realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator that is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.

  1. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  2. DarcyTools version 3.4 - Concepts, Methods and Equations

    International Nuclear Information System (INIS)

    Svensson, Urban; Kuylenstierna, Hans-Olof; Ferry, Michel

    2010-12-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flows, which is the class of flows in mind. DarcyTools is a general code for this class of problems, but the analysis of a repository for nuclear waste is the main intended application. A number of novel features are introduced in DarcyTools. The most fundamental is perhaps the method to generate grid properties (DarcyTools is a continuum porous-media code); a fracture network, with properties given to each fracture, is represented in the computational grid by a method that is based on intersecting volumes (fracture volumes and grid cell volumes). This method is believed to result in very accurate anisotropy and connectivity properties. The report focuses on the concepts, assumptions, equations and key features of DarcyTools. The main part of the report is fairly short; a number of appendices give more detailed accounts of various aspects of the code

  3. DarcyTools version 3.4 - Concepts, Methods and Equations

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban; Kuylenstierna, Hans-Olof (Computer-aided Fluid Engineering AB, Lyckeby (Sweden)); Ferry, Michel (MFRDC, Orvault (France))

    2010-12-15

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flows, which is the class of flows in mind. DarcyTools is a general code for this class of problems, but the analysis of a repository for nuclear waste is the main intended application. A number of novel features are introduced in DarcyTools. The most fundamental is perhaps the method to generate grid properties (DarcyTools is a continuum porous-media code); a fracture network, with properties given to each fracture, is represented in the computational grid by a method that is based on intersecting volumes (fracture volumes and grid cell volumes). This method is believed to result in very accurate anisotropy and connectivity properties. The report focuses on the concepts, assumptions, equations and key features of DarcyTools. The main part of the report is fairly short; a number of appendices give more detailed accounts of various aspects of the code.

  4. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  5. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    Science.gov (United States)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats. One effort to reduce the number of cybercrimes is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing data-security sciences, and combining them is one way to improve data integrity. New techniques are created by combining several algorithms, one of which is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field; it consists of dots and dashes. This is a new concept that combines modern and classical methods to maintain data integrity. The result of combining these three methods is expected to yield new algorithms that improve the security of data, especially images.
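
    As a hedged illustration of the classical Hill cipher named in this record, the sketch below encrypts letter pairs with a 2x2 key matrix modulo 26. The key and plaintext are examples only; the paper's combined Hill/Morse/LSB scheme is not reproduced.

```python
import numpy as np

# 2x2 Hill cipher over the 26-letter alphabet: blocks of two letters are
# multiplied by the key matrix modulo 26.
KEY = np.array([[3, 3],
                [2, 5]])          # invertible mod 26 (det = 9, gcd(9, 26) = 1)

def hill_encrypt(plain: str, key: np.ndarray) -> str:
    nums = [ord(c) - ord('A') for c in plain.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))        # pad to a whole number of pairs
    out = []
    for i in range(0, len(nums), 2):
        block = np.array(nums[i:i + 2])
        out.extend((key @ block) % 26)
    return "".join(chr(int(v) + ord('A')) for v in out)

print(hill_encrypt("HELP", KEY))   # -> 'HIAT' with this key and plaintext
```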

  6. User and reference manual for the KfK code INS

    International Nuclear Information System (INIS)

    Daum, E.

    1993-09-01

    The INS code (Intense Neutron Source) serves to calculate uncollided neutron flux contours, neutron flux volumes and spatially dependent neutron flux spectra in the test cell of an intense neutron source of the t-H2O or d-Li concept. With the information from the neutron flux spectra, neutron irradiation damage quantities such as displacements per atom (DPA), H- and He-production rates and the generation of foreign elements by transmutation can be calculated for any element at any position in the test cell. This manual gives an introduction to the theory of thick-target neutron flux calculations and of neutron irradiation damage calculations. It explains how the code works and how the input and output parameters are handled. An example is given for each application of the several code modules. The results, such as contours, spectra, flux volumes and damage rates, are summarized in tabular form and graphically. Damage and element transmutation data have been calculated for 23 isotopes and compared with the DEMO 1st wall values. (orig./HP) [de]

  7. Field-programmable beam reconfiguring based on digitally-controlled coding metasurface

    Science.gov (United States)

    Wan, Xiang; Qi, Mei Qing; Chen, Tian Yi; Cui, Tie Jun

    2016-02-01

    Digital phase shifters have been applied in traditional phased-array antennas to realize beam steering. However, the phase shifter acts on the phase of the induced current and hence has to sit in the path of each element of the antenna array, making phased-array antennas very expensive. Metamaterials and/or metasurfaces enable the direct modulation of electromagnetic waves by designing subwavelength structures, which opens a new way to control beam scanning. Here, we present a direct digital mechanism to control the scattered electromagnetic waves using a coding metasurface, in which each unit cell loads a pin diode to produce the binary coding states "1" and "0". Through data lines, instant communication is established between the coding metasurface and the internal memory of a field-programmable gate array (FPGA). Thus, we realize the digital modulation of electromagnetic waves, from which we present a field-programmable reflective antenna with good measured performance. The proposed mechanism and functional device have great application potential in new-concept radar and communication systems.
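
    The beam-control idea behind a 1-bit coding metasurface can be illustrated with a toy one-dimensional array-factor calculation: each unit cell reflects with phase 0 (state "0") or pi (state "1"), and changing the stored coding sequence moves the scattered main lobe. The element spacing, frequency and coding sequences below are assumed values.

```python
import numpy as np

c0 = 3e8
freq = 10e9                        # assumed operating frequency
lam = c0 / freq
d = 0.5 * lam                      # assumed unit-cell period
k = 2 * np.pi / lam

theta = np.radians(np.linspace(-90, 90, 721))

def array_factor(coding_bits):
    # State "1" reflects with an extra pi phase; sum the element contributions.
    phases = np.pi * np.asarray(coding_bits, dtype=float)
    n = np.arange(len(coding_bits))
    terms = np.exp(1j * (phases[:, None] + k * d * n[:, None] * np.sin(theta)))
    return np.abs(terms.sum(axis=0))

for bits in ([0] * 16, [0, 0, 1, 1] * 4):      # "000...0" vs periodic "0011"
    af = array_factor(bits)
    print(bits[:4], "-> main lobe near", round(float(np.degrees(theta[np.argmax(af)])), 1), "deg")
```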

  8. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Background: Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they generally escape detection by comparative genomics approaches. Results: We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion: Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  9. A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals

    Science.gov (United States)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.

    1994-01-01

    Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.

  10. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
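
    For background to the dynamic variant discussed in this record, the sketch below implements plain static Shannon coding, where a symbol of probability p receives a codeword of length ceil(-log2 p) taken from the binary expansion of the cumulative probability; the paper's dynamic algorithm is not reproduced.

```python
import math

# Static Shannon code: sort symbols by decreasing probability and read each
# codeword off the binary expansion of the running cumulative probability.
def shannon_code(probs: dict) -> dict:
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, cum = {}, 0.0
    for sym, p in items:
        length = math.ceil(-math.log2(p))
        frac, bits = cum, []
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            frac -= bit
            bits.append(str(bit))
        code[sym] = "".join(bits)
        cum += p
    return code

print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```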

  11. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points ⟨c⟩ ∈ PC with multiplicity γ(w), where w is the weight of c.

  12. A New Concept to Transport a Droplet on Horizontal Hydrophilic/Hydrophobic Surfaces

    International Nuclear Information System (INIS)

    Myong, Hyon Kook

    2014-01-01

    A fluid transport technique is a key issue for the development of microfluidic systems. In this paper, a new concept for transporting a droplet without external power sources is proposed and verified numerically. The proposed device is a heterogeneous surface which has both hydrophilic and hydrophobic horizontal surfaces. The numerical simulation to demonstrate the new concept is conducted with an in-house solution code (PowerCFD), which employs an unstructured cell-centered method based on a conservative pressure-based finite-volume method with an interface-capturing method (CICSAM) in a volume-of-fluid (VOF) scheme for phase-interface capturing. The proposed concept is found to show superior performance for droplet transport in microfluidic systems.

  13. Basic Concepts of Reading Instruction

    Directory of Open Access Journals (Sweden)

    Gökhan ARI

    2017-12-01

    Full Text Available The act of reading is performed through connected physiological, psychological and cognitive processes. The operations taking place in these processes are expected to continue throughout life, developed by means of certain strategies. A great deal of information is gained through reading skill during one's education. Therefore, the basic concepts that constitute reading education are important for teachers in teaching and improving reading. The aim of this study is to present information compiled from the literature about the reading education process and the basic concepts used in reading education. In teaching reading, part-to-whole, whole-to-part and interactional approaches are used, with the part-to-whole approach at the forefront. Then, with interactional approach strategies, both code solving and making sense are improved. Teachers should know the characteristics of the bouncing, stopping, turning back and scanning movements of the eye, both in code solving and in making sense. The teacher should structure instruction so that students gain the elements of fluent reading by making use of reading aloud and reading silently. After the act of reading is acquired, good reader characteristics should be developed by improving questioning, predicting, summarizing and interpretation skills in integrated readings. Reading skill is improved by work on texts; therefore, students should encounter texts that are suitable to their levels and to textuality and readability criteria. The vocabulary of children should be improved in a planned way with text-based word and meaning studies. The fluent reading, comprehension and interpretation skills of children should be monitored with different types of evaluation. In the long term, work should be done to make reading a habit for them.

  14. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with a regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of a plant application. An education and training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  15. Playing Music, Playing with Music: A Proposal for Music Coding in Primary School

    Science.gov (United States)

    Baratè, Adriano; Ludovico, Luca Andrea; Mangione, Giuseppina Rita; Rosa, Alessia

    2015-01-01

    In this work we will introduce the concept of "music coding," namely a new discipline that employs basic music activities and simplified languages to teach the computational way of thinking to musically untrained children who attend primary school. In this context, music represents both a means and a goal: in fact, from one side…

  16. Chemical shift-dependent apparent scalar couplings: An alternative concept of chemical shift monitoring in multi-dimensional NMR experiments

    International Nuclear Information System (INIS)

    Kwiatkowski, Witek; Riek, Roland

    2003-01-01

    The paper presents an alternative technique for chemical shift monitoring in a multi-dimensional NMR experiment. The monitored chemical shift is coded in the line-shape of a cross-peak through an apparent residual scalar coupling active during an established evolution period or acquisition. The size of the apparent scalar coupling is manipulated with an off-resonance radio-frequency pulse in order to correlate the size of the coupling with the position of the additional chemical shift. The strength of this concept is that chemical shift information is added without an additional evolution period and accompanying polarization transfer periods. This concept was incorporated into the three-dimensional triple-resonance experiment HNCA, adding the information of 1Hα chemical shifts. The experiment is called HNCA coded HA, since the chemical shift of 1Hα is coded in the line-shape of the cross-peak along the 13Cα dimension.

  17. Concept - or no concept

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1999-01-01

    A discussion of the notion of 'concept' in industrial companies. A method for mapping the managerial concept in a specific area is shown.

  18. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  19. Concepts of formal concept analysis

    Science.gov (United States)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of 'concept'. Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of conceptual processing of data and knowledge, namely, the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
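
    The extension/intension machinery of FCA can be made concrete with a naive enumeration of the formal concepts of a tiny, invented context (objects x attributes), as sketched below; real FCA tools use faster algorithms such as NextClosure, but the closure idea is the same.

```python
from itertools import combinations

# Toy formal context: objects and the attributes they possess.
context = {
    "duck":  {"flies", "swims", "has-feathers"},
    "eagle": {"flies", "has-feathers"},
    "trout": {"swims"},
}
attributes = set().union(*context.values())

def extent(attrs):            # objects having all the given attributes
    return {g for g, a in context.items() if attrs <= a}

def intent(objs):             # attributes shared by all the given objects
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

# Enumerate formal concepts (extent, intent) by closing every attribute subset.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        e = extent(set(attrs))
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(e), sorted(i))
```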

  20. BANKING ETHICS: MAIN CONCEPTIONS AND PROBLEMS

    Directory of Open Access Journals (Sweden)

    VALENTINA FETINIUC

    2014-10-01

    Full Text Available Banking ethics is a specialized set of ethical standards and rules that should be followed in the activities of financial institutions and employees of the banking sector. Despite the simplicity of this definition, in the modern world the concept becomes complex and ambiguous. The importance of studying this subject lies in the fact that the ethical behavior of banks and bank employees promotes banking. At present there are several conceptions of banking ethics: general ethics, regulated ethics and the ethical bank. The most common practice is to regulate the internal and external relations of banks and bank workers with ethical codes. At the same time, studies show the existence of problems in banking standards of ethics, which negatively affect financial institutions. This article is intended to reflect the main tendencies and problems of banking ethics at the international level and the experience of the Republic of Moldova in this field.

  1. Validation of the ORIGEN-S code for predicting radionuclide inventories in used CANDU Fuel

    International Nuclear Information System (INIS)

    Tait, J.C.; Gauld, I.; Kerr, A.H.

    1994-10-01

    The safety assessment being conducted by AECL Research for the concept of deep geological disposal of used CANDU UO2 fuel requires the calculation of radionuclide inventories in the fuel to provide source terms for radionuclide release. This report discusses the validation of selected actinide and fission-product inventories calculated using the ORIGEN-S code coupled with the WIMS-AECL lattice code, using data from analytical measurements of radioisotope inventories in Pickering CANDU reactor fuel. The recent processing of new ENDF/B-VI cross-section data has allowed the ORIGEN-S calculations to be performed using the most up-to-date nuclear data available. The results indicate that the code is reliably predicting actinide and the majority of fission-product inventories to within the analytical uncertainty. 38 refs., 4 figs., 5 tabs.

  2. New quantum codes derived from a family of antiprimitive BCH codes

    Science.gov (United States)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
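
    The q^2-ary cyclotomic cosets modulo n that drive the dual-containing analysis are easy to generate. The hedged sketch below (not the authors' code) lists them for the smallest case mentioned in the abstract, q = 4 and m = 2, so n = q^(2m) + 1 = 257.

      def cyclotomic_cosets(q2, n):
          """Partition {0, 1, ..., n-1} into q2-ary cyclotomic cosets modulo n."""
          seen, cosets = set(), []
          for s in range(n):
              if s in seen:
                  continue
              coset, x = [], s
              while x not in coset:          # follow the orbit s, s*q2, s*q2^2, ... mod n
                  coset.append(x)
                  x = (x * q2) % n
              seen.update(coset)
              cosets.append(sorted(coset))
          return cosets

      q, m = 4, 2                            # smallest case from the abstract
      q2, n = q * q, q ** (2 * m) + 1        # q^2 = 16, n = 257
      cosets = cyclotomic_cosets(q2, n)
      print(f"n = {n}: {len(cosets)} cosets, sizes {sorted({len(c) for c in cosets})}")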

  3. Surface acoustic wave coding for orthogonal frequency coded devices

    Science.gov (United States)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  4. Enhancing Elementary Pre-service Teachers' Plant Processes Conceptions

    Science.gov (United States)

    Thompson, Stephen L.; Lotter, Christine; Fann, Xumei; Taylor, Laurie

    2016-06-01

    Researchers examined how an inquiry-based instructional treatment emphasizing interrelated plant processes influenced 210 elementary pre-service teachers' (PTs) conceptions of three plant processes, photosynthesis, cellular respiration, and transpiration, and the interrelated nature of these processes. The instructional treatment required PTs to predict the fate of a healthy plant in a sealed terrarium (Plant-in-a-Jar), justify their predictions, observe the plant over a 5-week period, and complete guided inquiry activities centered on one of the targeted plant processes each week. Data sources included PTs' pre- and post-predictions with accompanying justifications, course artifacts such as weekly terrarium observations and science journal entries, and group models of the interrelated plant processes occurring within the sealed terraria. A subset of 33 volunteer PTs also completed interviews the week the Plant-in-a-Jar scenario was introduced and approximately 4 months after the instructional intervention ended. Pre- and post-predictions from all PTs as well as interview responses from the subgroup of PTs, were coded into categories based on key plant processes emphasized in the Next Generation Science Standards. Study findings revealed that PTs developed more accurate conceptions of plant processes and their interrelated nature as a result of the instructional intervention. Primary patterns of change in PTs' plant process conceptions included development of more accurate conceptions of how water is used by plants, more accurate conceptions of photosynthesis features, and more accurate conceptions of photosynthesis and cellular respiration as transformative processes.

  5. LDPC-coded MIMO optical communication over the atmospheric turbulence channel using Q-ary pulse-position modulation.

    Science.gov (United States)

    Djordjevic, Ivan B

    2007-08-06

    We describe a coded power-efficient transmission scheme based on the repetition MIMO principle suitable for communication over the atmospheric turbulence channel, and determine its channel capacity. The proposed scheme employs Q-ary pulse-position modulation. We further study how to approach the channel capacity limits using low-density parity-check (LDPC) codes. Component LDPC codes are designed using the concept of pairwise-balanced designs. Contrary to several recent publications, bit-error rates and channel capacities are reported assuming non-ideal photodetection. The atmospheric turbulence channel is modeled using the Gamma-Gamma distribution function due to Al-Habash et al. Excellent bit-error rate performance improvement over the uncoded case is found.
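
    The Gamma-Gamma turbulence model named in the abstract is straightforward to sample: the irradiance is the product of two independent, unit-mean Gamma variates. A small sketch follows (the parameter values are illustrative and not taken from the paper).

      import numpy as np

      def gamma_gamma_irradiance(alpha, beta, size, rng):
          """Sample unit-mean Gamma-Gamma irradiance I = X * Y."""
          x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=size)  # large-scale fading
          y = rng.gamma(shape=beta, scale=1.0 / beta, size=size)    # small-scale fading
          return x * y

      rng = np.random.default_rng(0)
      alpha, beta = 4.0, 1.9               # illustrative moderate-turbulence values
      I = gamma_gamma_irradiance(alpha, beta, 1_000_000, rng)

      si_empirical = I.var() / I.mean() ** 2
      si_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)         # scintillation index
      print(f"scintillation index: empirical {si_empirical:.3f}, theory {si_theory:.3f}")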

  6. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  7. La bonne administration en droit communautaire et le Code européen de bonne conduite administrative

    NARCIS (Netherlands)

    Mendes, J.

    2009-01-01

    The Code of Good Administrative Behaviour is an important source for understanding the principle and concept of good administration in European administrative law, since it encompasses certain aspects that tend to be overlooked by the case law of the European Courts and European law scholars.

  8. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  9. Effect of sexed semen on conception rate for Holsteins in the United States

    Science.gov (United States)

    Effect of sexed-semen breedings on conception rate was investigated using US Holstein field data from January 2006 through October 2008. Sexed-semen breeding status was determined by a National Association of Animal Breeders’ 500-series marketing code or by individual breeding information in a cow o...

  10. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
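
    The subband-adaptive idea can be sketched as a simple threshold lookup: each subband's SNR selects a (modulation, turbo-code-rate) pair, and the sum over subbands gives the instantaneous throughput. The thresholds below are placeholders chosen for illustration, not the values used in the paper.

      # (min_snr_dB, name, bits_per_symbol, code_rate) -- threshold values are illustrative only.
      MODE_TABLE = [
          (0.0,  "BPSK  r=1/3",  1, 1 / 3),
          (5.0,  "QPSK  r=1/2",  2, 1 / 2),
          (11.0, "8AMPM r=1/2",  3, 1 / 2),
          (16.0, "16QAM r=1/2",  4, 1 / 2),
          (22.0, "64QAM r=1/2",  6, 1 / 2),
      ]

      def select_mode(snr_db):
          """Pick the highest-throughput mode whose SNR threshold is met (None = no transmission)."""
          chosen = None
          for threshold, name, bits, rate in MODE_TABLE:
              if snr_db >= threshold:
                  chosen = (name, bits, rate)
          return chosen

      subband_snrs_db = [3.2, 12.7, 18.4, -1.0, 24.9]   # example per-subband SNR estimates
      throughput = 0.0
      for i, snr in enumerate(subband_snrs_db):
          mode = select_mode(snr)
          if mode is None:
              print(f"subband {i}: SNR {snr:5.1f} dB -> no transmission")
              continue
          name, bits, rate = mode
          throughput += bits * rate
          print(f"subband {i}: SNR {snr:5.1f} dB -> {name}")
      print(f"throughput: {throughput:.2f} information bits per subcarrier use")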

  11. Quantum Codes From Cyclic Codes Over The Ring R 2

    International Nuclear Information System (INIS)

    Altinel, Alev; Güzeltepe, Murat

    2016-01-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ² = 0, υ² = 0, w² = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 to contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2. (paper)

  12. Consideration of creep in design rules of AFCEN RCC-MRx 2012 code

    International Nuclear Information System (INIS)

    Lebarbe, T.; Petesch, C.; Lejeail, Y.; Lamagnere, P.; Dubiez-Le Goff, S.

    2014-01-01

    The 2012 edition of the RCC-MRx Code has been issued in French and English versions by AFCEN (Association Francaise pour les regles de Conception et de Construction des Materiels des Chaudieres Electro-nucleaires). This Code is the result of the merger of the RCC-MX 2008, developed in the context of the Jules Horowitz research reactor project, into the RCC-MR 2007, which set up rules applicable to the design of components operating at high temperature and to the Vacuum Vessel of ITER. This new edition is also the opportunity to publish the background of the rules. This paper is one illustration of what such a document may be, based on a dedicated example: the creep rules. It contains an overview of the design rules associated with creep damage and explains the purpose and the origins of these rules. This type of exercise is going to be generalized to all parts of the code in AFCEN technical publications (the criteria). (authors)

  13. Spiritual nursing care: A concept analysis.

    Science.gov (United States)

    Monareng, Lydia V

    2012-10-08

    Although the concept 'spiritual nursing care' has its roots in the history of the nursing profession, many nurses in practice have difficulty integrating the concept into practice. There is an ongoing debate in the empirical literature about its definition, clarity and application in nursing practice. The study aimed to develop an operational definition of the concept and its application in clinical practice. A qualitative study was conducted to explore and describe how professional nurses render spiritual nursing care. A purposive sampling method was used to recruit the sample. Individual and focus group interviews were audio-taped and transcribed verbatim. Trustworthiness was ensured through strategies of truth value, applicability, consistency and neutrality. Data were analysed using the NUD*IST power version 4 software, constant comparison, open, axial and selective coding. Tesch's eight steps of analysis were also used, which led to the emergence of themes, categories and sub-categories. Concept analysis was conducted through a comprehensive literature review and as a result 'caring presence' was identified as the core variable from which all the other characteristics of spiritual nursing care arise. An operational definition of spiritual nursing care based on the findings was that humane care is demonstrated by showing caring presence, respect and concern for meeting the needs not only of the body and mind of patients, but also their spiritual needs of hope and meaning in the midst of health crisis, which demand equal attention for optimal care from both religious and nonreligious nurses.

  14. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes

    Science.gov (United States)

    Lin, Shu

    1998-01-01

    A code trellis is a graphical representation of a code, block or convolutional, in which every path represents a codeword (or a code sequence for a convolutional code). This representation makes it possible to implement Maximum Likelihood Decoding (MLD) of a code with reduced decoding complexity. The most well known trellis-based MLD algorithm is the Viterbi algorithm. The trellis representation was first introduced and used for convolutional codes [23]. This representation, together with the Viterbi decoding algorithm, has resulted in a wide range of applications of convolutional codes for error control in digital communications over the last two decades. For a long time, however, research on the trellis structure of block codes remained largely inactive. There are two major reasons for this inactive period of research in this area. First, most coding theorists at that time believed that block codes did not have simple trellis structure like convolutional codes and maximum likelihood decoding of linear block codes using the Viterbi algorithm was practically impossible, except for very short block codes. Second, since almost all of the linear block codes are constructed algebraically or based on finite geometries, it was the belief of many coding theorists that algebraic decoding was the only way to decode these codes. These two reasons seriously hindered the development of efficient soft-decision decoding methods for linear block codes and their applications to error control in digital communications. This led to a general belief that block codes are inferior to convolutional codes and hence, that they were not useful. Chapter 2 gives a brief review of linear block codes. The goal is to provide the essential background material for the development of trellis structure and trellis-based decoding algorithms for linear block codes in the later chapters. Chapters 3 through 6 present the fundamental concepts, finite-state machine model, state space formulation, basic structural properties, state labeling, construction procedures, complexity, minimality, and
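
    To make the trellis idea concrete, the following sketch implements hard-decision Viterbi decoding for the textbook rate-1/2, constraint-length-3 convolutional code with octal generators (7, 5). It is a generic illustration, not code from the report.

      G = (0b111, 0b101)          # generators (7, 5) octal, constraint length K = 3
      N_STATES = 4                # state = the two most recent input bits

      def step(state, bit):
          """One encoder step: return (next_state, (out0, out1))."""
          reg = (bit << 2) | state                         # register = [bit, s1, s0]
          out = tuple(bin(reg & g).count("1") & 1 for g in G)
          return (reg >> 1), out                           # keep the two newest bits as the state

      def encode(bits):
          state, out = 0, []
          for b in bits:
              state, o = step(state, b)
              out.extend(o)
          return out

      def viterbi_decode(received):
          """Hard-decision Viterbi decoding with Hamming branch metrics."""
          n, INF = len(received) // 2, 10 ** 9
          metric = [0] + [INF] * (N_STATES - 1)            # start in the all-zero state
          paths = [[] for _ in range(N_STATES)]
          for t in range(n):
              r = received[2 * t: 2 * t + 2]
              new_metric = [INF] * N_STATES
              new_paths = [None] * N_STATES
              for s in range(N_STATES):
                  if metric[s] == INF:
                      continue
                  for b in (0, 1):
                      ns, out = step(s, b)
                      m = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                      if m < new_metric[ns]:
                          new_metric[ns], new_paths[ns] = m, paths[s] + [b]
              metric, paths = new_metric, new_paths
          best = min(range(N_STATES), key=lambda s: metric[s])
          return paths[best]

      msg = [1, 0, 1, 1, 0, 0, 1, 0]
      code = encode(msg)
      code[3] ^= 1                                         # inject one channel error
      decoded = viterbi_decode(code)
      print("decoded:", decoded, "ok:", decoded == msg)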

  15. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it patches subsequences with “0s” leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10−9 was achieved.
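
    Code performance in such OCDMA schemes is governed by the auto- and cross-correlation of the unipolar (0/1) spreading sequences. The generic check below uses illustrative sequences, not the extended grouped new modified prime code proposed in the paper, to show how those figures are computed.

      import numpy as np

      def periodic_correlation(a, b):
          """Periodic correlation of two equal-length 0/1 sequences over all cyclic shifts."""
          a, b = np.asarray(a), np.asarray(b)
          return np.array([np.sum(a * np.roll(b, s)) for s in range(len(a))])

      # Two illustrative unipolar sequences (not the code proposed in the paper).
      c1 = [1, 0, 0, 1, 0, 1, 0, 0, 0]
      c2 = [0, 1, 0, 0, 1, 0, 0, 0, 1]

      auto = periodic_correlation(c1, c1)
      cross = periodic_correlation(c1, c2)
      print("code weight:", sum(c1))
      print("max out-of-phase autocorrelation:", auto[1:].max())
      print("max cross-correlation:", cross.max())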

  16. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    Science.gov (United States)

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  17. On Predictive Coding for Erasure Channels Using a Kalman Framework

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Murthi, Manohar; Andersen, Søren Vang

    2009-01-01

    We present a new design method for robust low-delay coding of autoregressive sources for transmission across erasure channels. It is a fundamental rethinking of existing concepts. It considers the encoder a mechanism that produces signal measurements from which the decoder estimates the original signal. The method is based on linear predictive coding and Kalman estimation at the decoder. We employ a novel encoder state-space representation with a linear quantization noise model. The encoder is represented by the Kalman measurement at the decoder. The presented method designs the encoder and decoder offline through an iterative algorithm based on closed-form minimization of the trace of the decoder state error covariance. The design method is shown to provide considerable performance gains, when the transmitted quantized prediction errors are subject to loss, in terms of signal-to-noise ratio...
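
    A much-simplified sketch of the idea — not the authors' design — is given below: an AR(1) source is DPCM-encoded, the decoder treats each received quantized innovation as a noisy measurement inside a scalar Kalman filter, and on an erasure it falls back to the time update alone. For simplicity the sketch assumes the encoder knows the erasure pattern (e.g., via feedback), so encoder and decoder predictors stay synchronized; that assumption is mine, not the paper's.

      import numpy as np

      rng = np.random.default_rng(1)
      a, q_var, step = 0.95, 1.0, 0.5          # AR(1) coefficient, process noise, quantizer step
      r_var = step ** 2 / 12                   # quantization noise modeled as measurement noise
      n, p_loss = 2000, 0.2

      x = np.zeros(n)                          # AR(1) source realization
      for k in range(1, n):
          x[k] = a * x[k - 1] + rng.normal(scale=np.sqrt(q_var))

      x_hat, P = 0.0, 1.0                      # shared Kalman state (assumed synchronized)
      rec = np.zeros(n)
      for k in range(n):
          # Time update (prediction), identical at encoder and decoder.
          x_pred = a * x_hat
          P = a * a * P + q_var
          if rng.random() < p_loss:            # packet erased: decoder keeps the prediction
              x_hat = x_pred
              rec[k] = x_hat
              continue
          # Encoder quantizes the innovation; decoder sees z = x_pred + q = x[k] + quantization noise.
          q = step * np.round((x[k] - x_pred) / step)
          z = x_pred + q
          K = P / (P + r_var)                  # Kalman measurement update
          x_hat = x_pred + K * (z - x_pred)
          P = (1 - K) * P
          rec[k] = x_hat

      snr_db = 10 * np.log10(np.var(x) / np.mean((x - rec) ** 2))
      print(f"reconstruction SNR with {p_loss:.0%} erasures: {snr_db:.1f} dB")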

  18. DarcyTools, Version 2.1. Concepts, methods, equations and demo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban; Kuylenstierna, Hans-Olof [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden); Ferry, Michel [MFRDC, Orvault (France)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock and the porous medium the soil cover on top of the rock; the class of flows addressed is hence groundwater flow. DarcyTools is a general code for this class of problems, but the analysis of a repository for nuclear waste is the main intended application. A number of novel features are introduced in DarcyTools. The most fundamental is perhaps the method to generate grid properties (DarcyTools is a continuum porous-media code); a fracture network, with properties given to each fracture, is represented 'directly' in the computational grid. This method is believed to result in very accurate anisotropy and connectivity properties. The report focuses on the concepts, assumptions, equations and key features of DarcyTools. The main part of the report is fairly short; a number of appendices give more detailed accounts of various aspects of the code.

  19. DarcyTools, Version 2.1. Concepts, methods, equations and demo simulations

    International Nuclear Information System (INIS)

    Svensson, Urban; Kuylenstierna, Hans-Olof; Ferry, Michel

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock and the porous medium the soil cover on top of the rock; the class of flows addressed is hence groundwater flow. DarcyTools is a general code for this class of problems, but the analysis of a repository for nuclear waste is the main intended application. A number of novel features are introduced in DarcyTools. The most fundamental is perhaps the method to generate grid properties (DarcyTools is a continuum porous-media code); a fracture network, with properties given to each fracture, is represented 'directly' in the computational grid. This method is believed to result in very accurate anisotropy and connectivity properties. The report focuses on the concepts, assumptions, equations and key features of DarcyTools. The main part of the report is fairly short; a number of appendices give more detailed accounts of various aspects of the code.

  20. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes.
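
    The essence of such a coupling, whatever the codes involved, is a fixed-point exchange between a neutronics solution and a thermal-hydraulics solution until the feedback converges. The toy Picard iteration below uses mock single-channel models; all numbers and feedback laws are invented stand-ins for the real MASTER and RETRAN solvers.

      def mock_neutronics(T_fuel):
          """Toy 'core kinetics': power falls linearly with fuel temperature (Doppler-like feedback)."""
          alpha_doppler = -2.0e-3                           # illustrative feedback strength
          return 1000.0 + alpha_doppler * 1000.0 * (T_fuel - 600.0)   # kW, nominal at 600 K

      def mock_thermal_hydraulics(power_kw):
          """Toy 'system T/H': fuel temperature rises linearly with power."""
          return 560.0 + 0.05 * power_kw                    # K, illustrative coolant + fuel rise

      # Picard (fixed-point) iteration exchanging power and fuel temperature.
      power, T_fuel = 1000.0, 600.0
      for it in range(1, 51):
          new_power = mock_neutronics(T_fuel)
          new_T = mock_thermal_hydraulics(new_power)
          change = max(abs(new_power - power) / power, abs(new_T - T_fuel) / T_fuel)
          power, T_fuel = new_power, new_T
          print(f"iter {it}: power = {power:8.2f} kW, T_fuel = {T_fuel:6.1f} K")
          if change < 1e-6:
              break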

  1. Moral competence among nurses in Malawi: A concept analysis approach.

    Science.gov (United States)

    Maluwa, Veronica Mary; Gwaza, Elizabeth; Sakala, Betty; Kapito, Esnath; Mwale, Ruth; Haruzivishe, Clara; Chirwa, Ellen

    2018-01-01

    Nurses are expected to provide comprehensive, holistic and ethically accepted care according to their code of ethics and practice. However, in Malawi, this is not always the case. This article analyses the concept of moral competence using Walker and Avant's strategy of concept analysis. The aim of this article is to analyse the concept of moral competence in relation to nursing practice and to determine the defining attributes, antecedents and consequences of moral competence in nursing practice. The analysis was done using Walker and Avant's strategy of concept analysis. Deductive analysis was used to find the defining attributes of moral competence, which were kindness, compassion, caring, critical thinking, ethical decision-making ability, problem solving, responsibility, discipline, accountability, communication, solidarity, honesty, and respect for human values, dignity and rights. The identified antecedents were personal, cultural and religious values; nursing ethics training; environment; and guidance. The consequences of moral competence are team-work spirit, effective communication, improved performance and positive attitudes in providing nursing care. Moral competence can therefore be used as a tool to improve care in nursing practice so as to meet patients' problems and needs and consequently increase the public's satisfaction in Malawi.

  2. Concept Mapping zur Unterstützung der differentialdiagnostischen Hypothesenbildung im fallbasierten Online-Lernsystem CASUS: Qualitative Verbesserung der Diagnosefindung durch ICD-10 Kodierung [Concept mapping for supporting the differential diagnostic generation of hypotheses in the case-based online learning system CASUS: Qualitative improvement of diagnostic performance through ICD-10 coding]

    Directory of Open Access Journals (Sweden)

    Kernt, Marcus

    2008-08-01

    [english] Introduction: Concept mapping tools have long been established in medical education as an aid for visualizing learning processes in computer-based programs. The case-based learning system CASUS with its mapping tool for visualizing the differential diagnostic reasoning process is an example. It was shown that such tools are well accepted by users and lead to an increased number of diagnostic hypotheses being visualized as maps. However, there is scarce evidence on the quality of user-generated diagnostic hypotheses. This study examines the quality of diagnostic hypotheses obtained with CASUS and whether the quality can be improved through ICD-10 coding as compared with an expert's solution. Methods: We randomized 192 third-year medical students at the University of Munich into two groups. The students worked in groups of two on one computer. Group A was asked to code their diagnostic hypotheses with an ICD-10 coding browser before entering them into the mapping tool. Group B generated their hypotheses without prior ICD-10 coding. The differential diagnostic reasoning visualizations were analyzed quantitatively and qualitatively. An expert solution was used as reference. Results: Eighty-seven differential diagnoses were evaluated. Group A, using ICD-10 coding, made the correct and precise diagnosis of malaria tropica significantly more often than Group B (p < 0.05). For additional alternative diagnostic hypotheses, no quantitative or qualitative differences were detected. Conclusions: ICD-10 coding in connection with a mapping tool supporting the diagnostic reasoning process improved the accuracy of diagnostic performance in third-year medical students in the case of malaria tropica. [german, translated] Introduction: The use of concept mapping tools in computer-based learning programs is well established in medical education: it has been shown that these tools for visualizing differential diagnoses are well accepted by users.

  3. Concepts of Integration for UAS Operations in the NAS

    Science.gov (United States)

    Consiglio, Maria C.; Chamberlain, James P.; Munoz, Cesar A.; Hoffler, Keith D.

    2012-01-01

    One of the major challenges facing the integration of Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) is the lack of an onboard pilot that can comply with the legal requirement identified in the US Code of Federal Regulations (CFR) that pilots see and avoid other aircraft. UAS will be expected to demonstrate the means to perform the function of see and avoid while preserving the safety level of the airspace and the efficiency of the air traffic system. This paper introduces a Sense and Avoid (SAA) concept for integration of UAS into the NAS that is currently being developed by the National Aeronautics and Space Administration (NASA) and identifies areas that require additional experimental evaluation to further inform various elements of the concept. The concept design rests on interoperability principles that take into account both the Air Traffic Control (ATC) environment as well as existing systems such as the Traffic Alert and Collision Avoidance System (TCAS). Specifically, the concept addresses the determination of well clear values that are large enough to avoid issuance of TCAS corrective Resolution Advisories, undue concern by pilots of proximate aircraft and issuance of controller traffic alerts. The concept also addresses appropriate declaration times for projected losses of well clear conditions and maneuvers to regain well clear separation.

  4. Heat Transfer Behaviour and Thermohydraulics Code Testing for Supercritical Water Cooled Reactors (SCWRs)

    International Nuclear Information System (INIS)

    2014-08-01

    The supercritical water cooled reactor (SCWR) is an innovative water cooled reactor concept which uses water pressurized above its thermodynamic critical pressure as the reactor coolant. This concept offers high thermal efficiencies and a simplified reactor system, and is hence expected to help to improve economic competitiveness. Various kinds of SCWR concepts have been developed, with varying combinations of reactor type (pressure vessel or pressure tube) and core spectrum (thermal, fast or mixed). There is great interest in both developing and developed countries in the research and development (R&D) and conceptual design of SCWRs. Considering the high interest shown in a number of Member States, the IAEA established in 2008 the Coordinated Research Project (CRP) on Heat Transfer Behaviour and Thermo-hydraulics Code Testing for SCWRs. The aim was to foster international collaboration in the R&D of SCWRs in support of Member States’ efforts and under the auspices of the IAEA Nuclear Energy Department’s Technical Working Groups on Advanced Technologies for Light Water Reactors (TWG-LWR) and Heavy Water Reactors (TWG-HWR). The two key objectives of the CRP were to establish accurate databases on the thermohydraulics of supercritical pressure fluids and to test analysis methods for SCWR thermohydraulic behaviour to identify code development needs. In total, 16 institutes from nine Member States and two international organizations were involved in the CRP. The thermohydraulics phenomena investigated in the CRP included heat transfer and pressure loss characteristics of supercritical pressure fluids, development of new heat transfer prediction methods, critical flow during depressurization from supercritical conditions, flow stability and natural circulation in supercritical pressure systems. Two code testing benchmark exercises were performed for steady state heat transfer and flow stability in a heated channel. The CRP was completed with the planned outputs in

  5. Heat Transfer Behaviour and Thermohydraulics Code Testing for Supercritical Water Cooled Reactors (SCWRs)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-08-15

    The supercritical water cooled reactor (SCWR) is an innovative water cooled reactor concept which uses water pressurized above its thermodynamic critical pressure as the reactor coolant. This concept offers high thermal efficiencies and a simplified reactor system, and is hence expected to help to improve economic competitiveness. Various kinds of SCWR concepts have been developed, with varying combinations of reactor type (pressure vessel or pressure tube) and core spectrum (thermal, fast or mixed). There is great interest in both developing and developed countries in the research and development (R&D) and conceptual design of SCWRs. Considering the high interest shown in a number of Member States, the IAEA established in 2008 the Coordinated Research Project (CRP) on Heat Transfer Behaviour and Thermo-hydraulics Code Testing for SCWRs. The aim was to foster international collaboration in the R&D of SCWRs in support of Member States’ efforts and under the auspices of the IAEA Nuclear Energy Department’s Technical Working Groups on Advanced Technologies for Light Water Reactors (TWG-LWR) and Heavy Water Reactors (TWG-HWR). The two key objectives of the CRP were to establish accurate databases on the thermohydraulics of supercritical pressure fluids and to test analysis methods for SCWR thermohydraulic behaviour to identify code development needs. In total, 16 institutes from nine Member States and two international organizations were involved in the CRP. The thermohydraulics phenomena investigated in the CRP included heat transfer and pressure loss characteristics of supercritical pressure fluids, development of new heat transfer prediction methods, critical flow during depressurization from supercritical conditions, flow stability and natural circulation in supercritical pressure systems. Two code testing benchmark exercises were performed for steady state heat transfer and flow stability in a heated channel. The CRP was completed with the planned outputs in

  6. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
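
    Generating such a code takes only a few lines. The sketch below assumes the third-party Python package qrcode (installable as "qrcode[pil]"), which is not mentioned in the article and is used here purely as an illustration.

      import qrcode  # third-party package: pip install "qrcode[pil]"

      # Encode a URL -- far beyond the roughly 20-digit limit of a linear bar code.
      payload = "https://example.org/catalog/item?id=12345&lot=A7"
      img = qrcode.make(payload)      # builds the QR matrix and renders it as an image
      img.save("item_tag.png")
      print("wrote item_tag.png,", len(payload), "characters encoded")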

  7. Collaboration, Coproduction, and Code-Switching: Colonial Cinema and Postcolonial Archaeology

    Directory of Open Access Journals (Sweden)

    Nayoung Aimee Kwon

    2012-12-01

    This article reassesses the issue of colonial collaboration in the Japanese empire by examining the rise of cinematic coproductions between Japanese and Korean filmmakers. By the late 1930s, colonial Korea’s filmmaking industry had been fully subsumed into the Japanese film industry, and regulations were established that required all films to assimilate imperial policies. The colonial government’s active promotion of colonial “collaboration” and “coproduction” between the colonizers and the colonized ideologically worked to obfuscate these increasing restrictions in colonial film productions while producing complex and contentious desires across the colonial divide. The very concepts of “collaboration” and “coproduction” need to be redefined in light of increasingly complex imperial hierarchies and entanglements. Taking the concept of “code-switching” beyond its linguistic origins, this article argues that we must reassess texts of colonial collaboration and coproduction produced at a time when Korean film had to “code-switch” into Japanese—to linguistically, culturally, and politically align itself with the wartime empire. The article argues that recently excavated films from colonial and Cold War archives, such as Spring in the Korean Peninsula, offer a rare glimpse into repressed and contested histories and raise the broader conundrum of accessing and assessing uneasily commingled colonial pasts of Asian-Pacific nations in the ruins of postcolonial aftermath.

  8. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness.

    Science.gov (United States)

    Graneheim, U H; Lundman, B

    2004-02-01

    Qualitative content analysis as described in published literature shows conflicting opinions and unsolved issues regarding meaning and use of concepts, procedures and interpretation. This paper provides an overview of important concepts (manifest and latent content, unit of analysis, meaning unit, condensation, abstraction, content area, code, category and theme) related to qualitative content analysis; illustrates the use of concepts related to the research procedure; and proposes measures to achieve trustworthiness (credibility, dependability and transferability) throughout the steps of the research procedure. Interpretation in qualitative content analysis is discussed in light of Watzlawick et al.'s [Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies and Paradoxes. W.W. Norton & Company, New York, London] theory of communication.

  9. Plasma chemistry for concept of sustainable development

    International Nuclear Information System (INIS)

    Chernyak, V.Yu.; Nedybaliuk, O.A.; Tsymbaliuk, O.M.; Fedirchuk, I.I.; Chunikhina, K.I.; Martysh, E.V.; Iukhimenko, V.V.; Veremii, Iu.P.; Prisyazhnevych, I.V.; Prysiazhna, O.V.

    2016-01-01

    This work is devoted to the exploration of the compatibility of the hybrid plasma-catalytic conversion of liquid hydrocarbons into syngas with the concept of sustainable development. The results of the experimental investigations indicate the high efficiency of plasma-catalytic conversion of ethanol to syngas and the small amount of waste (a few percent of feedstock weight). The results of the simulation of the kinetics using ZDPlasKin code for traditional thermochemical and hybrid plasma-catalytic conversions indicate some differences in their mechanisms, which lead to the significant changes in the syngas ratio.

  10. Some Families of Asymmetric Quantum MDS Codes Constructed from Constacyclic Codes

    Science.gov (United States)

    Huang, Yuanyuan; Chen, Jianzhang; Feng, Chunhui; Chen, Riqing

    2018-02-01

    Quantum maximal-distance-separable (MDS) codes that satisfy quantum Singleton bound with different lengths have been constructed by some researchers. In this paper, seven families of asymmetric quantum MDS codes are constructed by using constacyclic codes. We weaken the case of Hermitian-dual containing codes that can be applied to construct asymmetric quantum MDS codes with parameters [[n,k,dz/dx

  11. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  12. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...
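
    For readers unfamiliar with parity-check decoding, the toy sketch below shows the hard-decision bit-flipping idea on a (7,4) Hamming parity-check matrix. The LDPCA codes and soft sum-product decoding used in DVC are considerably more elaborate, so this is only a conceptual stand-in.

      import numpy as np

      # Parity-check matrix of the (7,4) Hamming code -- a tiny, dense stand-in for a real LDPC(A) code.
      H = np.array([[1, 1, 0, 1, 1, 0, 0],
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]], dtype=np.uint8)

      def bit_flip_decode(r, H, max_iter=20):
          """Hard-decision bit flipping: repeatedly flip the bit involved in the most unsatisfied checks."""
          r = r.copy()
          for _ in range(max_iter):
              syndrome = (H @ r) % 2
              if not syndrome.any():
                  return r, True                          # all parity checks satisfied
              unsat = syndrome.astype(int) @ H            # per-bit count of failed checks
              r[int(np.argmax(unsat))] ^= 1               # flip the worst offender
          return r, False

      codeword = np.array([1, 0, 1, 1, 0, 1, 0], dtype=np.uint8)
      assert not ((H @ codeword) % 2).any()               # sanity check: a valid codeword

      received = codeword.copy()
      received[0] ^= 1                                    # a single error this toy decoder can handle
      decoded, ok = bit_flip_decode(received, H)
      print("recovered:", ok and np.array_equal(decoded, codeword))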

  13. Notions and conceptions of parenting and family

    Directory of Open Access Journals (Sweden)

    José Francisco Martínez Licona

    2017-12-01

    Objective: To show the main conceptions that parents hold about the family, in a city in northeastern Mexico. Methods: A mixed-methods study with a dominant qualitative component, carried out from the mediational perspective of psychology with 1000 parents in the city of San Luis Potosí. The data were coded and grouped by similarity, which gave rise to thematic categories. Results: Different axes of rationality were found with which parents conceive the family, its social function, the problems being experienced and the easy or difficult aspects of parenting; these were related to the biographic data of the groups of families. Conclusions: There is no homogeneous conception of what a family should be; likewise, differences were found between the thinking of young families and that of families in middle and late adulthood.

  14. Self-shielding models of MICROX-2 code: Review and updates

    International Nuclear Information System (INIS)

    Hou, J.; Choi, H.; Ivanov, K.N.

    2014-01-01

    Highlights: • The MICROX-2 code has been improved to expand its application to advanced reactors. • New fine-group cross section libraries based on ENDF/B-VII have been generated. • Resonance self-shielding and spatial self-shielding models have been improved. • The improvements were assessed by a series of benchmark calculations against MCNPX. - Abstract: The MICROX-2 is a transport theory code that solves for the neutron slowing-down and thermalization equations of a two-region lattice cell. The MICROX-2 code has been updated to expand its application to advanced reactor concepts and fuel cycle simulations, including generation of new fine-group cross section libraries based on ENDF/B-VII. In continuation of previous work, the MICROX-2 methods are reviewed and updated in this study, focusing on its resonance self-shielding and spatial self-shielding models for neutron spectrum calculations. The improvement of self-shielding method was assessed by a series of benchmark calculations against the Monte Carlo code, using homogeneous and heterogeneous pin cell models. The results have shown that the implementation of the updated self-shielding models is correct and the accuracy of physics calculation is improved. Compared to the existing models, the updates reduced the prediction error of the infinite multiplication factor by ∼0.1% and ∼0.2% for the homogeneous and heterogeneous pin cell models, respectively, considered in this study

  15. DOZIM - evaluation dose code for nuclear accident

    International Nuclear Information System (INIS)

    Oprea, I.; Musat, D.; Ionita, I.

    2008-01-01

    During a nuclear accident, an environmentally significant release of fission products can happen. In that case it is not possible to determine precisely the air concentration of fission products and, consequently, the estimated doses will be affected by certain errors. The stringent requirement to cope with a nuclear accident, even a minor one, imposes the creation of a computation method for emergency dosimetric evaluations, needed to compare the measurement data to certain previously established reference levels. These comparisons will allow a qualified option regarding the actions necessary to diminish the accident effects. The DOZIM code estimates the soil contamination and the irradiation doses produced either by the radioactive plume or by soil contamination. Irradiation either of the whole body or of certain organs, as well as internal contamination doses produced by isotope inhalation during the radioactive plume crossing, is taken into account. The calculation considers neither the internal contamination produced by consumption of contaminated food nor that produced by resuspension of radioactive deposits. The code is recommended for dose computation along the wind direction, at distances from 10² to 2 x 10⁴ m. The DOZIM code was utilized for three different cases: - In-air TRIGA-SSR fuel bundle destruction, with different input data for the fractions of fission products released into the environment; - Dose estimation for a Chernobyl-like accident; - Determination of intervention areas for a hypothetical severe accident at the Cernavoda Nuclear Power Plant. For the first case, input data and results (for a 60 m emission height without iodine retention on active-coal filters) are presented. To summarize, the DOZIM code conception allows dose estimation for any nuclear accident. The fission product inventory, released fractions, emission conditions, and atmospheric and geographical parameters are the input data. Dosimetric factors are included in the program. The program is in FORTRAN IV language and was run on

  16. Performance and Complexity of Tunable Sparse Network Coding with Gradual Growing Tuning Functions over Wireless Networks

    DEFF Research Database (Denmark)

    Garrido, Pablo; Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational cost. In addition, it would be difficult to implement, due to the feedback delay. In this work...
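
    The density knob at the heart of TSNC can be illustrated over GF(2): each coded packet XORs a random subset of the generation, and the subset size (density) trades encoding/decoding cost against how quickly the received coefficient matrix reaches full rank. The toy experiment below is a hedged illustration, not the tuning functions studied in the paper.

      import numpy as np

      def gf2_rank(rows):
          """Rank over GF(2) of a list of 0/1 coefficient vectors (Gaussian elimination)."""
          mat = [row.copy() for row in rows]
          rank, n = 0, len(mat[0]) if mat else 0
          for col in range(n):
              pivot = next((i for i in range(rank, len(mat)) if mat[i][col]), None)
              if pivot is None:
                  continue
              mat[rank], mat[pivot] = mat[pivot], mat[rank]
              for i in range(len(mat)):
                  if i != rank and mat[i][col]:
                      mat[i] ^= mat[rank]
              rank += 1
          return rank

      def coded_packets_needed(generation_size, density, rng):
          """Count coded packets drawn until the GF(2) coefficient matrix has full rank."""
          rows, sent = [], 0
          while gf2_rank(rows) < generation_size:
              coeffs = (rng.random(generation_size) < density).astype(np.uint8)
              if coeffs.any():                     # an all-zero vector carries no information
                  rows.append(coeffs)
              sent += 1
          return sent

      rng = np.random.default_rng(7)
      G = 32                                       # generation size (illustrative)
      for density in (0.05, 0.1, 0.3, 0.5):
          trials = [coded_packets_needed(G, density, rng) for _ in range(20)]
          print(f"density {density:.2f}: avg {np.mean(trials):5.1f} coded packets for {G} source packets")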

  17. Design and implementation of safety traceability system for candied fruits based on two-dimension code technology

    Directory of Open Access Journals (Sweden)

    ZHAO Kun

    2014-12-01

    Traceability is a basic principle of food safety. A food safety traceability system based on QR code and cloud computing technology is introduced in this paper. First of all, we introduce QR code technology and the concept of traceability. Then, through a field investigation, we analyse the traceability process. At the same time, we design the system and its database and study the consumer-experience technology. Finally, we expound the collection, transmission and final presentation style of the traceability information and consider the expected future development of the traceability system.

  18. Review of finite fields: Applications to discrete Fourier transforms and Reed-Solomon coding

    Science.gov (United States)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
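
    As a minimal taste of the material, the sketch below performs a discrete Fourier transform directly over a prime field, GF(17), using an element whose multiplicative order equals the transform length — the same construction that underlies Reed-Solomon encoding. The specific numbers are just a convenient small example, not taken from the report.

      P = 17                       # prime field GF(17)
      N = 4                        # transform length; we need an element of order N
      OMEGA = 4                    # 4^4 = 256 = 15*17 + 1, so 4 has order 4 modulo 17

      def ff_dft(x, omega=OMEGA, p=P):
          """Finite-field DFT: X[k] = sum_j x[j] * omega^(j*k) mod p."""
          n = len(x)
          return [sum(x[j] * pow(omega, j * k, p) for j in range(n)) % p for k in range(n)]

      def ff_idft(X, omega=OMEGA, p=P):
          """Inverse transform: x[j] = n^{-1} * sum_k X[k] * omega^(-j*k) mod p."""
          n = len(X)
          n_inv = pow(n, -1, p)                # modular inverse of n (Python 3.8+)
          omega_inv = pow(omega, -1, p)
          return [n_inv * sum(X[k] * pow(omega_inv, j * k, p) for k in range(n)) % p for j in range(n)]

      x = [3, 0, 12, 7]                        # symbols in GF(17)
      X = ff_dft(x)
      assert ff_idft(X) == x                   # perfect reconstruction, no rounding error
      print("x =", x, "-> X =", X)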

  19. Academic freedom, analysis, and the Code of Professional Conduct.

    Science.gov (United States)

    Snelling, Paul C; Lipscomb, Martin

    2004-11-01

    Despite nursing's move into higher education, academic freedom has received little attention within the literature. After discussing the concept of academic freedom, this paper argues that there is a potential tension between academic freedom and the requirement to educate student nurses who are fit for practice. One way in which this tension might be revealed is in the marking of student assignments. We ask the question--how should nurse educators mark an essay which is sufficiently analytical but reaches moral conclusions that lie outside the Code of Professional Conduct? We argue that despite an understandable temptation to penalise such an essay, invoking the Code of Professional Conduct to do so, no penalty should be applied, and academic freedom for students within higher education should be encouraged. This is because first, academic freedom is a good in itself especially as it allows unconventional and unpalatable conclusions to be discussed and rebutted, and second, applying a penalty on these grounds is necessarily inconsistent.

  20. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  1. Verification of spectral burn-up codes on 2D fuel assemblies of the GFR demonstrator ALLEGRO reactor

    International Nuclear Information System (INIS)

    Čerba, Štefan; Vrban, Branislav; Lüley, Jakub; Dařílek, Petr; Zajac, Radoslav; Nečas, Vladimír; Haščik, Ján

    2014-01-01

    Highlights: • Verification of the MCNPX, HELIOS and SCALE codes. • MOX and ceramic fuel assembly. • Gas-cooled fast reactor. • Burnup calculation. - Abstract: The gas-cooled fast reactor, which is one of the six GEN IV reactor concepts, is characterized by high operational temperatures and a hard neutron spectrum. The utilization of commonly used spectral codes, developed mainly for LWR reactors operated in the thermal/epithermal neutron spectrum, may be connected with systematic deviations since the main development effort of these codes has been focused on the thermal part of the neutron spectrum. To be able to carry out proper calculations for fast systems the used codes have to account for neutron resonances including the self-shielding effect. The presented study aims at verifying the spectral HELIOS, MCNPX and SCALE codes on the basis of depletion calculations of 2D MOX and ceramic fuel assemblies of the ALLEGRO gas-cooled fast reactor demonstrator in infinite lattice

  2. Neural Elements for Predictive Coding

    Directory of Open Access Journals (Sweden)

    Stewart SHIPP

    2016-11-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backwards in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory’ instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forwards and backwards pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit’ and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made

  3. Neural Elements for Predictive Coding.

    Science.gov (United States)

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural

  4. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  5. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.

  6. Deriving consumer-facing disease concepts for family health histories using multi-source sampling.

    Science.gov (United States)

    Hulse, Nathan C; Wood, Grant M; Haug, Peter J; Williams, Marc S

    2010-10-01

    The family health history has long been recognized as an effective way of understanding individuals' susceptibility to familial disease; yet electronic tools to support the capture and use of these data have been characterized as inadequate. As part of an ongoing effort to build patient-facing tools for entering detailed family health histories, we have compiled a set of concepts specific to familial disease using multi-source sampling. These concepts were abstracted by analyzing family health history data patterns in our enterprise data warehouse, collection patterns of consumer personal health records, analyses from the local state health department, a healthcare data dictionary, and concepts derived from genetic-oriented consumer education materials. Collectively, these sources yielded a set of more than 500 unique disease concepts, represented by more than 2500 synonyms for supporting patients in entering coded family health histories. We expect that these concepts will be useful in providing meaningful data and education resources for patients and providers alike.

  7. Neutronic and thermal-hydraulic coupling using Milonga and OpenFOAM codes: an approach using free software

    International Nuclear Information System (INIS)

    Silva, Vitor Vasconcelos Araújo

    2016-01-01

    The development of a fine mesh coupled neutronics/thermal-hydraulics framework built mainly with open source software is presented. The proposed contributions go in two directions: one is the focus on open software development, a concept widespread in many fields of knowledge but rarely explored in nuclear engineering; the second is the use of operating system shared memory as a fast and reliable storage area to couple the computational fluid dynamics (CFD) software OpenFOAM to the free and flexible reactor core analysis code Milonga. This concept was applied to simulate the behavior of the TRIGA Mark 1 IPR-R1 reactor fuel pin in steady-state mode. The macroscopic cross-sections for the model, a set of two-group cross-section data, were generated using the WIMSD-5B code. The results show that this innovative coupled system gives consistent results, encouraging further development of the system and its use for complex nuclear systems. (author)
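
    A minimal sketch of the shared-memory coupling idea, assuming Python's multiprocessing.shared_memory in place of the actual OpenFOAM/Milonga implementation; the field size, process roles, and update rule are hypothetical.

```python
import numpy as np
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory

N_CELLS = 64  # hypothetical number of axial cells in the fuel-pin model

def thermal_hydraulics(shm_name: str, iterations: int) -> None:
    # the "CFD" process attaches to the same buffer by name
    shm = SharedMemory(name=shm_name)
    field = np.ndarray((N_CELLS,), dtype=np.float64, buffer=shm.buf)
    for _ in range(iterations):
        # placeholder update: relax toward an assumed axial temperature profile
        field[:] = 0.5 * field + 0.5 * np.linspace(300.0, 600.0, N_CELLS)
    shm.close()

if __name__ == "__main__":
    shm = SharedMemory(create=True, size=N_CELLS * 8)   # 8 bytes per float64
    temps = np.ndarray((N_CELLS,), dtype=np.float64, buffer=shm.buf)
    temps[:] = 300.0                     # initial guess written by the "neutronics" side
    worker = Process(target=thermal_hydraulics, args=(shm.name, 10))
    worker.start()
    worker.join()
    print("coupled temperature field (first cells):", temps[:4])
    shm.close()
    shm.unlink()
```

    Here one process stands in for the thermal-hydraulics solver while the parent plays the neutronics side; both address the same buffer, so no files or sockets are involved in the exchange.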

  8. Key-equations for list decoding of Reed-Solomon codes and how to solve them

    DEFF Research Database (Denmark)

    Beelen, Peter; Brander, Kristian

    2010-01-01

    A Reed-Solomon code of length n can be list decoded using the well-known Guruswami-Sudan algorithm. By a result of Alekhnovich (2005) the interpolation part in this algorithm can be done in complexity O(s^4l^4nlog^2nloglogn), where l denotes the designed list size and s the multiplicity parameter....... The parameters l and s are sometimes considered to be constants in the complexity analysis, but for high rate Reed-Solomon codes, their values can be very large. In this paper we will combine ideas from Alekhnovich (2005) and the concept of key equations to get an algorithm that has complexity O(sl^4nlog^2...

  9. A FEW ASPECTS REGARDING THE SIMULATION OF CONTRACT IN THE ROMANIAN CIVIL CODE

    Directory of Open Access Journals (Sweden)

    Tudor Vlad RĂDULESCU

    2017-05-01

    Full Text Available The article aims to analyze some key aspects of simulation in contracts, as regulated by the Romanian Civil Code. The process of simulation will be explained based on the provisions of the previous Civil Code, but also with reference to the relevant provisions of the legislation of some European countries. The analysis will focus on the apparent act and also on the secret one, with special emphasis on the intention to simulate, animo simulandi, the key aspect of the matter. The effects of the simulation will also be reviewed, both from the point of view of the parties and from that of third parties, the concept of third parties having a different meaning in this procedure.

  10. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code due to possible unfamiliarity with the language and often due as well to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities which include, among others, error detection, error correction and code modification for purposes of enhancing its performance or functionality, or to adapt it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes, and several examples will be provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  11. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated with the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can easily be adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  12. Conceptual-driven classification for coding advise in health insurance reimbursement.

    Science.gov (United States)

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to enforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and compare the proposed model to traditional classification techniques including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits in
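
    As a generic illustration of the Zipf-style term statistics mentioned above (not the ICD-AS pipeline itself), the following sketch ranks the terms of a discharge-summary-like snippet by frequency; the text and the token pattern are made up.

```python
from collections import Counter
import re

# made-up discharge-summary-like snippet
text = """patient admitted with acute cerebrovascular accident and left sided
weakness; ct head showed infarct; patient started on aspirin; blood pressure
controlled; weakness improving; patient discharged with follow up"""

tokens = re.findall(r"[a-z]+", text.lower())
counts = Counter(tokens)

for rank, (term, freq) in enumerate(counts.most_common(8), start=1):
    # under Zipf's law, freq falls off roughly as 1 / rank
    print(f"rank {rank:2d}  freq {freq:2d}  {term}")
```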

  13. CodeArmor : Virtualizing the Code Space to Counter Disclosure Attacks

    NARCIS (Netherlands)

    Chen, Xi; Bos, Herbert; Giuffrida, Cristiano

    2017-01-01

    Code diversification is an effective strategy to prevent modern code-reuse exploits. Unfortunately, diversification techniques are inherently vulnerable to information disclosure. Recent diversification-aware ROP exploits have demonstrated that code disclosure attacks are a realistic threat, with an

  14. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation) to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems including drinking contaminated groundwater, and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities, and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to easily interface with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.
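
    A generic sketch of what treating an input parameter deterministically or probabilistically can mean in practice, using a placeholder response function and an assumed lognormal parameter distribution; this is not SAGE's transport or biosphere model.

```python
import numpy as np

def response(k_d):
    """Placeholder response function; not SAGE's transport/biosphere model."""
    return 1.0 / (1.0 + 4.0 * k_d)

# deterministic treatment: a single best-estimate parameter value
print("deterministic response:", response(0.5))

# probabilistic treatment: sample the parameter from an assumed distribution
rng = np.random.default_rng(1)
k_d_samples = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=10_000)
responses = response(k_d_samples)
print("probabilistic mean       :", responses.mean())
print("probabilistic 95th perc. :", np.percentile(responses, 95))
```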

  15. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  16. Exploring Students' Conceptions of Science Learning via Drawing: A Cross-Sectional Analysis

    Science.gov (United States)

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2017-01-01

    This cross-sectional study explored students' conceptions of science learning via drawing analysis. A total of 906 Taiwanese students in 4th, 6th, 8th, 10th, and 12th grade were asked to use drawing to illustrate how they conceptualise science learning. Students' drawings were analysed using a coding checklist to determine the presence or absence…

  17. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    Science.gov (United States)

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnosis and surgical procedures is subject to error levels of up to 40% with consequences on distribution of resources and financial recompense. Our aim was to explore and address reasons behind coding errors of shoulder diagnosis and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that only 54 patients (54%) had an entirely correct primary diagnosis assigned, and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2) and the correct procedure code (odds ratio 310.0) than routine coding by the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.

  18. The ASLOTS concept: An interactive, adaptive decision support concept for Final Approach Spacing of Aircraft (FASA). FAA-NASA Joint University Program

    Science.gov (United States)

    Simpson, Robert W.

    1993-01-01

    This presentation outlines a concept for an adaptive, interactive decision support system to assist controllers at a busy airport in achieving efficient use of multiple runways. The concept is being implemented as a computer code called FASA (Final Approach Spacing for Aircraft), and will be tested and demonstrated in ATCSIM, a high fidelity simulation of terminal area airspace and airport surface operations. Objectives are: (1) to provide automated cues to assist controllers in the sequencing and spacing of landing and takeoff aircraft; (2) to provide the controller with a limited ability to modify the sequence and spacings between aircraft, and to insert takeoffs and missed approach aircraft in the landing flows; (3) to increase spacing accuracy using more complex and precise separation criteria while reducing controller workload; and (4) achieve higher operational takeoff and landing rates on multiple runways in poor visibility.

  19. The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions

    Science.gov (United States)

    Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.

    2016-01-01

    A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates various nuclear reaction models needed to describe nucleon, light charged nuclei up to alpha-particle and photon induced reactions. The code is written in the C++ programming language using an object-oriented technology. At first, it was applied to neutron-induced reaction data on actinides, which were compiled into JENDL Actinide File 2008 and JENDL-4.0. It has been extensively used in various nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded to nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system indicating the concept and design of coding and inputs. Details of the formulation for modelings of the direct, pre-equilibrium and compound reactions are presented. Applications to the nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reaction and β-delayed neutron emission are mentioned.

  1. Overall simulation of a HTGR plant with the gas adapted MANTA code

    International Nuclear Information System (INIS)

    Emmanuel Jouet; Dominique Petit; Robert Martin

    2005-01-01

    Full text of publication follows: AREVA's subsidiary Framatome ANP is developing a Very High Temperature Reactor nuclear heat source that can be used for electricity generation as well as cogeneration, including hydrogen production. The selected product has an indirect cycle architecture which is easily adapted to all possible uses of the nuclear heat source. The coupling to the applications is implemented through an Intermediate Heat Exchanger. The system code chosen to calculate the steady-state and transient behaviour of the plant is based on the MANTA code. The flexible and modular MANTA code, originally a system code for all non-LOCA PWR plant transients, has been extended to simulate all forced-convection transients of a nuclear plant with a gas-cooled High Temperature Reactor, including specific core thermal-hydraulic and neutronic models, gas and water/steam turbomachinery, and the control structure. The gas-adapted MANTA code version is now able to model a complete HTGR plant with a direct Brayton cycle as well as indirect cycles. To validate these new developments, a MANTA model of a real plant with a direct Brayton cycle has been built, and steady states and transients have been compared with recorded thermal-hydraulic measurements. Finally, transient calculations of the AREVA indirect-cycle HTR project plant have been compared with the RELAP5 code. Moreover, to improve user-friendliness and allow MANTA to be used as a system conception and design optimization tool as well as a plant simulation tool, a Man-Machine Interface is available. Acronyms: MANTA Modular Advanced Neutronic and Thermal hydraulic Analysis; HTGR High Temperature Gas-Cooled Reactor. (authors)

  2. Numerical analysis on the calandria tubes in the moderator of a heavy water reactor using OpenFOAM and other codes

    International Nuclear Information System (INIS)

    Chang, S.M.; Kim, H.T.

    2013-01-01

    The moderator system of CANDU, a prototypical heavy water reactor, is modeled with a porous-media buoyancy-driven heat-transfer turbulence model. OpenFOAM, a set of C++ classes and libraries developed under the object-oriented concept, is selected as the tool for the numerical analysis. The results from this computational code are compared with experiments and with data from the commercial codes ANSYS-CFX and COMSOL Multiphysics. The three-dimensional code, accounting for buoyancy force, turbulence, and heat transfer, is tested and shown to be successful for the analysis of the thermal-hydraulic system of heavy water reactors. (authors)

  3. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored before.

  4. J simplified assessment for cracked pipes and elbows in the RSE-M code

    International Nuclear Information System (INIS)

    Delliou, P.L.; Sermage, J.-P.; Gilles, P.; Marie, S.; Kayser, Y.; Barthelet, B.

    2005-01-01

    The RSE-M Code provides rules and requirements for in-service inspection of French Pressurized Water Reactor power plant components. Non-mandatory guidance is given in the Code for defect assessment in a wide range of configurations: surface-cracked pipes and elbows under pressure, moment and thermal loading. The Code provides influence coefficients to calculate stress intensity factors in pipes and elbows containing semi-elliptical surface defects (circumferential or longitudinal). The J assessment method is based on the reference stress concept with two options for evaluating the reference load: 'CEP elastic-plastic stress' and 'CLC modified limit load'. This paper presents an overview of all the formulations, notably the case of pipe-to-elbow junctions. The paper also provides a description of the very large database of 2D and 3D elastic-plastic finite element J calculations performed to establish and validate the formulations. Finally, an applicability domain of the methods is given, ensuring a conservative prediction of J. (authors)
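
    For orientation, the generic Ainsworth-type reference stress estimate below illustrates how such methods relate J to the elastic value J_e through a reference stress and strain; it is shown only as an example of the concept and is not necessarily the exact RSE-M expression.

```latex
% Generic reference-stress J-estimate (Ainsworth / R6 Option 2 type), shown only
% to illustrate the concept; not necessarily the exact RSE-M formulation.
\[
  \frac{J}{J_e} \;=\; \frac{E\,\varepsilon_{\mathrm{ref}}}{\sigma_{\mathrm{ref}}}
  \;+\; \frac{L_r^{2}\,\sigma_{\mathrm{ref}}}{2\,E\,\varepsilon_{\mathrm{ref}}},
  \qquad
  L_r = \frac{\sigma_{\mathrm{ref}}}{\sigma_y},
\]
where $\varepsilon_{\mathrm{ref}}$ is the true strain at $\sigma_{\mathrm{ref}}$ on the material stress-strain curve.
```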

  5. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to the coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
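
    As a reminder of why coding at intermediate nodes pays off, here is the classic scalar butterfly-network example over GF(2); it is not one of the paper's algorithms, just the textbook illustration that motivates them.

```python
# two source packets, here just 8-bit integers standing in for payloads
a = 0b10110011
b = 0b01101100

coded = a ^ b            # the bottleneck node forwards the XOR instead of picking one packet

# sink 1 sees (a, coded) and recovers b; sink 2 sees (b, coded) and recovers a
assert (a ^ coded) == b
assert (b ^ coded) == a
print("both sinks recover both packets using a single coded transmission")
```

    In vector network coding the XOR is replaced by multiplication with invertible L x L matrices over a finite field, which is what the paper's algorithms select.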

  6. Code-Mixing and Code Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors influencing those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapters, code mixing and code switching in learning activities at Al Mawaddah Boarding School occur between Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include: identification of the role, the desire to explain and interpret, and sourcing from the original language and its variations or from a foreign language. The factors determining code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah boarding school, regarding the rules and characteristic variations in the language of teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  7. Spiritual nursing care: A concept analysis

    Directory of Open Access Journals (Sweden)

    Lydia V. Monareng

    2012-10-01

    Full Text Available Although the concept ‘spiritual nursing care’ has its roots in the history of the nursing profession, many nurses in practice have difficulty integrating the concept into practice. There is an ongoing debate in the empirical literature about its definition, clarity and application in nursing practice. The study aimed to develop an operational definition of the concept and its application in clinical practice. A qualitative study was conducted to explore and describe how professional nurses render spiritual nursing care. A purposive sampling method was used to recruit the sample. Individual and focus group interviews were audio-taped and transcribed verbatim. Trustworthiness was ensured through strategies of truth value, applicability, consistency and neutrality. Data were analysed using the NUD*IST power version 4 software, constant comparison, open, axial and selective coding. Tech’s eight steps of analysis were also used, which led to the emergence of themes, categories and sub-categories. Concept analysis was conducted through a comprehensive literature review and as a result ‘caring presence’ was identified as the core variable from which all the other characteristics of spiritual nursing care arise. An operational definition of spiritual nursing care based on the findings was that humane care is demonstrated by showing caring presence, respect and concern for meeting the needs not only of the body and mind of patients, but also their spiritual needs of hope and meaning in the midst of health crisis, which demand equal attention for optimal care from both religious and nonreligious nurses.

  8. A theoretical concept for a thermal-hydraulic 3D parallel channel core model

    International Nuclear Information System (INIS)

    Hoeld, A.

    2004-01-01

    A detailed description of the theoretical concept for 3D thermal-hydraulic single- and two-phase flow phenomena is presented. The concept is based on two important development lines: one is the separate treatment of the mass and energy balance equations from the momentum balance equation; the other is the establishment of a procedure for calculating the mass flow distribution into different parallel channels, based on the fact that the sum of the pressure decrease terms over a closed loop must remain zero despite asymmetric perturbations. The concept is realized in the experimental code HERO-X3D, concentrating in a first step on an artificial BWR or PWR core which may consist of a central channel, four quadrants, and a bypass channel. (authors)
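
    A minimal sketch of the flow-split idea for two parallel channels, assuming quadratic channel pressure-drop laws with hypothetical loss coefficients; requiring the pressure decrease around the closed loop formed by the two channels to vanish fixes the split.

```python
from scipy.optimize import brentq

M_TOTAL = 10.0          # total mass flow, kg/s (hypothetical)
K1, K2 = 1.0, 2.5       # hypothetical channel loss coefficients

def dp(k: float, m: float) -> float:
    """Quadratic channel pressure drop, dp = k * m**2."""
    return k * m * m

def residual(m1: float) -> float:
    # the pressure decrease around the closed loop channel 1 -> channel 2 must vanish
    return dp(K1, m1) - dp(K2, M_TOTAL - m1)

m1 = brentq(residual, 0.0, M_TOTAL)
print(f"channel 1: {m1:.3f} kg/s, channel 2: {M_TOTAL - m1:.3f} kg/s")
```

    scipy.optimize.brentq needs a sign change over the bracket, which holds here because the residual is negative at m1 = 0 and positive at m1 = M_TOTAL.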

  9. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, decoder implementation remains a major practical problem, and low-complexity decoding has been studied. This paper addresses a low complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding, and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.
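
    A sketch of only the CRC-selection step of CRC-aided list decoding, under assumptions: a toy CRC-8 polynomial and a hand-made candidate list stand in for the output of a real successive cancellation list decoder.

```python
POLY = 0x07  # CRC-8 generator x^8 + x^2 + x + 1 (illustrative choice)

def crc8_bits(bits):
    """Serial (shift-register style) CRC-8 over a sequence of 0/1 bits."""
    reg = 0
    for b in bits:
        top = ((reg >> 7) & 1) ^ b
        reg = (reg << 1) & 0xFF
        if top:
            reg ^= POLY
    return [(reg >> i) & 1 for i in reversed(range(8))]

def attach_crc(payload):
    return payload + crc8_bits(payload)

def select_path(candidates):
    """Return the first candidate whose CRC checks; None if every path fails."""
    for cand in candidates:
        payload, check = cand[:-8], cand[-8:]
        if crc8_bits(payload) == check:
            return payload
    return None

true_payload = [1, 0, 1, 1, 0, 0, 1, 0]
codeword = attach_crc(true_payload)
corrupted = codeword.copy()
corrupted[0] ^= 1                      # a candidate path with a bit error

print(select_path([corrupted, codeword]) == true_payload)   # -> True
```

    In an actual decoder the candidate list would come from the successive cancellation list search, and distributing several shorter CRCs along the codeword additionally allows decoding to stop early, as the abstract describes.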

  10. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  11. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  12. Calculations of different transmutation concepts. An international benchmark exercise

    International Nuclear Information System (INIS)

    2000-01-01

    In April 1996, the NEA Nuclear Science Committee (NSC) Expert Group on Physics Aspects of Different Transmutation Concepts launched a benchmark exercise to compare different transmutation concepts based on pressurised water reactors (PWRs), fast reactors, and an accelerator-driven system. The aim was to investigate the physics of complex fuel cycles involving reprocessing of spent PWR reactor fuel and its subsequent reuse in different reactor types. The objective was also to compare the calculated activities for individual isotopes as a function of time for different plutonium and minor actinide transmutation scenarios in different reactor systems. This report gives the analysis of results of the 15 solutions provided by the participants: six for the PWRs, six for the fast reactor and three for the accelerator case. Various computer codes and nuclear data libraries were applied. (author)

  13. Coherent concepts are computed in the anterior temporal lobes.

    Science.gov (United States)

    Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J

    2010-02-09

    In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.

  14. Parallelization characteristics of the DeCART code

    International Nuclear Information System (INIS)

    Cho, J. Y.; Joo, H. G.; Kim, H. Y.; Lee, C. C.; Chang, M. H.; Zee, S. Q.

    2003-12-01

    This report describes the parallelization characteristics of the DeCART code and also examines its parallel performance. Parallel computing algorithms are implemented in DeCART to reduce the tremendous computational burden and memory requirement involved in the three-dimensional whole core transport calculation. In the parallelization of the DeCART code, the axial domain decomposition is first realized by using MPI (Message Passing Interface), and then the azimuthal angle domain decomposition by using either MPI or OpenMP. When using MPI for both the axial and the angle domain decomposition, the concept of MPI grouping is employed for convenient communication in each communication world. For the parallel computation, almost all computing modules except for the thermal hydraulic module are parallelized. These parallelized computing modules include the MOC ray tracing, CMFD, NEM, region-wise cross section preparation and cell homogenization modules. For the distributed allocation, almost all MOC and CMFD/NEM variables are allocated only for the assigned planes, which reduces the required memory by a ratio of the number of the assigned planes to the number of all planes. The parallel performance of the DeCART code is evaluated by solving two problems, a rodded variation of the C5G7 MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In terms of parallel performance, the DeCART code shows a good speedup of about 40.1 and 22.4 in the ray tracing module and about 37.3 and 20.2 in the total computing time when using 48 CPUs on the IBM Regatta and 24 CPUs on the LINUX cluster, respectively. In the comparison between MPI and OpenMP, OpenMP shows a somewhat better performance than MPI. Therefore, it is concluded that the first priority in the parallel computation of the DeCART code is in the axial domain decomposition by using MPI, and then in the angular domain using OpenMP, and finally the angular
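
    A hedged mpi4py sketch of the first level of parallelism described above, block-decomposing axial planes over MPI ranks; the plane count and the per-plane "work" are placeholders, and this is not DeCART code.

```python
from mpi4py import MPI

N_PLANES = 24  # hypothetical number of axial planes

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# contiguous block of planes assigned to this rank (block decomposition)
per_rank = (N_PLANES + size - 1) // size
first = rank * per_rank
last = min(first + per_rank, N_PLANES)
my_planes = range(first, last)

# stand-in for the per-plane work (e.g. MOC ray tracing on the assigned planes)
local_result = float(sum(my_planes))

# combine the per-rank contributions, as a global reduction step would
total = comm.allreduce(local_result, op=MPI.SUM)

if rank == 0:
    print(f"{size} rank(s); planes per rank ~ {per_rank}; checksum = {total}")
```

    Run with, e.g., mpiexec -n 4 python planes.py (a hypothetical file name); a second level of threading over azimuthal angles inside each rank would mirror the MPI/OpenMP combination described above.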

  15. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
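
    As a concrete instance of the codes the book treats, here is a minimal rate-1/2 convolutional encoder with the textbook generator pair (7, 5) in octal; the example is generic and not taken from the book.

```python
G = (0b111, 0b101)        # generator polynomials, octal 7 and 5 (constraint length 3)

def conv_encode(bits):
    """Rate-1/2 feedforward convolutional encoder; returns 2 output bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # 3-bit shift register, newest bit lowest
        for g in G:
            out.append(bin(state & g).count("1") & 1)   # parity of the tapped register bits
    return out

print(conv_encode([1, 0, 1, 1]))   # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

    Feeding the encoder the bits 1, 0, 1, 1 produces the eight output bits 1 1 1 0 0 0 0 1, the kind of sequence a Viterbi or BCJR decoder of the sort discussed in the book would then decode.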

  16. Development of a master model concept for DEMO vacuum vessel

    International Nuclear Information System (INIS)

    Mozzillo, Rocco; Marzullo, Domenico; Tarallo, Andrea; Bachmann, Christian; Di Gironimo, Giuseppe

    2016-01-01

    Highlights: • The present work concerns the development of a first master concept model for DEMO vacuum vessel. • A parametric-associative CAD master model concept of a DEMO VV sector has been developed in accordance with DEMO design guidelines. • A proper CAD design methodology has been implemented in view of the later FEM analyses based on “shell elements”. - Abstract: This paper describes the development of a master model concept of the DEMO vacuum vessel (VV) conducted within the framework of the EUROfusion Consortium. Starting from the VV space envelope defined in the DEMO baseline design 2014, the layout of the VV structure was preliminarily defined according to the design criteria provided in RCC-MRx. A surface modelling technique was adopted and efficiently linked to the finite element (FE) code to simplify future FE analyses. In view of possible changes to shape and structure during the conceptual design activities, a parametric design approach allows incorporating modifications to the model efficiently.

  18. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  19. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  20. Forward Error Correcting Codes for 100 Gbit/s Optical Communication Systems

    DEFF Research Database (Denmark)

    Li, Bomin

    , a denser WDM grid changes the shape of the BER curve based on the analysis of the experimental results, which requires a stronger FEC code. Furthermore, a proof-of-the-concept hardware implementation is presented. The tradeoff between the code length, the CG and the complexity requires more consideration......-complexity low-power-consumption FEC hardware implementation plays an important role in the next generation energy efficient networks. Thirdly, a joint research is required for FEC integrated applications as the error distribution in channels relies on many factors such as non-linearity in long distance optical...... and their associated experimental demonstration and hardware implementation. The demonstrated high CG, flexibility, robustness and scalability reveal the important role of FEC techniques in the next generation high-speed, high-capacity, high performance and energy-efficient fiber-optic data transmission networks....

  1. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  2. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  5. Legitimization Arguments for Procedural Reforms: a semio-linguistic analysis of statement of reasons from the Civil Procedure Code of 1939 and of the draft bill of the New Civil Procedure Code of 2010.

    Directory of Open Access Journals (Sweden)

    Matheus Guarino Sant’Anna Lima de Almeida

    2016-08-01

    Full Text Available This research aims to analyze the arguments of legitimization that were used in the reform of Brazilian procedural legal codes, by comparing the texts of the statement of reasons of the Civil Procedure Code of 1939 and the draft bill of the New Civil Procedure Code. We consider these codes as milestones: the Civil Procedure Code of 1939 was the first one with a national scope; the draft bill of the New Civil Procedure Code was the first one produced during a democratic period. Our goal is to search for similarities and contrasts between the legitimization arguments used in each historical and political period, by asking whether they were only arguments to bestow legitimacy on such reforms. We decided to use the methodological tools of sociolinguistic analysis of speech developed by Patrick Charaudeau in his analyses of political speech in order to elucidate how the uses of language and elements of meaning in the speech construction provide justification for the concept of procedure, in both 1939 and 2010. As a result, we conclude that the drafting processes of the CPC of 1939 and the New CPC, even though they are very distant in terms of political and historical context, are very close in their rhetorical construction and in their attempt to find justification and adherence. On balance, some of the differences depend on the vocabulary used when the codes were developed, their justification and the need for change.

  6. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named 'XSOR'. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  7. Spirituality in nursing: an analysis of the concept.

    Science.gov (United States)

    Mahlungulu, S N; Uys, L R

    2004-05-01

    There is scientific evidence that the spiritual well being of a person can affect quality of life and the response to illness, pain, suffering and even death. In spite of this evidence, spirituality in nursing has not been examined within a South African context. The purpose of this study was to describe the phenomenon of spirituality from the perspective of nurses and patients/clients with the aim of generating a middle range theory of spiritual care in nursing. A qualitative mode of inquiry using a grounded theory method was applied. A sample of 56 participants composed of 40 nurses, 14 patients and 2 relatives of patients was recruited by theoretical sampling procedure from one public hospital, one private hospital and one hospice setting. Focus group interviews and one on one in depth interviews were conducted. An audio tape recorder was used to record the interviews. Field notes and memos were also kept. Data were collected and analyzed simultaneously. Non numerical Data Qualification Solutions NUDIST software was used to code data into different levels of codes. The results were rich descriptions of the concept of spirituality. This concept was described as a unique individual quest for establishing and, or, maintaining a dynamic transcendent relationship with self, others and with God/supernatural being as understood by the person. Faith, trust and religious belief were reported as antecedents of spirituality, while hope, inner peace and meaningful life were reported to be consequences of spirituality.

  8. Protection of data carriers using secure optical codes

    Science.gov (United States)

    Peters, John A.; Schilling, Andreas; Staub, René; Tompkin, Wayne R.

    2006-02-01

    Smartcard technologies, combined with biometric-enabled access control systems, are required for many high-security government ID card programs. However, recent field trials with some of the most secure biometric systems have indicated that smartcards are still vulnerable to well equipped and highly motivated counterfeiters. In this paper, we present the Kinegram Secure Memory Technology which not only provides a first-level visual verification procedure, but also reinforces the existing chip-based security measures. This security concept involves the use of securely-coded data (stored in an optically variable device) which communicates with the encoded hashed information stored in the chip memory via a smartcard reader device.
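
    A minimal illustration of the basic cross-check between data carried by an optical feature and a hash stored in chip memory; the commercial Kinegram scheme itself is proprietary and not reproduced here, and the payload and field layout below are hypothetical.

```python
import hashlib

card_holder_record = b"DOE JOHN|1980-01-01|ID1234567"       # hypothetical optical payload
hash_in_chip = hashlib.sha256(card_holder_record).digest()  # stored in chip memory at issuance

def verify(optical_payload: bytes, chip_hash: bytes) -> bool:
    """Check that the optically read data matches the hash held by the chip."""
    return hashlib.sha256(optical_payload).digest() == chip_hash

print(verify(card_holder_record, hash_in_chip))                     # True for a genuine card
print(verify(b"DOE JANE|1980-01-01|ID1234567", hash_in_chip))       # False if the data is altered
```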

  9. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X......, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low......-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information....

  10. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    Science.gov (United States)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purpose of this research was to develop learning management prepared for the enhancement of students' moral ethics and code of ethics of the teaching profession at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for learning management development were conducted through document study, the focus group method, and content analysis of documents about moral ethics and the code of ethics of the teaching profession concerning the Graduate Diploma for Teaching Profession Program. The main tools of this research were summary papers and analysis papers. The results showed that the developed learning management could promote the desired moral ethics and code of ethics of the teaching profession among Graduate Diploma for Teaching Profession students through integrated learning techniques consisting of Service Learning, Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  11. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  12. Use of the local-global concept in detecting component vibration in reactors

    International Nuclear Information System (INIS)

    Al-Ammar, M.A.

    1981-01-01

    The local-global concept, based on the detector adjoint function, was used to develop the response of a detector to an absorber vibrating in one dimension. A one-dimensional two-group diffusion code was developed to calculate the frequency-dependent detector response as a function of detector and absorber positions for the coupled-core UTR-10 reactor. Results from this code indicated the best possible detector and absorber locations, for which more detailed calculations were made using a two-group, three-dimensional diffusion code with finite detector and absorber volumes. An experiment was then designed, for the chosen positions, using a vibrating cadmium absorber with a detector on each side. The assembly was placed in the vertical central stringer of the reactor. Investigations were carried out for vibrations in two flux gradients and experimental data were analyzed in the frequency domain using a microcomputer-based data acquisition system. The experimental investigation showed the validity of the local-global concept. A normalized output cross power spectral density was developed that correctly predicted the different flux tilts in the two flux gradients. It was also shown that the frequency response of the local component had a wide plateau region. Monitoring the behavior of the normalized cross power spectral density was thought to be a promising indicator for the detection and localization of malfunctioning vibrating components. It might also be used to detect flux irregularities in the vicinity of a vibrating component
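
    A sketch of the signal-processing step only, not the reactor experiment: estimating the cross power spectral density between two simulated detector signals with SciPy and normalizing it by the auto-spectra to obtain a coherence-like indicator; the sampling rate, vibration frequency, and noise level are made up.

```python
import numpy as np
from scipy.signal import csd, welch

fs = 100.0                          # sampling frequency, Hz (hypothetical)
t = np.arange(0, 60, 1 / fs)
vib = np.sin(2 * np.pi * 5.0 * t)   # 5 Hz "absorber vibration" component
rng = np.random.default_rng(0)
det1 = vib + 0.5 * rng.standard_normal(t.size)    # detector on one side
det2 = -vib + 0.5 * rng.standard_normal(t.size)   # opposite phase on the other side

f, Pxy = csd(det1, det2, fs=fs, nperseg=1024)
_, Pxx = welch(det1, fs=fs, nperseg=1024)
_, Pyy = welch(det2, fs=fs, nperseg=1024)

normalized = np.abs(Pxy) ** 2 / (Pxx * Pyy)       # coherence-like normalized CPSD
print(f"peak of normalized CPSD at {f[np.argmax(normalized)]:.1f} Hz")
```

    The peak of the normalized spectrum sits at the assumed 5 Hz vibration frequency, which is the kind of signature the monitoring scheme above relies on.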

  13. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  14. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  15. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Austregesilo, H.; Velkov, K. [GRS, Garching (Germany)] [and others]

    1997-07-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes.

  16. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    International Nuclear Information System (INIS)

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-01-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes

  17. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  18. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, the code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
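
    The kind of annotation discussed here can be pictured with a small, hypothetical example that is not taken from AUTOBAYES: a loop whose invariant is written out explicitly, with runtime assertions standing in for the proof obligations a verification condition generator would emit.

```python
# Minimal sketch of the kind of loop-invariant annotation certification needs.
# The asserts stand in for formal proof obligations; a real certifier would
# discharge them with a theorem prover rather than checking them at run time.
from typing import List

def array_sum(xs: List[float]) -> float:
    total = 0.0
    i = 0
    # Loop invariant: total == sum(xs[:i]) and 0 <= i <= len(xs)
    while i < len(xs):
        assert abs(total - sum(xs[:i])) < 1e-9 and 0 <= i <= len(xs)
        total += xs[i]
        i += 1
    # Postcondition follows from the invariant and the negated loop guard.
    assert abs(total - sum(xs)) < 1e-9
    return total

print(array_sum([1.0, 2.5, -0.5]))  # 3.0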

  19. New Concept of PLC Modems: Multi-Carrier System for Frequency Selective Slow-Fading Channels Based on Layered SCCC Turbocodes

    Directory of Open Access Journals (Sweden)

    J. Zavrtalek

    2015-09-01

    Full Text Available The article introduces a novel concept of a PLC modem as a complement to the existing G3 and PRIME standards for communications using medium- or high-voltage overhead or cable lines. The proposed concept is based on the fact that the levels of impulse noise and frequency selectivity are lower on high-voltage lines than on low-voltage ones. Also, the demands for “cost-effective” circuitry design are not so crucial as in the case of modems for the low-voltage level. In contrast to these positive conditions, however, there is the need to overcome much longer distances and to take into account low SNR on the receiving side. With respect to the listed reasons, our concept makes use of MCM instead of OFDM. The assumption of low SNR is compensated through the use of efficient channel coding based on a serially concatenated turbo code. In addition, MCM offers lower latency and PAPR compared to OFDM. Therefore, when using MCM, it is possible to excite the line with higher power. The proposed concept has been verified during experimental transmission of testing data over a real, 5 km long, 22 kV overhead line.

  20. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  1. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  2. Programmer's guide for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    International Nuclear Information System (INIS)

    Dougan, K.D.

    1996-11-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) was developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is described by the CC3 (Canadian Concept, generation 3) model. The CC3 model is comprised of the disposal vault submodel, the local geosphere submodel and the biosphere submodel. The CC3 Programmer's Guide describes the programming philosophy and programming conventions not covered in the project standards. The guide includes a description of the overall logic for the CC3 vault, geosphere, and biosphere submodels. Each of the CC3 submodels is also isolated from the other two submodels to create autonomous or 'stand-alone' submodels. The techniques used to isolate a CC3 submodel, and in particular to determine the submodel's input and output data interface, are described. Structure charts are provided for the CC3 model and stand-alone submodels. This guide is meant as a companion document to the CC3 User's Manual. This guide does not describe how to use the CC3 software. The user should consult the CC3 User's Manual to determine how to configure, compile, link, and run the CC3 source code, as well as how to modify the data in the input files. It is intended that the CC3 code version CC305 be executed with SYVAC3 version SV309 and the Modelling Algorithm Library (ML3) version ML303, both developed for the assessment of the concept. SYVAC3-CC3-ML3 (also referred to as 'SC3') can be run on any platform containing an ANSI FORTRAN 77 compliant compiler. Recommended hardware environments are specified in the CC3 User's Manual. (author)

  3. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Full Text Available Successive cancellation list (SCL decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC decoding, provided that proper cyclic redundancy-check (CRC codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with CRC code as their outer code, the use of longer CRC code leads to reduction of information rate, whereas the use of shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.
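
    The role of the outer CRC in candidate selection can be sketched in a few lines. This is a generic illustration under assumed parameters (an 8-bit CRC with an assumed generator polynomial, applied to a hypothetical candidate list); the SCL decoder that would actually produce the candidates is not shown.

```python
# Minimal sketch of CRC-aided candidate selection in an SCL decoder.
# The candidate list would come from the polar list decoder (not shown).
CRC8_POLY = 0x07  # assumed generator polynomial x^8 + x^2 + x + 1

def crc8(bits):
    """Bitwise CRC-8 (initial value 0) over a list of 0/1 values."""
    reg = 0
    for b in bits:
        fb = ((reg >> 7) & 1) ^ b        # feedback bit
        reg = (reg << 1) & 0xFF
        if fb:
            reg ^= CRC8_POLY
    return [(reg >> (7 - i)) & 1 for i in range(8)]

def attach_crc(info):
    return info + crc8(info)

def select_candidate(candidates):
    """Return the most likely candidate (listed first) that passes the CRC."""
    for cand in candidates:              # candidates assumed sorted by path metric
        info, crc = cand[:-8], cand[-8:]
        if crc8(info) == crc:
            return info
    return None                          # declare a frame error if none passes

tx = attach_crc([1, 0, 1, 1, 0, 0, 1, 0])
wrong = tx[:]
wrong[0] ^= 1                            # a corrupted list entry
print(select_candidate([wrong, tx]))     # -> the original info bits
```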

  4. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...

  5. A systematic concept of assuring structural integrity of components and parts for applying to highly ductile materials through brittle material

    International Nuclear Information System (INIS)

    Suzuki, Kazuhiko

    2007-09-01

    Concepts of assuring the structural integrity of plant components have been developed under the limited conditions of either highly ductile or brittle materials. There are cases where operation under increasingly severe conditions causes a significant reduction in ductility for materials that had high ductility before service. The use of high-strength steels with relatively reduced ductility is also increasing in industry applications. Current concepts of structural integrity assurance, which rely on limited conditions of material properties or on the requirement of no significant changes in material properties even after long service, will fail to incorporate expected technological innovations. A systematic concept of assuring structural integrity should therefore be developed that applies to materials ranging from highly ductile through brittle. The objectives of the on-going research are to propose the details of such a systematic concept, by considering how the concept can be developed without restricting materials and by systematically considering a broad range of material properties from highly ductile materials through brittle materials. First, the background of the concepts of existing structural codes for components of highly ductile materials or for structural parts of brittle materials is discussed. Next, issues of the existing code for parts of brittle materials are identified, and then resolutions to the issues are proposed. Based on the above-mentioned discussions and proposals, a systematic concept is proposed for application to components of reduced-ductility materials and to components of materials whose properties change significantly during long service. (author)

  6. Examining moral thinking of adolescents through intermediate concepts

    Directory of Open Access Journals (Sweden)

    Frichand Ana

    2011-01-01

    Full Text Available This study examines moral thinking of adolescents through intermediate concepts. Intermediate concepts describe a level of analysis that falls between the general default schemas defined in Kohlberg's theory and specific ethical codes. They are related to the ability to identify good and bad actions and justifications in solving specific moral dilemmas. Participants were adolescent males and females in early, middle and late adolescence. The type of education, expressed antisocial behaviour and the primary group of socialization (family) were analyzed as well. The results indicate that the ability to identify good and bad actions and justifications increases with age. Female adolescents have higher scores on this ability than male adolescents. Individuals in late adolescence, who concentrate more on moral values and principles during education, show a higher ability in identifying bad actions and justifications. In middle adolescence, those who exhibit antisocial behaviour have a lower ability in identifying intermediate concepts, compared to their peers who do not show this type of behaviour. Similar results hold for those living in institutions for children without parents and parental care when compared to adolescents who are living with their parents.

  7. Strict optical orthogonal codes for purely asynchronous code-division multiple-access applications

    Science.gov (United States)

    Zhang, Jian-Guo

    1996-12-01

    Strict optical orthogonal codes are presented for purely asynchronous optical code-division multiple-access (CDMA) applications. The proposed code can strictly guarantee the peaks of its cross-correlation functions and the sidelobes of any of its autocorrelation functions to have a value of 1 in purely asynchronous data communications. The basic theory of the proposed codes is given. An experiment on optical CDMA systems is also demonstrated to verify the characteristics of the proposed code.

  8. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  9. Quantum Codes From Negacyclic Codes over Group Ring ( Fq + υFq) G

    International Nuclear Information System (INIS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-01-01

    In this paper, we determine self-dual and self-orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq)G. By taking a suitable Gray image of these codes we obtain many quantum error-correcting codes with good parameters over Fq. (paper)

  10. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. Convolutional, turbo, and low density parity-check (LDPC) coding and polar codes in a unified framework. Advanced research-related developments such as spatial coupling. A focus on algorithmic and implementation aspects of error control coding.

  11. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
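
    The idea can be made concrete with a toy sketch using the (7,4) Hamming parity-check matrix, which is my own choice for illustration rather than anything from the paper: each 7-bit source block is compressed to its 3-bit syndrome, and the decoder returns the minimum-weight pattern in that coset, so reconstruction is exact only for the low-weight blocks a sparse binary source mostly emits.

```python
# Toy syndrome-source-coding sketch with the (7,4) Hamming parity-check matrix.
# Compression: 7 source bits -> 3 syndrome bits. Reconstruction is exact only
# when the block contains at most one '1' (the coset leaders of this code),
# which is the typical case for a sufficiently sparse binary source.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def compress(block):                       # block: length-7 array of 0/1
    return H.dot(block) % 2                # 3-bit syndrome

def decompress(syndrome):
    est = np.zeros(7, dtype=int)
    if syndrome.any():                     # non-zero syndrome -> weight-1 coset leader
        col = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
        est[col] = 1
    return est

block = np.array([0, 0, 0, 0, 1, 0, 0])    # sparse source block
s = compress(block)
print(s, decompress(s))                    # syndrome and exact reconstruction
```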

  12. Simulation of a gas cooled reactor with the system code CATHARE

    International Nuclear Information System (INIS)

    Bentivoglio, Fabrice; Ruby, Alain; Geffraye, Genevieve; Messie, Anne; Saez, Manuel; Tauveron, Nicolas; Widlund, Ola

    2006-01-01

    In recent years the CEA has commissioned a wide range of feasibility studies of future advanced nuclear reactors, in particular gas-cooled reactors (GCR). This paper presents an overview of the use of the thermohydraulics code CATHARE in these activities. Extensively validated and qualified for pressurized water reactors, CATHARE has been adapted to deal also with gas-cooled reactor applications. Rather than branching off a separate GCR version of CATHARE, new features have been integrated as independent options in the standard version of the code, respecting the same stringent procedures for documentation and maintenance. CATHARE has evolved into an efficient tool for GCR applications, with first results in good agreement with existing experimental data and other codes. The paper gives an example from the studies already carried out with CATHARE: the case of the Very High Temperature Reactor (VHTR) concept. Current and future activities for experimental validation of CATHARE for GCR applications are also discussed. Short-term validation activities include the assessment of the German utility Oberhausen II. For the long term, CEA has initiated an ambitious experimental program ranging from small-scale loops for physical correlations to component technology and system demonstration loops. (authors)

  13. Transports of radioactive materials. Legal regulations, safety and security concepts, experience

    International Nuclear Information System (INIS)

    Schwarz, Guenther

    2012-01-01

    In Germany, approximately 650,000 to 750,000 units containing radioactive materials for scientific, medical and technical applications are shipped annually by surface, air and water transports. Legally speaking, radioactive materials are dangerous goods which can cause hazards to life, health, property and the environment as a result of faulty handling or accidents in transit. For protection against these hazards, their shipment therefore is regulated in extensive national and international rules of protection and safety. The article contains a topical review of the international and national transport regulations and codes pertaining to shipments of radioactive materials, and of the protection concepts underlying these codes so as to ensure an adequate standard of safety and security in shipping radioactive materials in national and international goods traffic. (orig.)

  14. Benefits of Low Boron Core Design Concept for PWR

    Energy Technology Data Exchange (ETDEWEB)

    Daing, Aung Tharn; Kim, Myung Hyun [Kyung Hee University, Yongin (Korea, Republic of)]

    2009-10-15

    Nuclear design study was carried out to develop low boron core (LBC) based on one of current PWR concepts, OPR-1000. Most of design parameters were the same with those of Ulchin unit-5 except extensive utilization of burnable poison (BP) pins in order to compensate reactivity increase in LBC. For replacement of reduced soluble boron concentration, four different kinds of integral burnable absorbers (IBAs) such as gadolinia, integral fuel burnable absorber (IFBA), erbia and alumina boron carbide were considered in suppressing more excess reactivity. A parametric study was done to find the optimal core options from many design candidates for fuel assemblies and cores. Among them, the most feasible core design candidate was chosen in accordance with general design requirements. In this paper, the feasibility and design change benefits of the most favorable LBC design were investigated in more detail through the comparison of neutronic and thermal hydraulic design parameters of LBC with the reference plant (REF). As calculation tools, the HELIOS/MASTER code package and the MATRA code were utilized. The main purpose of research herein is to estimate feasibility and capability of LBC which was mainly designed to mitigate boron dilution accident (BDA), and for reduction of corrosion products. The LBC design concept using lower boron concentration with an elevated enrichment in 10B allows a reduction in the concentration of lithium in the primary coolant required to maintain the optimum coolant pH. All in all, LBC with operation at optimum pH is expected to achieve some benefits from radiation source reduction of reduced corrosion product, the limitation of the Axial Offset Anomaly (AOA) and fuel cladding corrosion. Additionally, several merits of LBC are closely related to fluid systems and system related aspects, reduced boron and lithium costs, equipment size reduction for boric acid systems, elimination of heat tracing, and more aggressive fuel design concepts.

  15. Benefits of Low Boron Core Design Concept for PWR

    International Nuclear Information System (INIS)

    Daing, Aung Tharn; Kim, Myung Hyun

    2009-01-01

    Nuclear design study was carried out to develop low boron core (LBC) based on one of current PWR concepts, OPR-1000. Most of design parameters were the same with those of Ulchin unit-5 except extensive utilization of burnable poison (BP) pins in order to compensate reactivity increase in LBC. For replacement of reduced soluble boron concentration, four different kinds of integral burnable absorbers (IBAs) such as gadolinia, integral fuel burnable absorber (IFBA), erbia and alumina boron carbide were considered in suppressing more excess reactivity. A parametric study was done to find the optimal core options from many design candidates for fuel assemblies and cores. Among them, the most feasible core design candidate was chosen in accordance with general design requirements. In this paper, the feasibility and design change benefits of the most favorable LBC design were investigated in more detail through the comparison of neutronic and thermal hydraulic design parameters of LBC with the reference plant (REF). As calculation tools, the HELIOS/MASTER code package and the MATRA code were utilized. The main purpose of research herein is to estimate feasibility and capability of LBC which was mainly designed to mitigate boron dilution accident (BDA), and for reduction of corrosion products. The LBC design concept using lower boron concentration with an elevated enrichment in 10 B allows a reduction in the concentration of lithium in the primary coolant required to maintain the optimum coolant pH. All in all, LBC with operation at optimum pH is expected to achieve some benefits from radiation source reduction of reduced corrosion product, the limitation of the Axial Offset Anomaly (AOA) and fuel cladding corrosion. Additionally, several merits of LBC are closely related to fluid systems and system related aspects, reduced boron and lithium costs, equipment size reduction for boric acid systems, elimination of heat tracing, and more aggressive fuel design concepts

  16. Mobile code security

    Science.gov (United States)

    Ramalingam, Srikumar

    2001-11-01

    A highly secure mobile agent system is very important for a mobile computing environment. The security issues in a mobile agent system comprise protecting mobile hosts from malicious agents, protecting agents from other malicious agents, protecting hosts from other malicious hosts and protecting agents from malicious hosts. Using traditional security mechanisms the first three security problems can be solved. Apart from using trusted hardware, very few approaches exist to protect mobile code from malicious hosts. Some of the approaches to solve this problem are the use of trusted computing, computing with encrypted functions, steganography, cryptographic traces, the Seal Calculus, etc. This paper focuses on the simulation of some of these existing techniques in the designed mobile language. Some new approaches to solve the malicious network problem and the agent tampering problem are developed using a public key encryption system and steganographic concepts. The approaches are based on encrypting and hiding the partial solutions of the mobile agents. The partial results are stored and the address of the storage is destroyed as the agent moves from one host to another host. This allows only the originator to make use of the partial results. Through these approaches some of the existing problems are solved.

  17. ComboCoding: Combined intra-/inter-flow network coding for TCP over disruptive MANETs

    Directory of Open Access Journals (Sweden)

    Chien-Chia Chen

    2011-07-01

    Full Text Available TCP over wireless networks is challenging due to random losses and ACK interference. Although network coding schemes have been proposed to improve TCP robustness against extreme random losses, a critical problem still remains of DATA–ACK interference. To address this issue, we use inter-flow coding between DATA and ACK to reduce the number of transmissions among nodes. In addition, we also utilize a “pipeline” random linear coding scheme with adaptive redundancy to overcome high packet loss over unreliable links. The resulting coding scheme, ComboCoding, combines intra-flow and inter-flow coding to provide robust TCP transmission in disruptive wireless networks. The main contributions of our scheme are twofold: the efficient combination of random linear coding and XOR coding on bi-directional streams (DATA and ACK), and the novel redundancy control scheme that adapts to time-varying and space-varying link loss. The adaptive ComboCoding was tested on a variable hop string topology with unstable links and on a multipath MANET with dynamic topology. Simulation results show that TCP with ComboCoding delivers higher throughput than with other coding options in high loss and mobile scenarios, while introducing minimal overhead in normal operation.
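
    The inter-flow (XOR) part of such a scheme can be sketched in a few lines. This is a generic illustration of XOR-coding a DATA packet with an ACK at a relay, not the authors' implementation; the intra-flow random linear coding and the adaptive redundancy control are not shown.

```python
# Minimal sketch of inter-flow XOR coding between a DATA packet and an ACK.
# A relay broadcasts DATA XOR ACK once; each endpoint recovers the packet it
# is missing by XORing with the packet it already holds.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")   # pad to equal length
    return bytes(x ^ y for x, y in zip(a, b))

data = b"DATA: payload from source to destination"
ack = b"ACK: seq=42"

coded = xor_bytes(data, ack)                 # single broadcast by the relay

recovered_ack = xor_bytes(coded, data)       # at the node that already has DATA
recovered_data = xor_bytes(coded, ack)       # at the node that already has ACK
print(recovered_ack.rstrip(b"\x00"))
print(recovered_data.rstrip(b"\x00"))
```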

  18. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  19. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
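
    The sparse-coding step itself (approximating a sample as a sparse combination of codewords) can be sketched with a plain ISTA iteration. This is only a generic sub-step, not the authors' joint semi-supervised objective, and the dictionary below is random purely for illustration.

```python
# Minimal sketch of the sparse-coding sub-problem:
#   min_a 0.5*||x - D a||^2 + lam*||a||_1
# solved by ISTA (iterative soft-thresholding). The joint learning of the
# codebook, class labels and classifier described in the paper is not shown.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_sparse_code(x, D, lam=0.1, n_iter=200):
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T.dot(D.dot(a) - x)
        a = soft_threshold(a - grad / L, lam / L)
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                 # unit-norm codewords
x = D[:, 3] * 2.0 + D[:, 70] * -1.5            # sample built from two codewords
a = ista_sparse_code(x, D)
print(np.flatnonzero(np.abs(a) > 0.1))         # indices of the active codewords
```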

  20. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  1. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
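
    A girth-4 condition on an exponent matrix can be checked with a short routine. The sketch below covers only the standard condition for a fully connected exponent matrix with circulant size Z, not the paper's joint source-relay construction or its type I/II classification: a length-4 cycle exists exactly when some 2x2 sub-array of exponents satisfies e11 - e12 + e22 - e21 ≡ 0 (mod Z).

```python
# Sketch: detect girth-4 cycles in a QC-LDPC exponent matrix with circulant
# size Z. For a fully connected exponent matrix E, a length-4 cycle exists iff
# some 2x2 sub-array satisfies E[i1,j1]-E[i1,j2]+E[i2,j2]-E[i2,j1] = 0 (mod Z).
from itertools import combinations
import numpy as np

def has_girth4(E, Z):
    m, n = E.shape
    for i1, i2 in combinations(range(m), 2):
        for j1, j2 in combinations(range(n), 2):
            if (E[i1, j1] - E[i1, j2] + E[i2, j2] - E[i2, j1]) % Z == 0:
                return True
    return False

Z = 7
E_bad = np.array([[0, 1, 2],
                  [0, 1, 2]])       # identical rows -> obvious 4-cycles
E_ok = np.array([[0, 1, 2],
                 [0, 2, 5]])
print(has_girth4(E_bad, Z), has_girth4(E_ok, Z))   # True False
```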

  2. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jin; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    2004-07-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  3. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2004-01-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures

  4. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  5. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    Science.gov (United States)

    Ganander, Hans

    2003-10-01

    For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models. This is done relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for rapid generation of codes of particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation and using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.

  6. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  7. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. RELAP5/MOD3 code coupling model

    International Nuclear Information System (INIS)

    Martin, R.P.; Johnsen, G.W.

    1994-01-01

    A new capability has been incorporated into RELAP5/MOD3 that enables the coupling of RELAP5/MOD3 to other computer codes. The new capability has been designed to support analysis of the new advanced reactor concepts. Its user features rely solely on new RELAP5 "styled" input and the Parallel Virtual Machine (PVM) software, which facilitates process management and distributed communication of multiprocess problems. RELAP5/MOD3 manages the input processing, communication instruction, process synchronization, and its own send and receive data processing. The flexible capability requires that an explicit coupling be established, which updates boundary conditions at discrete time intervals. Two test cases are presented that demonstrate the functionality, applicability, and issues involving use of this capability.
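
    The explicit coupling described here can be pictured with a generic sketch: two placeholder solvers exchange boundary conditions at fixed coupling intervals and then each advance independently. All function names and the toy physics below are hypothetical; the real capability exchanges data between separate processes via PVM rather than within one script.

```python
# Generic sketch of explicit code coupling: boundary conditions are exchanged
# at discrete coupling intervals, then each code advances on its own. The toy
# "solvers" and their state are hypothetical placeholders.
def advance_thermal_hydraulics(state, power, dt):
    # stand-in for a T/H step driven by the power boundary condition
    state["coolant_temp"] += 0.01 * power * dt - 0.5 * dt
    return state

def advance_neutronics(state, coolant_temp, dt):
    # stand-in for a kinetics step with temperature feedback
    state["power"] *= 1.0 + dt * (0.002 - 1e-5 * (coolant_temp - 300.0))
    return state

th = {"coolant_temp": 300.0}
nk = {"power": 100.0}
dt_couple = 0.1                              # explicit coupling interval [s]

for step in range(50):
    # exchange boundary conditions once per coupling interval
    power_bc = nk["power"]
    temp_bc = th["coolant_temp"]
    th = advance_thermal_hydraulics(th, power_bc, dt_couple)
    nk = advance_neutronics(nk, temp_bc, dt_couple)

print(round(nk["power"], 2), round(th["coolant_temp"], 2))
```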

  9. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  10. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  11. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  12. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that come close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  13. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
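
    As one concrete instance of waveform coding, the classical μ-law companding used in 8-bit telephony can be sketched in a few lines; the companding formula is standard (μ = 255), while the synthetic input signal below is only a stand-in for real speech.

```python
# Sketch of mu-law companding, a classical waveform speech-coding step:
# compress the dynamic range, quantize to 8 bits, then expand at the far end.
import numpy as np

MU = 255.0

def mu_law_encode(x):                       # x in [-1, 1]
    y = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
    return np.round((y + 1.0) / 2.0 * 255.0).astype(np.uint8)   # 8-bit code

def mu_law_decode(code):
    y = code.astype(np.float64) / 255.0 * 2.0 - 1.0
    return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

fs = 8000
t = np.arange(0, 0.02, 1 / fs)
speech = 0.3 * np.sin(2 * np.pi * 440 * t)   # synthetic stand-in for speech
decoded = mu_law_decode(mu_law_encode(speech))
print(np.max(np.abs(decoded - speech)))      # small reconstruction error
```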

  14. ogs6 - a new concept for porous-fractured media simulations

    Science.gov (United States)

    Naumov, Dmitri; Bilke, Lars; Fischer, Thomas; Rink, Karsten; Wang, Wenqing; Watanabe, Norihiro; Kolditz, Olaf

    2015-04-01

    OpenGeoSys (OGS) is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THMC) processes in porous and fractured media, continuously developed since the mid-eighties. The basic concept is to provide a flexible numerical framework for solving coupled multi-field problems. OGS mainly targets applications in environmental geoscience, e.g. in the fields of contaminant hydrology, water resources management, waste deposits, or geothermal energy systems, but it has also recently been applied successfully to new topics in energy storage. OGS actively participates in several international benchmarking initiatives, e.g. DECOVALEX (waste management), CO2BENCH (CO2 storage and sequestration), SeSBENCH (reactive transport processes) and HM-Intercomp (coupled hydrosystems). Despite the broad applicability of OGS in geo-, hydro- and energy-sciences, several shortcomings became obvious concerning computational efficiency, and the code structure became too complex for further efficient development. OGS-5 was designed for object-oriented FEM applications. However, in many multi-field problems a certain flexibility of tailored numerical schemes is essential. Therefore, a new concept was designed to overcome existing bottlenecks. The paradigms for ogs6 are: flexibility of numerical schemes (FEM/FVM/FDM), computational efficiency (PetaScale ready), and developer- and user-friendliness. ogs6 has a module-oriented architecture based on thematic libraries (e.g. MeshLib, NumLib) on the large scale and uses an object-oriented approach for the small-scale interfaces. Usage of a linear algebra library (Eigen3) for the mathematical operations together with the ISO C++11 standard increases the expressiveness of the code and makes it more developer-friendly. The new C++ standard also makes the template metaprogramming code used for compile-time optimizations more compact. We have transitioned the main code development to

  15. Tri-code inductance control rod position indicator with several multi-coding-bars

    International Nuclear Information System (INIS)

    Shi Jibin; Jiang Yueyuan; Wang Wenran

    2004-01-01

    A control rod position indicator, named the tri-code inductance control rod position indicator with multi-coding-bars, which possesses a simple structure, reliable operation and high precision, has been developed. The detector of the indicator is composed of K coils, a compensatory coil and K coding bars. Each coding bar consists of several sections of strong magnetic cores, several sections of weak magnetic cores and several sections of non-magnetic portions. As the control rod is withdrawn, the coding bars move in the centers of the coils respectively, while a constant alternating current passes through the coils and makes them create alternating inductance voltage signals. The outputs of the coils are picked up and processed, and the tri-codes indicating the rod position are obtained. Moreover, the coding principle of the detector and its related structure are introduced. The analysis shows that the indicator has advantages over the coil-coding rod position indicator, so it can meet the demands of rod position indication in the nuclear heating reactor (NHR). (authors)

  16. Parallelization of a beam dynamics code and first large scale radio frequency quadrupole simulations

    Directory of Open Access Journals (Sweden)

    J. Xu

    2007-01-01

    Full Text Available The design and operation support of hadron (proton and heavy-ion) linear accelerators require substantial use of beam dynamics simulation tools. The beam dynamics code TRACK was originally developed at Argonne National Laboratory (ANL) to fulfill the special requirements of the rare isotope accelerator (RIA) accelerator systems. From the beginning, the code has been developed to make it useful in the three stages of a linear accelerator project, namely, the design, commissioning, and operation of the machine. To realize this concept, the code has unique features such as end-to-end simulations from the ion source to the final beam destination and automatic procedures for tuning of a multiple charge state heavy-ion beam. The TRACK code has become a general beam dynamics code for hadron linacs and has found wide applications worldwide. Until recently, the code has remained serial except for a simple parallelization used for the simulation of multiple seeds to study the machine errors. To speed up computation, the TRACK Poisson solver has been parallelized. This paper discusses different parallel models for solving the Poisson equation with the primary goal to extend the scalability of the code onto 1024 and more processors of the new generation of supercomputers known as BlueGene (BG/L). Domain decomposition techniques have been adapted and incorporated into the parallel version of the TRACK code. To demonstrate the new capabilities of the parallelized TRACK code, the dynamics of a 45 mA proton beam represented by 10^8 particles has been simulated through the 325 MHz radio frequency quadrupole and initial accelerator section of the proposed FNAL proton driver. The results show the benefits and advantages of large-scale parallel computing in beam dynamics simulations.

  17. NIMROD: A Customer Focused, Team Driven Approach for Fusion Code Development

    Science.gov (United States)

    Karandikar, H. M.; Schnack, D. D.

    1996-11-01

    NIMROD is a new code that will be used for the analysis of existing fusion experiments, prediction of operational limits, and design of future devices. An approach called Integrated Product Development (IPD) is being used for the development of NIMROD. It is a dramatic departure from existing practice in the fusion program. Code development is being done by a self-directed, multi-disciplinary, multi-institutional team that consists of experts in plasma theory, experiment, computational physics, and computer science. Customer representatives (ITER, US experiments) are an integral part of the team. The team is using techniques such as Quality Function Deployment (QFD), Pugh Concept Selection, Rapid Prototyping, and Risk Management, during the design phase of NIMROD. Extensive use is made of communication and internet technology to support collaborative work. Our experience with using these team techniques for such a complex software development project will be reported.

  18. Investigations for the EPR-concept - KAPOOL and KATS experiments

    International Nuclear Information System (INIS)

    Engel, G.; Eppinger, B.; Fieg, G.; Schmidt-Stiefel, S.

    2000-01-01

    The objective of the KAPOOL and KATS experiments is to investigate basic phenomena in connection with the EPR melt spreading and cooling concept. High-temperature Al2O3- and Fe-melts produced by the thermite reaction are used to simulate the oxidic and metallic components of the core melt. Two KAPOOL tests have been performed to study the interaction of the oxidic melt with the release gate which is situated between the cavity and the spreading compartment. These tests have been analyzed with the HEATING-5 code and compared with the experimental results. With test KATS-17 (spreading onto dry concrete for the oxide melt, spreading onto concrete with 1 mm water level for the metallic melt) the series of two-dimensional spreading experiments has been finished. KATS-15 (2-dim spreading on dry ceramics) has been analyzed with the code CORFLOW. (orig.)

  19. Fast-neutron, coded-aperture imager

    Science.gov (United States)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with a 1.8-μCi and a 13-μCi 252Cf neutron/γ source at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led
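
    The deconvolution step mentioned above is often performed by cyclic correlation of the recorded shadowgram with a decoding array derived from the mask. The sketch below illustrates only that generic principle on a random toy mask and an idealized synthetic shadowgram; it does not reproduce the NRL instrument's geometry, detector array, or its actual 12×12 pseudorandom pattern.

```python
# Toy sketch of coded-aperture decoding by cyclic correlation: the recorded
# shadowgram is correlated with a decoding array G = 2*M - 1 derived from the
# mask M. Mask, source and shadowgram are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
M = (rng.random((12, 12)) < 0.5).astype(float)    # toy 12x12 mask (1 = open)
G = 2.0 * M - 1.0                                 # balanced decoding array

src = np.zeros((12, 12))
src[4, 7] = 100.0                                 # point source location

# Ideal shadowgram: cyclic convolution of the source with the mask,
# plus Poisson counting noise on a small flat background.
shadow = np.real(np.fft.ifft2(np.fft.fft2(src) * np.fft.fft2(M)))
shadow = rng.poisson(np.clip(shadow, 0, None) + 5.0)

# Decode by cyclic cross-correlation with G (via FFT).
image = np.real(np.fft.ifft2(np.fft.fft2(shadow) * np.conj(np.fft.fft2(G))))
print(np.unravel_index(np.argmax(image), image.shape))   # ~ (4, 7)
```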

  20. Prototyping with your hands: the many roles of gesture in the communication of design concepts

    DEFF Research Database (Denmark)

    Cash, Philip; Maier, Anja

    2016-01-01

    There is an on-going focus on exploring the use of gesture in design situations; however, there are still significant questions as to how gesture is related to the understanding and communication of design concepts. This work explores the use of gesture through observing and video-coding four teams of ...

  1. Concept analysis of moral courage in nursing: A hybrid model.

    Science.gov (United States)

    Sadooghiasl, Afsaneh; Parvizy, Soroor; Ebadi, Abbas

    2018-02-01

    Moral courage is one of the most fundamental virtues in the nursing profession; however, little attention has been paid to it. As a result, no exact and clear definition of moral courage has ever been accessible. This study was carried out for the purposes of defining and clarifying the concept in the nursing profession. The study used a hybrid model of concept analysis comprising three phases, namely, a theoretical phase, a field work phase, and a final analysis phase. To find relevant literature, an electronic search of valid databases was carried out using keywords related to the concept of courage. Field work data were collected over an 11-month period from 2013 to 2014. In the field work phase, in-depth interviews were performed with 10 nurses. Conventional content analysis was used in both the theoretical and field work phases using Graneheim and Lundman's stages, and the results were combined in the final analysis phase. Ethical consideration: Permission for this study was obtained from the ethics committee of Tehran University of Medical Sciences. Oral and written informed consent was received from the participants. Of the 750 titles retrieved in the theoretical phase, 26 texts were analyzed. The analysis resulted in 494 codes from the text analysis and 226 codes from the interview analysis. The literature review in the theoretical phase revealed two features of the concept: inherent-transcendental characteristics and a difficult nature. The field work phase added moral self-actualization, rationalism, spiritual beliefs, and scientific-professional qualifications to the features of the concept. Moral courage is a pure and prominent characteristic of human beings. The antecedents of moral courage include model orientation, model acceptance, rationalism, individual excellence, acquiring academic and professional qualification, spiritual beliefs, organizational support, organizational repression, and internal and external personal barriers

  2. GPU-accelerated 3D neutron diffusion code based on finite difference method

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Q.; Yu, G.; Wang, K. [Dept. of Engineering Physics, Tsinghua Univ. (China)

    2012-07-01

    The finite difference method, a traditional numerical solution to the neutron diffusion equation, although considered simpler and more precise than coarse-mesh nodal methods, faces a bottleneck to wide application because of the huge memory and prohibitive computation time it requires. In recent years, the concept of general-purpose computation on GPUs has provided us with a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, the HYPRE (High Performance Preconditioners)-based diffusion code and CITATION, were used as counterpoints to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using NVIDIA's GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was sped up by using the SOR method and the Chebyshev extrapolation technique. (authors)
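
    As a minimal illustration of the finite-difference solution that such a code accelerates, the sketch below performs a one-group, 3D power iteration on a homogeneous cube with a 7-point stencil, written in plain Python/NumPy on the CPU; the cross sections, mesh size and iteration counts are illustrative assumptions, and this is not the 3DFD-CPU/GPU implementation.

    import numpy as np

    # One-group, homogeneous bare cube, zero flux assumed just outside the mesh.
    # Illustrative cross sections, not benchmark data.
    D, sig_a, nu_sig_f = 1.0, 0.02, 0.025   # cm, 1/cm, 1/cm
    n, h = 20, 1.0                          # mesh points per axis, mesh size (cm)

    phi, k = np.ones((n, n, n)), 1.0

    for outer in range(50):
        src = nu_sig_f * phi / k
        # A few Jacobi sweeps (partially converged inner solve, as is common) of the
        # 7-point stencil: (6D/h^2 + sig_a) phi = src + (D/h^2) * sum of six neighbours.
        for inner in range(40):
            nb = np.zeros_like(phi)
            nb[1:, :, :]  += phi[:-1, :, :]
            nb[:-1, :, :] += phi[1:, :, :]
            nb[:, 1:, :]  += phi[:, :-1, :]
            nb[:, :-1, :] += phi[:, 1:, :]
            nb[:, :, 1:]  += phi[:, :, :-1]
            nb[:, :, :-1] += phi[:, :, 1:]
            phi = (src + D / h**2 * nb) / (6.0 * D / h**2 + sig_a)
        # Eigenvalue update: fission production over the source fed into the solve.
        k_new = (nu_sig_f * phi).sum() / src.sum()
        if abs(k_new - k) < 1e-6:
            k = k_new
            break
        k = k_new

    print(f"k_eff ~ {k:.5f} after {outer + 1} outer iterations")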

  3. GPU-accelerated 3D neutron diffusion code based on finite difference method

    International Nuclear Information System (INIS)

    Xu, Q.; Yu, G.; Wang, K.

    2012-01-01

    The finite difference method, a traditional numerical solution to the neutron diffusion equation, although considered simpler and more precise than coarse-mesh nodal methods, faces a bottleneck to wide application because of the huge memory and prohibitive computation time it requires. In recent years, the concept of general-purpose computation on GPUs has provided us with a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, the HYPRE (High Performance Preconditioners)-based diffusion code and CITATION, were used as counterpoints to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using NVIDIA's GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was sped up by using the SOR method and the Chebyshev extrapolation technique. (authors)

  4. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  5. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage requirements; and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring book which can be updated by an organised (updating) service. (author)
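
    The sketch below illustrates, in Python, the general idea of such a code key: coded records are stored compactly and translated back into plain language on retrieval, and the key can be listed by code or by definition. The codes and definitions shown are invented examples, not actual NAGRADATA entries.

    # Illustrative code key: invented geological codes and definitions.
    CODE_KEY = {
        "LITH01": "sandstone",
        "LITH02": "claystone",
        "STRUC01": "fault zone",
    }

    # Two of the listings described above: by code and by definition.
    by_code = dict(sorted(CODE_KEY.items()))
    by_definition = {v: k for k, v in sorted(CODE_KEY.items(), key=lambda kv: kv[1])}

    def decode(record_codes):
        """Translate a stored coded record back into plain language."""
        return [CODE_KEY.get(code, f"<unknown code {code}>") for code in record_codes]

    print(decode(["LITH02", "STRUC01"]))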

  6. An Analysis of the Changes in Communication Techniques in the Italian Codes of Medical Deontology.

    Science.gov (United States)

    Conti, Andrea Alberto

    2017-04-28

    The code of deontology of the Italian National Federation of the Colleges of Physicians, Surgeons and Dentists (FNOMCeO) contains the principles and rules to which the professional medical practitioner must adhere. This work identifies and analyzes the medical-linguistic choices and the expressive techniques present in the different editions of the code, and evaluates their purpose and function, focusing on the first appearance and the subsequent frequency of key terms. Various aspects of the formal and expressive revisions of the eight editions of the Codes of Medical Deontology published after the Second World War (from 1947/48 to 2014) are here presented, starting from a brief comparison with the first edition of 1903. Formal characteristics, choices of medical terminology and the introduction of new concepts and communicative attitudes are here identified and evaluated. This paper, in presenting a quantitative and epistemological analysis of variations, modifications and confirmations in the different editions of the Italian code of medical deontology over the last century, enucleates and demonstrates the dynamic paradigm of changing attitudes in the medical profession. This analysis shows the evolution in medical-scientific communication as embodied in the Italian code of medical deontology. This code, in its adoption, changes and adaptations, as evidenced in its successive editions, bears witness to the expressions and attitudes pertinent to and characteristic of the deontological stance of the medical profession during the twentieth century.

  7. A 2D benchmark for the verification of the PEBBED code

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Gougar, Hans D.; Ougouag, Abderrafi M.

    2008-01-01

    A new benchmarking concept is presented for verifying the PEBBED 3D multigroup finite difference/nodal diffusion code with application to pebble bed modular reactors (PBMRs). The key idea is to perform convergence acceleration, also called extrapolation to zero discretization, of a basic finite difference numerical algorithm to give extremely high accuracy. The method is first demonstrated on a 1D cylindrical shell and then on an r,Θ wedge where the order of the second order finite difference scheme is confirmed to four places. (authors)
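
    The extrapolation-to-zero-discretization idea can be sketched on a much simpler problem: solve it on two meshes with a second-order finite difference scheme and Richardson-extrapolate the results. The Python example below uses an illustrative 1D two-point boundary value problem, not the PEBBED benchmark itself.

    import numpy as np

    def solve_bvp(n):
        """Solve -u'' = pi^2 sin(pi x), u(0)=u(1)=0, with n interior points (2nd-order FD).
        Return the computed value at x = 0.5 (exact solution there: sin(pi/2) = 1) and h."""
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)
        A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
        return np.interp(0.5, x, u), h

    # Solve on two successively refined meshes and extrapolate to h -> 0.
    (u1, h1), (u2, h2) = solve_bvp(19), solve_bvp(39)
    p = 2.0                                   # expected order of the scheme
    u_extrap = u2 + (u2 - u1) / ((h1 / h2)**p - 1.0)

    print(f"coarse error       {abs(u1 - 1.0):.2e}")
    print(f"fine error         {abs(u2 - 1.0):.2e}")
    print(f"extrapolated error {abs(u_extrap - 1.0):.2e}")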

  8. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  9. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  10. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  11. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  12. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
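
    For orientation, the sketch below implements the classical Blahut-Arimoto iteration for the ordinary rate-distortion function in Python/NumPy; it does not include the action-dependent side information treated in the paper, and the binary-source example at the end is illustrative.

    import numpy as np

    def blahut_arimoto_rd(p_x, d, beta, n_iter=200, tol=1e-9):
        """Classical Blahut-Arimoto iteration for the rate-distortion function.
        p_x: source distribution (nx,); d: distortion matrix d[x, y] (nx, ny);
        beta: Lagrange multiplier (larger beta -> lower distortion).
        Returns (rate in bits, expected distortion)."""
        nx, ny = d.shape
        q_y = np.full(ny, 1.0 / ny)               # output marginal, initialised uniform
        for _ in range(n_iter):
            # Conditional p(y|x) minimising R + beta*D for the current marginal.
            w = q_y[None, :] * np.exp(-beta * d)
            p_y_given_x = w / w.sum(axis=1, keepdims=True)
            q_new = p_x @ p_y_given_x              # updated output marginal
            converged = np.max(np.abs(q_new - q_y)) < tol
            q_y = q_new
            if converged:
                break
        distortion = float(np.sum(p_x[:, None] * p_y_given_x * d))
        ratio = np.where(p_y_given_x > 0, p_y_given_x / q_y[None, :], 1.0)
        rate = float(np.sum(p_x[:, None] * p_y_given_x * np.log2(ratio)))
        return rate, distortion

    # Binary source with Hamming distortion: R(D) should approach 1 - H(D).
    p_x = np.array([0.5, 0.5])
    d = np.array([[0.0, 1.0], [1.0, 0.0]])
    print(blahut_arimoto_rd(p_x, d, beta=3.0))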

  13. Towards a concept of Communicative Competence in Health: a qualitative study in medical residents

    Directory of Open Access Journals (Sweden)

    Rodolfo A. Cabrales

    2015-06-01

    Despite the wealth of literature surrounding the importance of effective communication in clinical practice, there is a dearth of consensus in the literature on what communicative competence in health (CCH) is and on the practices of meaningful health communication. Seventeen (17) residents were invited to share their thoughts on the concept of communicative competence in health and on communication-related difficulties they encounter during their clinical practice. The aim of this study was to gain a better understanding of CCH, with emphasis on the implications for the medical curriculum, teaching, learning and assessment. Three focus group discussions were conducted with the clinical supervisor. The discussions were audio-taped, transcribed verbatim and analyzed using principles from grounded theory for qualitative data analysis. The 135 open codes and the defined axial codes were discussed, and a number of conceptual frameworks were utilized to disentangle the concept of CCH. The focus group themes related to the concept of communication in health, its importance and difficulties, and the role of the physician and health personnel. The participants felt their own training did not prepare them to establish effective communication with patients and relatives. Some barriers include lack of time and the lack of institutional priority given to communication issues. The techniques originating from grounded theory permitted the definition of a broader concept of CCH with the following three specific scopes: the biological perspective (objective world), the social (social world) and the subjective world (expressive-aesthetic). This new concept of CCH is central to understanding how the health communication process occurs, where a myriad of interrelated individual (physician, patient, staff, relatives), organizational and societal factors influence health decisions and practice. These components need to be addressed by medicine schools, health institutions and other stakeholders in

  14. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (or QR Code for short) as a less complex and cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture is presented in order to achieve tag generation, image acquisition and pre-processing, product inventory and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.
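
    A minimal sketch of the tag-generation step is shown below, assuming the third-party Python package qrcode (pip install qrcode[pil]); the JSON payload fields and identifiers are illustrative assumptions, not the architecture's actual data format.

    import json
    import qrcode   # third-party package, assumed installed

    def make_tag(item_id, batch, origin, out_path):
        """Encode a small JSON payload describing a supply-chain item into a QR tag image."""
        payload = json.dumps({"item": item_id, "batch": batch, "origin": origin})
        img = qrcode.make(payload)          # returns a PIL image of the QR symbol
        img.save(out_path)
        return payload

    # Hypothetical item identifiers, for illustration only.
    print(make_tag("SKU-0042", "B2012-07", "plant-3", "sku0042.png"))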

  15. Mathematical fundamentals for the noise immunity of the genetic code.

    Science.gov (United States)

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are also some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular basis - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows the genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error-detection and error-correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different numbers of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of
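
    The degeneracy discussed here can be tabulated directly from the standard genetic code (NCBI translation table 1), as in the short Python sketch below; it simply counts how many codons map to each amino acid and is meant only as an illustration of the property, not as part of the mathematical models in the chapter.

    from collections import defaultdict
    from itertools import product

    # Standard genetic code (NCBI translation table 1), bases ordered T, C, A, G;
    # '*' marks stop codons.
    BASES = "TCAG"
    AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODON_TABLE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AMINO)}

    # Degeneracy: how many codons encode each amino acid (or stop).
    degeneracy = defaultdict(list)
    for codon, aa in CODON_TABLE.items():
        degeneracy[aa].append(codon)

    for aa in sorted(degeneracy, key=lambda a: len(degeneracy[a]), reverse=True):
        print(f"{aa}: {len(degeneracy[aa])} codons  {' '.join(sorted(degeneracy[aa]))}")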

  16. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  17. Future direction of ASME nuclear codes and standards

    International Nuclear Information System (INIS)

    Ennis, Kevin; Sheehan, Mark E.

    2003-01-01

    While the nuclear power industry in the US is in a period of stasis, there continues to be a great deal of activity in the ASME nuclear standards development arena. As plants age, the need for new approaches in standardization changes with the changing needs of the industry. New tools are becoming available in the form of risk analysis, and this is finding its way into more and more of ASME's standards activities. This paper will take a look at the direction that ASME nuclear Codes and Standards are heading in this and other areas, as well as taking a look at some advance reactor concepts and plans for standards to address new technologies

  18. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), from the Department of School and Learning at Professionshøjskolen Metropol, and by Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period from November 2016 to May 2017...

  19. Fast-neutron, coded-aperture imager

    International Nuclear Information System (INIS)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-01-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led

  20. Fast-neutron, coded-aperture imager

    Energy Technology Data Exchange (ETDEWEB)

    Woolf, Richard S., E-mail: richard.woolf@nrl.navy.mil; Phlips, Bernard F., E-mail: bernard.phlips@nrl.navy.mil; Hutcheson, Anthony L., E-mail: anthony.hutcheson@nrl.navy.mil; Wulf, Eric A., E-mail: eric.wulf@nrl.navy.mil

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led

  1. The Analysis of SBWR Critical Power Bundle Using Cobrag Code

    Directory of Open Access Journals (Sweden)

    Yohannes Sardjono

    2013-03-01

    The coolant mechanism of the SBWR is similar to that of the Dodewaard Nuclear Power Plant (NPP) in the Netherlands, which first went critical in 1968. Both NPPs are cooled by natural convection. This coolant concept is closely tied to several fuel bundle design parameters, especially fuel bundle length, core pressure drop and core flow rate, as well as critical bundle power. The analysis was carried out using the COBRAG computer code, which is proprietary to the GE Company. Basically, the COBRAG computer code is a tool to solve the compressible, three-dimensional, two-fluid, three-field equations for two-phase flow. The three fields are the vapor field, the continuous liquid field, and the liquid drop field. The code has been applied to model flow and heat transfer within the reactor core. This volume describes the finite-volume equations and the numerical solution methods used to solve these equations. The analysis was performed for the same parameters, i.e. inlet subcooling of 20 BTU/lbm and 40 BTU/lbm, a pressure of 1000 psi, an R-factor of 1.038, and mass fluxes of 0.5 Mlb/hr.ft2, 0.75 Mlb/hr.ft2, 1.00 Mlb/hr.ft2 and 1.25 Mlb/hr.ft2. These conditions are based on the operating history of a cell fuel bundle line at GE Nuclear Energy. According to the results, it can be concluded that the SBWR critical bundle power is 10.5% lower than the current BWR critical bundle power, with the bundle length reduced from 12 ft to 9 ft.

  2. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the extensible Markup Language (XML) that facilitates the integration in current information systems or coding software taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
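
    A minimal sketch of such an XML representation, using Python's standard library, is shown below; the element and attribute names are illustrative assumptions and do not follow the actual CEN/TC 251 or WHO ICD-10 schema.

    import xml.etree.ElementTree as ET

    # Illustrative element/attribute names; not the actual CEN/TC 251 or ICD-10 schema.
    root = ET.Element("classification", name="ICD-10", language="en")
    chapter = ET.SubElement(root, "class", code="IX", kind="chapter")
    ET.SubElement(chapter, "rubric").text = "Diseases of the circulatory system"
    block = ET.SubElement(chapter, "class", code="I20-I25", kind="block")
    ET.SubElement(block, "rubric").text = "Ischaemic heart diseases"
    category = ET.SubElement(block, "class", code="I21", kind="category")
    ET.SubElement(category, "rubric").text = "Acute myocardial infarction"

    print(ET.tostring(root, encoding="unicode"))

    # Simple retrieval: find the rubric for a given code anywhere in the hierarchy.
    for cls in root.iter("class"):
        if cls.get("code") == "I21":
            print(cls.find("rubric").text)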

  3. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P_N scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k_eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes
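
    As a toy illustration of the principle behind a Monte Carlo criticality estimate (and nothing like KENO-V's multigroup tracking), the Python sketch below follows analog neutron histories in a one-group infinite medium and estimates the multiplication per generation; the cross sections are invented.

    import random

    random.seed(7)

    # One-group, infinite homogeneous medium: illustrative macroscopic cross sections.
    SIG_S, SIG_C, SIG_F = 0.30, 0.04, 0.06     # scattering, capture, fission (1/cm)
    SIG_T = SIG_S + SIG_C + SIG_F
    NU = 2.43                                  # mean fission neutrons per fission

    def follow_neutron():
        """Follow one neutron until absorption; return the fission neutrons it produces."""
        while True:
            r = random.random() * SIG_T
            if r < SIG_S:
                continue                       # scattering: keep following
            elif r < SIG_S + SIG_C:
                return 0                       # capture: history ends
            else:
                # fission: sample an integer number of secondaries with mean NU
                n = int(NU)
                if random.random() < NU - int(NU):
                    n += 1
                return n

    histories = 200_000
    k_inf = sum(follow_neutron() for _ in range(histories)) / histories
    print(f"Monte Carlo k_inf ~ {k_inf:.4f}  (analytic: {NU * SIG_F / (SIG_C + SIG_F):.4f})")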

  4. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  5. Establishment of joint application system of safety analysis codes between Korea and Vietnam

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Park, Cheol; Bae, Sung Won; Baek, Won Pil; Song, Cheol hwa; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Lee, Chang Sup

    2011-04-01

    The following KAERI-VAEI collaboration works have been performed during the 2-year project ('09.4∼'11.4): 1) on-the-job training of Vietnamese code users (first training: 4 VAEI staff, 3 months; second training: 3 VAEI staff, 3 months); 2) lectures on nuclear safety analysis (30-hr basic course and 30-hr advanced course); 3) review of the safety analysis method (IAEA safety concept and requirements); 4) collaborative assessment of the safety analysis code MARS (13 conceptual problems, 2 separate effect test problems, 1 integral effect test problem); 5) input deck preparation for a standard PWR (preparation of the APR1400 input deck and safety analysis of DBAs). The VAEI staff have become familiar with Korean PWR safety assessment technology through the collaborative assessment work using a computer code developed in Korea. The lectures for Vietnamese researchers will contribute to the utilization and cultivation of Korean safety technology. The collaborative assessment work will be used for the establishment of a MARS-based safety analysis system that is independent of the US safety assessment system.

  6. Establishment of joint application system of safety analysis codes between Korea and Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Kim, Kyung Doo; Park, Cheol; Bae, Sung Won; Baek, Won Pil; Song, Cheol hwa; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Lee, Chang Sup [KAERI, Daejeon (Korea, Republic of)

    2011-04-15

    The following KAERI-VAEI collaboration works have been performed during the 2-year project ('09.4∼'11.4): 1) on-the-job training of Vietnamese code users (first training: 4 VAEI staff, 3 months; second training: 3 VAEI staff, 3 months); 2) lectures on nuclear safety analysis (30-hr basic course and 30-hr advanced course); 3) review of the safety analysis method (IAEA safety concept and requirements); 4) collaborative assessment of the safety analysis code MARS (13 conceptual problems, 2 separate effect test problems, 1 integral effect test problem); 5) input deck preparation for a standard PWR (preparation of the APR1400 input deck and safety analysis of DBAs). The VAEI staff have become familiar with Korean PWR safety assessment technology through the collaborative assessment work using a computer code developed in Korea. The lectures for Vietnamese researchers will contribute to the utilization and cultivation of Korean safety technology. The collaborative assessment work will be used for the establishment of a MARS-based safety analysis system that is independent of the US safety assessment system.

  7. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  8. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  9. Development of system of computer codes for severe accident analysis and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H S; Jeon, M H; Cho, N J. and others [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1992-01-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions intended for mitigation can lead to more disastrous results, and thus the uncertain severe accident phenomena must be well understood. There must be further research for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts.

  10. Development of system of computer codes for severe accident analysis and its applications

    International Nuclear Information System (INIS)

    Jang, H. S.; Jeon, M. H.; Cho, N. J. and others

    1992-01-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions intended for mitigation can lead to more disastrous results, and thus the uncertain severe accident phenomena must be well understood. There must be further research for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts

  11. Parallelization of a three-dimensional whole core transport code DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Jin Young, Cho; Han Gyu, Joo; Ha Yong, Kim; Moon-Hee, Chang [Korea Atomic Energy Research Institute, Yuseong-gu, Daejon (Korea, Republic of)

    2003-07-01

    Parallelization of the DeCART (deterministic core analysis based on ray tracing) code is presented; it reduces the tremendous computing time and memory required in three-dimensional whole-core transport calculations. The parallelization employs the concept of MPI grouping as well as an MPI/OpenMP mixed scheme. Since most of the computing time and memory are used in the MOC (method of characteristics) and multi-group CMFD (coarse mesh finite difference) calculations in DeCART, variables and subroutines related to these two modules are the primary targets for parallelization. Specifically, the ray tracing module was parallelized using a planar domain decomposition scheme and an angular domain decomposition scheme. The parallel performance of the DeCART code is evaluated by solving a rodded variation of the C5G7MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In the C5G7MOX problem with 24 CPUs, a maximum speedup of 21 is obtained on an IBM Regatta machine and 22 on a LINUX cluster in the MOC kernel, which indicates good parallel performance of the DeCART code. In the simplified SMART problem, the memory requirement of about 11 GBytes in the single-processor case reduces to 940 MBytes with 24 processors, which means that the DeCART code can now solve large core problems with affordable LINUX clusters. (authors)
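
    The planar domain decomposition mentioned above amounts to assigning contiguous blocks of axial planes to MPI ranks. The Python sketch below shows one way such an assignment might look; the plane and rank counts are illustrative and no actual MPI calls are made.

    def planes_for_rank(n_planes, n_ranks, rank):
        """Contiguous block of axial plane indices assigned to one rank,
        distributing any remainder one extra plane at a time to the lowest ranks."""
        base, extra = divmod(n_planes, n_ranks)
        start = rank * base + min(rank, extra)
        count = base + (1 if rank < extra else 0)
        return list(range(start, start + count))

    # Example: 24 axial planes split over 10 ranks (illustrative numbers).
    for r in range(10):
        print(f"rank {r:2d}: planes {planes_for_rank(24, 10, r)}")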

  12. CATHENA Analysis Of Candu Advanced Passive Moderator Concept In Normal Operation Condition

    International Nuclear Information System (INIS)

    Alfa, Sudjatmi K

    2001-01-01

    In the CANDU advanced passive moderator (APM) concept, the positive void reactivity is eliminated by reducing the density of the moderator. The simple model for the CANDU APM concept consists of the calandria, a heat exchanger, a pump, and a stabilizing tank, along with connecting piping. The calandria is divided into two parts: one part simulates the downflow area, while the other simulates the upflow area. To demonstrate the thermalhydraulic behavior of the APM concept, the Canadian algorithm for thermalhydraulic network analysis (CATHENA) code is used. Simulations for pressure boundary conditions of 300, 330 and 360 kPa and for water coolant mass flow rate boundary conditions of 2000 and 3000 kg/s, respectively, have been studied. Preliminary results show that there is boiling in the core, with vapor condensing in the heat exchanger. It is important to note that the solution had not reached steady state when the boiling occurred

  13. Numerical solution of the thermalhydraulic conservation equations from fundamental concepts to multidimensional two-fluid analysis

    International Nuclear Information System (INIS)

    Carver, M.B.

    1995-08-01

    The discussion briefly establishes some requisite concepts of differential equation theory, and applies these to describe methods for numerical solution of the thermalhydraulic conservation equations in their various forms. The intent is to cover the general methodology without obscuring the principles with details. As a short overview of computational thermalhydraulics, the material provides an introductory foundation, so that those working on the application of thermalhydraulic codes can begin to understand the many intricacies involved without having to locate and read the references given. Those intending to work in code development will need to read and understand all the references. (author). 49 refs
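
    As a minimal concrete instance of the discretization ideas surveyed here, the Python sketch below advances a 1D scalar conservation law (linear advection) with a first-order upwind finite-volume scheme; it is an illustration only, not an excerpt from any thermalhydraulic code.

    import numpy as np

    def upwind_advection(u0, a, dx, dt, n_steps):
        """First-order upwind finite-volume scheme for u_t + a u_x = 0 (a > 0),
        with periodic boundaries; the CFL number a*dt/dx must not exceed 1."""
        u = u0.copy()
        assert 0.0 < a * dt / dx <= 1.0, "CFL condition violated"
        for _ in range(n_steps):
            flux = a * u                         # upwind flux at each cell's right face
            u = u - (dt / dx) * (flux - np.roll(flux, 1))
        return u

    # Advect a square pulse once around a periodic domain of unit length.
    n = 200
    x = (np.arange(n) + 0.5) / n
    u0 = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)
    u1 = upwind_advection(u0, a=1.0, dx=1.0 / n, dt=0.8 / n, n_steps=int(n / 0.8))
    print("mass conserved:", np.isclose(u0.sum(), u1.sum()))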

  14. Codes maintained by the LAACG [Los Alamos Accelerator Code Group] at the NMFECC

    International Nuclear Information System (INIS)

    Wallace, R.; Barts, T.

    1990-01-01

    The Los Alamos Accelerator Code Group (LAACG) maintains two groups of design codes at the National Magnetic Fusion Energy Computing Center (NMFECC). These codes, principally electromagnetic field solvers, are used for the analysis and design of electromagnetic components for accelerators, e.g., magnets, rf structures, pickups, etc. In this paper, the status and future of the installed codes will be discussed with emphasis on an experimental version of one set of codes, POISSON/SUPERFISH

  15. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
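
    The sketch below illustrates only the concatenation ordering (outer encode, inner encode, channel, inner decode, outer check) with deliberately toy codes: a repetition-3 inner code and a single parity byte as the outer code. The actual system described above uses a unit-memory convolutional inner code and a Reed-Solomon outer code, which are not implemented here.

    import random

    random.seed(3)

    # --- toy outer code: append one parity byte (XOR of all data bytes) ---
    def outer_encode(data):
        parity = 0
        for b in data:
            parity ^= b
        return data + bytes([parity])

    def outer_check(codeword):
        parity = 0
        for b in codeword:
            parity ^= b
        return parity == 0          # True if no detected error remains

    # --- toy inner code: repetition-3 per bit with majority-vote decoding ---
    def inner_encode(data):
        bits = [(byte >> i) & 1 for byte in data for i in range(8)]
        return [b for bit in bits for b in (bit, bit, bit)]

    def inner_decode(chips):
        bits = [1 if sum(chips[i:i + 3]) >= 2 else 0 for i in range(0, len(chips), 3)]
        out = bytearray()
        for i in range(0, len(bits), 8):
            out.append(sum(b << j for j, b in enumerate(bits[i:i + 8])))
        return bytes(out)

    def channel(chips, flip_prob=0.01):
        return [c ^ 1 if random.random() < flip_prob else c for c in chips]

    message = b"concatenated"
    received = inner_decode(channel(inner_encode(outer_encode(message))))
    print(received[:-1], "outer parity ok:", outer_check(received))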

  16. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    Science.gov (United States)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  17. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fueling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a method to increase SMBI fueling efficiency in H-mode plasmas, especially for ITER, it is important first to verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully achieved for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of the BOUT++ code and the TPSMBI code agree very well with each other. Different upwind schemes have been compared for handling the sharp-gradient front region during the inward propagation of SMBI, for the sake of code stability. The influence of the WENO3 (weighted essentially non-oscillatory) and third-order upwind schemes on the benchmark results is also discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and third-order upwind schemes on the benchmark results is also discussed.
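
    For reference, the Python sketch below shows a third-order WENO (WENO3) reconstruction at a cell face of the kind compared in the benchmark for handling the sharp SMBI front; it is a generic textbook formulation, not code from BOUT++ or TPSMBI.

    import numpy as np

    def weno3_face_value(um1, u0, up1, eps=1e-6):
        """Third-order WENO reconstruction of u at the right face of the centre cell
        (left-biased, appropriate for a flux moving in the +x direction)."""
        # Two candidate second-order stencils
        p0 = 1.5 * u0 - 0.5 * um1          # stencil {i-1, i}
        p1 = 0.5 * u0 + 0.5 * up1          # stencil {i, i+1}
        # Smoothness indicators and nonlinear weights (linear weights 1/3 and 2/3)
        b0, b1 = (u0 - um1) ** 2, (up1 - u0) ** 2
        a0, a1 = (1.0 / 3.0) / (eps + b0) ** 2, (2.0 / 3.0) / (eps + b1) ** 2
        return (a0 * p0 + a1 * p1) / (a0 + a1)

    # Near a sharp front the weights lean on the smoother stencil,
    # suppressing the overshoot a plain linear third-order stencil would produce.
    u = np.array([1.0, 1.0, 1.0, 0.0, 0.0])          # step profile
    print(weno3_face_value(u[1], u[2], u[3]))        # face next to the jump
    print(weno3_face_value(u[0], u[1], u[2]))        # face in the smooth region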

  18. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
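
    As a small worked example of the linear block codes the book introduces, the Python sketch below encodes and decodes the (7,4) Hamming code in systematic form, correcting a single flipped bit via the syndrome.

    import numpy as np

    # Generator and parity-check matrices of the (7,4) Hamming code (systematic form).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(msg):
        return msg @ G % 2

    def decode(word):
        """Correct at most one bit error using the syndrome, then return the data bits."""
        syndrome = H @ word % 2
        if syndrome.any():
            # The syndrome equals the column of H corresponding to the flipped bit.
            error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
            word = word.copy()
            word[error_pos] ^= 1
        return word[:4]

    msg = np.array([1, 0, 1, 1])
    corrupted = encode(msg).copy()
    corrupted[5] ^= 1                      # flip one bit
    print("decoded:", decode(corrupted), "matches:", np.array_equal(decode(corrupted), msg))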

  19. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct
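
    The Python sketch below builds one commonly cited prime-code construction (a single pulse per block of p chips, at offset i·j mod p in block j) and measures its correlation properties empirically; the construction details are given as an assumption for illustration and may differ from the exact code families treated in the book.

    import numpy as np

    def prime_code_family(p):
        """Binary prime-code family over GF(p): codeword i has one pulse in each of the
        p blocks of p chips, at offset (i*j mod p) in block j. Length p*p, weight p.
        (A common textbook construction, given here as an illustrative sketch.)"""
        codes = np.zeros((p, p * p), dtype=int)
        for i in range(p):
            for j in range(p):
                codes[i, j * p + (i * j) % p] = 1
        return codes

    p = 5
    codes = prime_code_family(p)

    # Empirically check the periodic cross-correlation between distinct codewords.
    def max_cross_correlation(a, b):
        return max(int(np.dot(a, np.roll(b, s))) for s in range(len(a)))

    worst = max(max_cross_correlation(codes[i], codes[k])
                for i in range(p) for k in range(p) if i != k)
    print("weight per codeword:", codes.sum(axis=1))
    print("max periodic cross-correlation between distinct codewords:", worst)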

  20. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  1. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  2. The Lawless Frontier of Deep Space: Code as Law in EVE Online

    Directory of Open Access Journals (Sweden)

    Melissa de Zwart

    2014-03-01

    This article explores the concepts of player agency with respect to governance and regulation of online games. It considers the unique example of the Council of Stellar Management in EVE Online, and explores the multifaceted role performed by players involved in that Council. In particular, it considers the interaction between code, rules, contracts, and play with respect to EVE Online. This is used as a means to better understand the relations of power generated in game spaces.

  3. Calculation methods for advanced concept light water reactor lattices

    International Nuclear Information System (INIS)

    Carmona, S.

    1986-01-01

    In the last few years, several advanced concepts for fuel rod lattices have been studied. Improved fuel utilization is one of the major aims in the development of new fuel rod designs and lattice modifications. By these changes, better performance in fuel economics, fuel burnup, and material endurance can be achieved within the frame of the well-known basic Light Water Reactor technology. Among the new concepts involved in these studies that have attracted serious attention are lattices consisting of arrays of annular rods, duplex pellet rods, or tight multicells. These new designs of fuel rods and lattices present several computational problems. The treatment of resonance-shielded cross sections is a crucial point in the analyses of these advanced concepts. The purpose of this study was to assess adequate approximation methods for calculating, as accurately as possible, resonance shielding for these new lattices. Although detailed and exact computational methods for the evaluation of the resonance shielding in these lattices are possible, they are quite inefficient when used in lattice codes. The computer time and memory required for this kind of computation are too large for acceptable routine use. In order to overcome these limitations and to make the analyses possible with reasonable use of computer resources, approximation methods are necessary. The usual approximation methods for the resonance energy regions used in routine lattice computer codes cannot adequately handle the evaluation of these new fuel rod lattices. The main contribution of the present work to advanced lattice concepts is the development of an equivalence principle for the calculation of resonance shielding in the annular fuel pellet zone of duplex pellets; the duplex pellet in this treatment consists of two fuel zones with the same absorber isotope in both regions. In the transition from a single duplex rod to an infinite array of this kind of fuel rods, the similarity of the

  4. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes, we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g., promoter regions), and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from
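
    To make the windowed-scan idea concrete, the sketch below (Python; hypothetical function names, and a deliberately simplified model that ignores the conservation weighting and local mutation-rate covariates described above) flags genomic windows whose mutation count is improbably high under a uniform Poisson background. It illustrates the general approach only and is not the authors' method.

```python
from collections import Counter
from math import exp, factorial

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via 1 minus the cumulative PMF up to k-1."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

def flag_hotspots(mutation_positions, genome_length, window=1000, alpha=1e-6):
    """Return (start, end, count) for windows whose mutation count is
    unexpectedly high under a uniform background model (0-based positions)."""
    counts = Counter(pos // window for pos in mutation_positions)
    background = len(mutation_positions) * window / genome_length  # expected count per window
    return [(w * window, (w + 1) * window, n)
            for w, n in counts.items()
            if poisson_sf(n, background) < alpha]

# Toy usage: the cluster of mutations near position 5000 is flagged as a hotspot.
positions = [5001, 5010, 5042, 5100, 5230, 5555, 5720, 12000, 340000]
print(flag_hotspots(positions, genome_length=1_000_000))
```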

  5. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  6. Second-order statistics of colour codes modulate transformations that effectuate varying degrees of scene invariance and illumination invariance.

    Science.gov (United States)

    Mausfeld, Rainer; Andres, Johannes

    2002-01-01

    We argue, from an ethology-inspired perspective, that the internal concepts 'surface colours' and 'illumination colours' are part of the data format of two different representational primitives. Thus, the internal concept of 'colour' is not a unitary one but rather refers to two different types of 'data structure', each with its own proprietary types of parameters and relations. The relation of these representational structures is modulated by a class of parameterised transformations whose effects are mirrored in the idealised computational achievements of illumination invariance of colour codes, on the one hand, and scene invariance, on the other hand. Because the same characteristics of a light array reaching the eye can be physically produced in many different ways, the visual system, then, has to make an 'inference' whether a chromatic deviation of the space-averaged colour codes from the neutral point is due to a 'non-normal', ie chromatic, illumination or due to an imbalanced spectral reflectance composition. We provide evidence that the visual system uses second-order statistics of chromatic codes of a single view of a scene in order to modulate corresponding transformations. In our experiments we used centre surround configurations with inhomogeneous surrounds given by a random structure of overlapping circles, referred to as Seurat configurations. Each family of surrounds has a fixed space-average of colour codes, but differs with respect to the covariance matrix of colour codes of pixels that defines the chromatic variance along some chromatic axis and the covariance between luminance and chromatic channels. We found that dominant wavelengths of red-green equilibrium settings of the infield exhibited a stable and strong dependence on the chromatic variance of the surround. High variances resulted in a tendency towards 'scene invariance', low variances in a tendency towards 'illumination invariance' of the infield.
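
    As an illustration of the second-order statistics involved (not the authors' stimulus code; the colour axes and values below are assumptions), the following sketch computes the two quantities manipulated in the experiments: the space-averaged colour code of a surround, which is held fixed within a family, and the covariance matrix of per-pixel codes, which carries the chromatic variance and the luminance-chroma covariance.

```python
import numpy as np

def second_order_stats(pixels):
    """pixels: (N, 3) array of per-pixel colour codes, e.g. (luminance, L-M, S-(L+M)) axes.
    Returns the space-averaged colour code and the 3x3 covariance matrix."""
    pixels = np.asarray(pixels, dtype=float)
    mean = pixels.mean(axis=0)           # space average, held fixed across surround families
    cov = np.cov(pixels, rowvar=False)   # chromatic variances and luminance-chroma covariances
    return mean, cov

# Toy "Seurat"-like surround: fixed mean, high variance along the L-M (red-green) axis.
rng = np.random.default_rng(0)
base = np.array([50.0, 0.0, 0.0])
surround = base + rng.normal(scale=[5.0, 8.0, 1.0], size=(500, 3))
mean, cov = second_order_stats(surround)
print(mean.round(2))
print(cov.round(2))
```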

  7. Learning Illustrated: An Exploratory Cross-Sectional Drawing Analysis of Students' Conceptions of Learning

    Science.gov (United States)

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2018-01-01

    Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…

  8. Examination of concept of next generation computer. Progress report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio

    2000-12-01

    The Center for Promotion of Computational Science and Engineering has conducted R and D work on parallel processing technology and started the examination of the next generation computer in 1999. This report describes behavior analyses of quantum calculation codes. It also describes the considerations drawn from these analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)

  9. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas Current Procedural Terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.
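
    The complementary roles of the two code sets can be pictured as a simple data structure: each claim line carries a procedure code (what was done) and a diagnosis code (why it was done). The sketch below is purely illustrative; the class and the placeholder code values are hypothetical and are not real CPT/CDT or ICD-9-CM codes.

```python
from dataclasses import dataclass

@dataclass
class ClaimLine:
    """One service line on a claim: the procedure code states what was done,
    the diagnosis code states the medical reason ('medical necessity')."""
    procedure_system: str   # "CPT" or "CDT"
    procedure_code: str     # placeholder value, not a real code
    diagnosis_system: str   # "ICD-9-CM"
    diagnosis_code: str     # placeholder value, not a real code

    def is_complete(self) -> bool:
        # A payer generally needs both halves to adjudicate the line.
        return bool(self.procedure_code) and bool(self.diagnosis_code)

line = ClaimLine("CPT", "XXXXX", "ICD-9-CM", "XXX.X")
print(line.is_complete())
```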

  10. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.
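
    A rough feel for how a spatial jamming fraction erodes slotted-ALOHA throughput can be had from a toy Monte Carlo simulation such as the one below. It assumes a greatly simplified model (fixed per-slot transmit probability, a jammer that silently destroys any packet from a covered user) and is not the analysis of the cited work; all parameter values are arbitrary.

```python
import random

def aloha_throughput(n_users=50, p_tx=0.02, jam_fraction=0.3,
                     n_slots=100_000, seed=1):
    """Toy slotted-ALOHA simulation: a slot succeeds if exactly one user
    transmits and that user lies outside the jammed fraction of the topology."""
    rng = random.Random(seed)
    jammed = set(range(int(jam_fraction * n_users)))  # jammer covers a spatial fraction of users
    successes = 0
    for _ in range(n_slots):
        transmitters = [u for u in range(n_users) if rng.random() < p_tx]
        if len(transmitters) == 1 and transmitters[0] not in jammed:
            successes += 1
    return successes / n_slots

print(f"throughput ~ {aloha_throughput():.3f} packets/slot")
```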

  11. Comparative study of design of piping supports class 1, 2 and 3 considering german code KTA and ASME III - NF

    International Nuclear Information System (INIS)

    Faloppa, Altair A.; Fainer, Gerson; Mattar Neto, Miguel; Elias, Marcos V.

    2013-01-01

    The objective of this paper is to develop a comparative study of the design criteria for class 1, 2, and 3 piping supports, considering the American code ASME Section III - NF and the German codes KTA 3205.1 for the primary circuit, KTA 3205.2 for the other systems, and KTA 3205.3 for series-production standard supports of a PWR nuclear power plant. An additional purpose of the paper is a general analysis of the main design concepts of the American ASME Boiler and Pressure Vessel Code, Section III, Division 1, and of the German nuclear design code KTA, which was performed in order to support the proposed comparative study. The relevance of this study is to show the differences between the ASME and KTA codes, since they were applied in the design of the nuclear power plants Angra 1 and Angra 2 and in the design of Angra 3, which is at the moment under construction. Their use in the design of nuclear installations such as RMB - Reator MultiProposito Brasileiro and LABGENE - Laboratorio de Geracao Nucleoeletrica is also considered. (author)

  12. Toward semantic interoperability in home health care: formally representing OASIS items for integration into a concept-oriented terminology.

    Science.gov (United States)

    Choi, Jeungok; Jenkins, Melinda L; Cimino, James J; White, Thomas M; Bakken, Suzanne

    2005-01-01

    The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged based on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantic structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database.
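
    For readers unfamiliar with the LOINC semantic structure, its six elements (component, kind of property, time aspect, system, scale, method) can be represented directly as a small data structure. The dissection shown below is hypothetical and is not an actual OASIS-B1 mapping from the study; field values are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class LoincStyleName:
    """The six elements of a LOINC fully specified name."""
    component: str         # what is observed or measured
    kind_of_property: str  # kind of quantity, e.g. a finding
    time_aspect: str       # point in time vs. interval
    system: str            # who or what is observed
    scale: str             # ordinal, nominal, quantitative, ...
    method: str            # how the observation was obtained

# Hypothetical dissection of a home-care assessment item (illustrative only).
item = LoincStyleName(
    component="Ability to dress upper body",
    kind_of_property="Find",   # finding
    time_aspect="Pt",          # point in time
    system="^Patient",
    scale="Ord",               # ordinal response set
    method="OASIS-B1",
)
print(item)
```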

  13. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  14. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is the code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided as 64-bit builds for Mac, Linux, and Windows.

  15. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
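
    Concretely, a parity-check matrix H defines the code as the set of words c with Hc = 0 over GF(2); "low density" means H is sparse. The toy example below (a matrix far too small to be genuinely low density, and not taken from the article) checks candidate words against such a matrix.

```python
import numpy as np

# A small parity-check matrix over GF(2); each row is a parity constraint
# on the bit positions where it has a 1.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
])

def is_codeword(c):
    """c is a codeword of the code defined by H iff every parity check is satisfied."""
    return not np.any(H @ np.asarray(c) % 2)

print(is_codeword([0, 0, 0, 0, 0, 0]))  # True: the all-zero word is always a codeword
print(is_codeword([1, 1, 0, 0, 1, 1]))  # True: all three parity checks sum to an even value
print(is_codeword([1, 0, 0, 0, 0, 0]))  # False: two checks are violated
```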

  16. Facility Targeting, Protection and Mission Decision Making Using the VISAC Code

    Science.gov (United States)

    Morris, Robert H.; Sulfredge, C. David

    2011-01-01

    The Visual Interactive Site Analysis Code (VISAC) has been used by DTRA and several other agencies to aid in targeting facilities and to predict the associated collateral effects for the go/no-go mission decision-making process. VISAC integrates the three concepts of target geometric modeling, damage assessment capabilities, and an event/fault tree methodology for evaluating accident/incident consequences. It can analyze a variety of accidents/incidents at nuclear or industrial facilities, ranging from simple component sabotage to an attack with military or terrorist weapons. For nuclear facilities, VISAC predicts the facility damage, estimated downtime, and the amount and timing of any radionuclides released. Used in conjunction with DTRA's HPAC code, VISAC can also analyze transport and dispersion of the radionuclides, levels of contamination of the surrounding area, and the population at risk. VISAC has also been used by the NRC to aid in the development of protective measures for nuclear facilities that may be subjected to attacks by car/truck bombs.

  17. Adaptive Online Sequential ELM for Concept Drift Tackling

    Directory of Open Access Journals (Sweden)

    Arif Budiman

    2016-01-01

    Full Text Available A machine learning method needs to adapt to changes in the environment over time. Such changes are known as concept drift. In this paper, we propose a concept drift tackling method as an enhancement of the Online Sequential Extreme Learning Machine (OS-ELM) and Constructive Enhancement OS-ELM (CEOS-ELM) by adding adaptive capability for classification and regression problems. The scheme is named adaptive OS-ELM (AOS-ELM). It is a single-classifier scheme that works well to handle real drift, virtual drift, and hybrid drift. The AOS-ELM also works well for sudden drift and recurrent context change types. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value compared to the multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need an increase in hidden nodes, we address some issues related to increasing the hidden nodes, such as error conditions and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect an “underfitting” condition.
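
    The backbone of OS-ELM is a recursive least-squares update of the output weights as data chunks arrive, and the rank of the hidden-layer output matrix is the kind of quantity the authors relate to underfitting. The sketch below is a minimal, generic OS-ELM for regression, not the AOS-ELM implementation; the hidden-layer size, seed, and regularisation term are arbitrary choices made for illustration.

```python
import numpy as np

def hidden(X, W, b):
    """Random-feature hidden layer of an ELM: sigmoid(X W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

class OSELM:
    """Minimal Online Sequential ELM for regression (illustrative sketch)."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_hidden))  # fixed random input weights
        self.b = rng.normal(size=n_hidden)
        self.P = None
        self.beta = None

    def partial_fit(self, X, T):
        H = hidden(X, self.W, self.b)
        if self.beta is None:                        # initialization chunk
            self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
            self.beta = self.P @ H.T @ T
        else:                                        # recursive least-squares update
            K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
            self.P = self.P - self.P @ H.T @ K @ H @ self.P
            self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)
        # A low rank of H relative to the number of hidden nodes can signal underfitting.
        return np.linalg.matrix_rank(H)

    def predict(self, X):
        return hidden(X, self.W, self.b) @ self.beta

# Toy usage on a drifting 1-D regression target.
rng = np.random.default_rng(1)
model = OSELM(n_in=1, n_hidden=20)
for k in range(5):
    X = rng.uniform(-1, 1, size=(50, 1))
    T = np.sin(3 * X) + 0.1 * k          # the target drifts between chunks
    rank = model.partial_fit(X, T)
print(model.predict(np.array([[0.5]])).round(3), "rank:", rank)
```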

  18. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...
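
    The flavour of below-transport erasure correction can be conveyed with the simplest possible code: one XOR repair packet per block, which lets the receiver rebuild any single lost packet without involving TCP. This is only an illustration of the idea and is not the coding mechanism proposed in the paper; packet contents and block size are arbitrary.

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_repair(packets):
    """One XOR repair packet over a block of equal-length packets."""
    return reduce(xor_bytes, packets)

def recover(received, repair):
    """Rebuild the single missing packet of the block (None marks the loss)."""
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) != 1:
        return received  # nothing lost, or more losses than this code can repair
    rebuilt = reduce(xor_bytes, [p for p in received if p is not None], repair)
    received[missing[0]] = rebuilt
    return received

block = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
repair = make_repair(block)
lossy = [b"pkt0", None, b"pkt2", b"pkt3"]   # packet 1 lost on the wireless hop
print(recover(lossy, repair))
```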

  19. Material report in support to RCC-MRX code 2010 stainless steel parts and products

    International Nuclear Information System (INIS)

    Ancelet, Olivier; Lebarbe, Thierry; Dubiez-Le Goff, Sophie; Bonne, Dominique; Gelineau, Odile

    2012-01-01

    This paper presents the Material Report dedicated to stainless steel parts and products issued by AFCEN (Association Francaise pour les regles de Conception et de Construction des Materiels des Chaudieres Electro-Nucleaires) in support of the RCC-MRx 2010 Code. The RCC-MRx Code is the result of the merger of the RCC-MX 2008, developed in the context of the Jules Horowitz research reactor project, into the RCC-MR 2007, which set up rules applicable to the design of components operating at high temperature and to the Vacuum Vessel of ITER (a presentation of the RCC-MRx 2010 Code is the subject of another paper proposed in this Congress; it explains in particular the status of this Code). This Material Report is part of a set of Criteria of RCC-MRx (this set of Criteria is under construction). The Criteria aim to explain the design and construction rules of the Code. They cover analysis rules as well as part procurement, welding, methods of tests and examination, and fabrication rules. The Material Report particularly provides justifications and explanations on the requirements and features dealing with the parts and products proposed in the Code. The Material Report contains the following information: - Introduction of the grade(s): codes and standards and Reference Procurement Specifications covering parts and products, applications and experience gained; - Physical properties; - Mechanical properties used for design calculations (base metal and welds): basic mechanical properties, creep mechanical properties, irradiated mechanical properties; - Fabrication: experience gained, metallurgy; - Welding: weldability, experience gained during welding and repair procedure qualifications; - Non-destructive examination; - In-service behaviour. In this article, examples of the data supplied in the Material Report dedicated to stainless steels are presented. (authors)

  20. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants were built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME Code that reflect Subsection NF are identified and compared with the later Code editions and addenda, with particular reference to the 1974 ASME Code used as the basis for the plant qualification. The concern of the regulatory agencies is that, if later code allowables and provisions are adopted, the safety margins of the construction code could be reduced. Areas of concern are highlighted, and the specific changes in later codes are identified whose adoption would not sacrifice the intended safety margins of the codes to which the plants are licensed.