WorldWideScience

Sample records for code specific concepts

  1. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated
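
    The record does not give CONCEPT's actual algorithms, so the following is only a hedged sketch of how the two quantities named at the end of the abstract, escalation/inflation and interest during construction, enter a capital cost estimate. The rates, the construction period and the uniform cash-flow profile are assumed values, not CONCEPT data.

    ```python
    # Hypothetical illustration (not the CONCEPT code itself): how escalation and
    # interest during construction (IDC) inflate an overnight capital cost estimate.
    # All rates, the construction period and the cash-flow shape are assumed values.

    def escalated_cost_with_idc(overnight_cost, years, escalation_rate, interest_rate):
        """Spread the overnight cost uniformly over the construction period,
        escalate each year's expenditure to the year it is spent, and accrue
        interest on it until construction ends."""
        total = 0.0
        annual_share = overnight_cost / years
        for year in range(years):
            spent = annual_share * (1 + escalation_rate) ** year        # escalation/inflation
            total += spent * (1 + interest_rate) ** (years - year - 1)  # interest until completion
        return total

    # Example: 1000 M$ overnight cost, 6-year construction, 5 %/a escalation, 8 %/a interest
    print(round(escalated_cost_with_idc(1000.0, 6, 0.05, 0.08), 1))
    ```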

  2. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other testing techniques, requires standards. These standards are widely used and the methods for applying them are well established, so radiographic testing can only be carried out according to the regulations referred to and documented. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, level-one and basic radiographers may carry out radiography work based on instructions given by a level-two or level-three radiographer. These instructions are produced based on the guidelines mentioned in the documents, and the level-two radiographer must follow the specifications mentioned in the standard when writing the instructions. This scenario makes it clear that radiography is a type of work in which everything must follow the rules. As for the code, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code available in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography work must automatically follow the regulated rules and standards.

  3. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', which was initially proposed by the authors with the intention of giving the nuclear industry a leap of progress in system reliability, performance improvement, and cost reduction. The concept of the System Based Code is intended to give a theoretical procedure to optimize the reliability of the system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  4. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, which should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group, and feedback from the collaboration was taken into account in the "current" version.

  5. UEP Concepts in Modulation and Coding

    Directory of Open Access Journals (Sweden)

    Werner Henkel

    2010-01-01

    First unequal error protection (UEP) proposals date back to the 1960s (Masnick and Wolf, 1967), but now, with the introduction of scalable video, UEP has developed into a key concept for the transport of multimedia data. The paper presents an overview of some new approaches realizing UEP properties in physical transport, especially multicarrier modulation, or with LDPC and Turbo codes. For multicarrier modulation, UEP bit-loading together with hierarchical modulation is described, allowing for an arbitrary number of classes, arbitrary SNR margins between the classes, and an arbitrary number of bits per class. In Turbo coding, pruning, as a counterpart of puncturing, is presented for flexible bit-rate adaptations, including tables with optimized pruning patterns. Bit- and/or check-irregular LDPC codes may be designed to provide UEP to their code bits. However, irregular degree distributions alone do not ensure UEP, and other necessary properties of the parity-check matrix for providing UEP are also pointed out. Pruning is also the means for constructing variable-rate LDPC codes for UEP, especially by controlling the check-node profile.
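
    The abstract contrasts pruning with the more familiar puncturing. As a minimal sketch of the puncturing side only, the code below applies a periodic puncturing pattern to a coded bit stream; the mother code output and the patterns are placeholders, not the optimized patterns tabulated in the paper.

    ```python
    # Minimal illustration of puncturing for unequal error protection (UEP):
    # bits marked 0 in the pattern are not transmitted, which raises the code rate
    # (i.e. lowers the protection) for the class using that pattern. The patterns
    # below are arbitrary placeholders, not the optimized ones from the paper.

    def puncture(coded_bits, pattern):
        """Keep only the coded bits where the periodic pattern contains a 1."""
        return [b for i, b in enumerate(coded_bits) if pattern[i % len(pattern)] == 1]

    coded = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]        # output of some rate-1/2 mother code
    strong_class = puncture(coded, [1, 1, 1, 1])        # nothing punctured: most protection
    weak_class   = puncture(coded, [1, 1, 1, 0])        # every 4th bit dropped: less protection
    print(len(strong_class), len(weak_class))           # 12 9
    ```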

  6. High efficiency video coding: coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  7. Development of FBR integrity system code. Basic concept

    International Nuclear Information System (INIS)

    Asayama, Tai

    2001-05-01

    For fast breeder reactors to be commercialized, they must be more reliable, safer and, at the same time, economically competitive with future light water reactors. Innovation of the elevated-temperature structural design standard is necessary to achieve this goal. The most powerful way is to enlarge the scope of the structural integrity code to cover items beyond the design evaluation addressed in existing codes. Items that must be newly covered are the prerequisites of design, fabrication, examination, operation and maintenance, etc. This allows designers to choose the most economical combination of design variations to achieve the specific reliability needed for a particular component. By designing components according to this concept, a cost-minimum design of the whole plant can be realized. By determining the reliability that must be achieved for a component using risk technologies, further economic improvement can be expected by avoiding excessive quality. Recognizing the necessity for codes based on this new concept, the development of the 'FBR integrity system code' began in 2000. Research and development will last 10 years. For this development, the basic logistics and system, as well as technologies that materialize the concept, are necessary. Original logistics and systems must be developed, because no existing research is available inside or outside Japan. This report presents the results of the work done in the first year regarding the basic idea, methodology, and structure of the code. (author)

  8. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of QR Code and the benefits of using QR Code for companies. The first objective of this research work is to study the general information of QR Code in order to help people understand QR Code in detail. The second objective of this research work is to explore and analyze the essential and feasible technologies of QR Code in order to clarify the technologies behind QR Code. Additionally, this research work through QR Code best practices t...

  9. A computer code for Tokamak reactor concepts evaluation

    International Nuclear Information System (INIS)

    Rosatelli, F.; Raia, G.

    1985-01-01

    A computer package has been developed which could preliminarily investigate the engineering configuration of a tokamak reactor concept. The code is essentially intended to synthesize, starting from a set of geometrical and plasma physics parameters and the required performances and objectives, three fundamental components of a tokamak reactor core: blanket+shield, TF magnet, PF magnet. An iterative evaluation of the size, power supply and cooling system requirements of these components allows judgment on, and preliminary design optimization of, the considered reactor concept. The versatility of the code allows its application both to next generation tokamak devices and power reactor concepts

  10. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-03-13

    In this work we have developed a restructuring software tool (RT-CUDA) following the proposed optimization specifications to bridge the gap between high-level languages and the machine-dependent CUDA environment. RT-CUDA takes a C program and converts it into an optimized CUDA kernel, with user directives in a configuration file guiding the compiler. RT-CUDA also allows transparent invocation of the most optimized external math libraries like cuSparse and cuBLAS, enabling efficient design of linear algebra solvers. We expect RT-CUDA to be needed by many KSA industries dealing with science and engineering simulation on massively parallel computers like NVIDIA GPUs.

  11. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.
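
    The comparison of "correlated tissue expression profiles" across species can be illustrated with a toy calculation; the expression values below are invented placeholders, not data from the study, and the real analysis integrated full RNA-Seq transcriptomes rather than five numbers per gene.

    ```python
    # Toy illustration of the "correlated tissue expression profile" comparison:
    # expression of a hominoid de novo gene in human versus its non-coding
    # counterpart in rhesus macaque, across the five tissues named in the abstract.
    # The numbers are invented for illustration only.
    import numpy as np

    tissues = ["liver", "prefrontal_cortex", "skeletal_muscle", "adipose", "testis"]
    human_coding_rpkm = np.array([0.5, 3.2, 0.3, 0.8, 12.4])   # hypothetical values
    macaque_noncoding = np.array([0.2, 1.1, 0.1, 0.4,  4.0])   # hypothetical values

    r = np.corrcoef(np.log1p(human_coding_rpkm), np.log1p(macaque_noncoding))[0, 1]
    print(f"Pearson r of log expression across tissues: {r:.2f}")
    ```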

  12. Cost Concept Model and Gateway Specification

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various...... to promote interoperability; • A Nested Model for Digital Curation—that visualises the core concepts, demonstrates how they interact and places them into context visually by linking them to A Cost and Benefit Model for Curation; This Framework provides guidance for data collection and associated calculations...

  13. Mother code specifications (Appendix to CEA report 2472)

    International Nuclear Information System (INIS)

    Pillard, Denise; Soule, Jean-Louis

    1964-12-01

    The Mother code (written in Fortran for the IBM 7094) computes the integral cross section and the first two moments of energy transfer of a thermalizer. The computation organisation and methods are presented in another document. This document presents the code specifications, i.e. input data (spectrum description, printing options, input record formats, conditions to be met by values) and results (printing formats and options, writing and punching options and formats).

  14. Implementation of probabilistic safety concepts in international codes

    International Nuclear Information System (INIS)

    Borges, J.F.

    1977-01-01

    Recent progress in the implementation of safety concepts in international structure codes is briefly presented. Special attention is paid to the work of the Joint-Committee on Structural Safety. The discussion is centered on some problems such as: safety differentiation, definition and combination of actions, spaces for checking safety and non-linear structural behaviour. When discussing safety differentiation it should be considered that the total probability of failure derives from a theoretical probability of failure and a probability of failure due to error and gross negligence. Optimization of design criteria should take into account both causes of failure. The quantification of reliability implies a probabilistic idealization of all basic variables. Steps taken to obtain an improved definition of different types of actions and rules for their combination are described. Safety checking can be carried out in terms of basic variables, action-effects, or any other suitable variable. However, the advantages and disadvantages of the different types of formulation should be discussed, particularly in the case of non-linear structural behaviour. (orig.) [de]
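
    The split between the two causes of failure mentioned above can be written compactly. Assuming, purely for illustration, that the theoretical failure mode and failure due to error or gross negligence act independently:

    ```latex
    % Total failure probability split into a theoretical part and a gross-error part,
    % assuming (for illustration) that the two causes act independently.
    \[
      P_{f,\mathrm{total}} \;=\; 1 - \bigl(1 - P_{f,\mathrm{theory}}\bigr)\bigl(1 - P_{f,\mathrm{error}}\bigr)
      \;\approx\; P_{f,\mathrm{theory}} + P_{f,\mathrm{error}},
      \qquad P_{f,\mathrm{theory}},\,P_{f,\mathrm{error}} \ll 1 .
    \]
    ```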

  15. Basic concept of common reactor physics code systems. Final report of working party on common reactor physics code systems (CCS)

    International Nuclear Information System (INIS)

    2004-03-01

    A working party on common reactor physics code systems was organized for two years (2001-2002) under the Research Committee on Reactor Physics of JAERI. This final report is a compilation of the activities of the working party on common reactor physics code systems during the two years. The objective of the working party is to clarify the basic concept of common reactor physics code systems in order to improve the convenience of reactor physics code systems for reactor physics researchers in Japan in their various fields of research and development activities. We held four meetings during the two years, investigated the status of reactor physics code systems and innovative software technologies, and discussed the basic concept of common reactor physics code systems. (author)

  16. Formal specification level concepts, methods, and algorithms

    CERN Document Server

    Soeken, Mathias

    2015-01-01

    This book introduces a new level of abstraction that closes the gap between the textual specification of embedded systems and the executable model at the Electronic System Level (ESL). Readers will be enabled to operate at this new, Formal Specification Level (FSL), using models which not only allow significant verification tasks in this early stage of the design flow, but also can be extracted semi-automatically from the textual specification in an interactive manner.  The authors explain how to use these verification tasks to check conceptual properties, e.g. whether requirements are in conflict, as well as dynamic behavior, in terms of execution traces. • Serves as a single-source reference to a new level of abstraction for embedded systems, known as the Formal Specification Level (FSL); • Provides a variety of use cases which can be adapted to readers’ specific design flows; • Includes a comprehensive illustration of Natural Language Processing (NLP) techniques, along with examples of how to i...

  17. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    Science.gov (United States)

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  18. Interoperable domain-specific languages families for code generation

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Plášil, F.; Bureš, Tomáš; Hnětynka, P.

    2013-01-01

    Vol. 43, No. 5 (2013), pp. 479-499. ISSN 0038-0644. R&D Projects: GA ČR GD201/09/H057. EU Projects: European Commission (XE) ASCENS 257414. Grant - others: GA AV ČR (CZ) GAP103/11/1489. Program: FP7. Institutional research plan: CEZ:AV0Z10300504. Keywords: code generation * domain specific languages * models reuse * extensible languages * specification * program synthesis. Subject RIV: JC - Computer Hardware; Software. Impact factor: 1.148, year: 2013

  19. Building a dynamic code to simulate new reactor concepts

    International Nuclear Information System (INIS)

    Catsaros, N.; Gaveau, B.; Jaekel, M.-T.; Maillard, J.; Maurel, G.; Savva, P.; Silva, J.; Varvayanni, M.

    2012-01-01

    Highlights: ► We develop a stochastic neutronic code based on an existing High Energy Physics code. ► The code simulates innovative reactor designs including Accelerator Driven Systems. ► Core materials evolution will be dynamically simulated, including fuel burnup. ► Continuous feedback between the main inter-related parameters will be established. ► A description of the current research development and achievements is also given. - Abstract: Innovative nuclear reactor designs have been proposed, such as the Accelerator Driven Systems (ADSs), the “candle” reactors, etc. These reactor designs introduce computational nuclear technology problems the solution of which necessitates a new, global and dynamic computational approach to the system. A continuous feedback procedure must be established between the main inter-related parameters of the system such as the chemical, physical and isotopic composition of the core, the neutron flux distribution and the temperature field. Furthermore, as far as ADSs are concerned, the ability of the computational tool to simulate the nuclear cascade created by the interaction of accelerated protons with the spallation target, as well as the neutrons produced, is also required. The new Monte Carlo code ANET (Advanced Neutronics with Evolution and Thermal hydraulic feedback) is being developed based on the GEANT3 High Energy Physics code, aiming to progressively satisfy all the above requirements. A description of the capabilities and methodologies implemented in the present version of ANET is given here, together with some illustrative applications of the code.
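
    The "continuous feedback between inter-related parameters" can be pictured as a coupled iteration. The sketch below is a generic toy of that idea, with invented stand-in models for neutronics, thermal hydraulics and depletion; it is not ANET's actual scheme or physics.

    ```python
    # Generic sketch of the kind of coupled feedback the abstract describes (not
    # ANET's actual models): for each burnup time step, iterate a toy neutronics
    # update and a toy heat balance to a consistent flux/temperature pair, then
    # deplete the fissile content with the converged flux.

    def neutronics(temperature, fissile):
        # toy flux model: more fissile material -> more flux, hotter core -> less flux
        return fissile / (1.0 + 0.001 * (temperature - 600.0))

    def thermal_hydraulics(flux):
        # toy heat balance: coolant inlet 600 K plus a term proportional to power
        return 600.0 + 150.0 * flux

    fissile = 1.0
    for step in range(5):                      # outer loop: burnup time steps
        flux, temperature = 1.0, 600.0
        for _ in range(200):                   # inner loop: feedback iteration
            new_flux = neutronics(temperature, fissile)
            temperature = thermal_hydraulics(new_flux)
            converged = abs(new_flux - flux) < 1e-9
            flux = new_flux
            if converged:
                break
        fissile *= 1.0 - 0.01 * flux           # toy depletion over the time step
        print(f"step {step}: flux={flux:.4f}, T={temperature:.1f} K, fissile={fissile:.4f}")
    ```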

  20. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)]

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis to compare with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic analysis framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.
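
    The basic probabilistic DCGL idea can be sketched without RESRAD's pathway models: sample an uncertain dose-to-source ratio, take a conservative percentile, and divide the dose criterion by it. Everything numeric below (the DSR distribution, the screening value) is a placeholder for illustration, not a RESRAD 6.0 result.

    ```python
    # Hedged sketch of the probabilistic DCGL idea (not the RESRAD 6.0 models):
    # sample an uncertain dose-to-source ratio (DSR, mSv/y per Bq/g), take a
    # conservative percentile of the peak dose, and derive the soil DCGL that
    # keeps that dose below the release criterion. All numbers are placeholders.
    import random

    DOSE_LIMIT = 0.25          # mSv/y, example site release criterion from the abstract
    N_SAMPLES = 10_000

    random.seed(1)
    dsr_samples = [random.lognormvariate(-4.0, 0.5) for _ in range(N_SAMPLES)]  # hypothetical DSR distribution

    dsr_samples.sort()
    dsr_95 = dsr_samples[int(0.95 * N_SAMPLES)]     # 95th-percentile (conservative) dose per unit concentration
    site_specific_dcgl = DOSE_LIMIT / dsr_95        # Bq/g that still meets the criterion

    screening_dcgl = 0.5                            # hypothetical, deliberately conservative screening value
    print(f"site-specific DCGL ~ {site_specific_dcgl:.2f} Bq/g vs screening DCGL {screening_dcgl} Bq/g")
    ```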

  1. Domain Specificity between Peer Support and Self-Concept

    Science.gov (United States)

    Leung, Kim Chau; Marsh, Herbert W.; Craven, Rhonda G.; Yeung, Alexander S.; Abduljabbar, Adel S.

    2013-01-01

    Peer support interventions have mostly neglected the domain specificity of intervention effects. In two studies, the present investigation examined the domain specificity of peer support interventions targeting specific domains of self-concept. In Study 1, participants ("n" = 50) who had received an academically oriented peer support…

  2. Specification process reengineering: concepts and experiences from Danish industry

    DEFF Research Database (Denmark)

    Hansen, Benjamin Loer; Riis, Jesper; Hvam, Lars

    2003-01-01

    This paper presents terminologies and concepts related to the IT automation of specification processes in companies manufacturing custom made products. Based on 11 cases from the Danish industry the most significant development trends are discussed.

  3. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.
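
    The "action perturbation" idea contrasted with weight or node perturbation above can be illustrated with a toy continuous task. The sketch below uses a generic perturbation-based reinforcement rule with invented parameters; it is not the spiking-neuron implementation or the learning rules of the paper.

    ```python
    # Toy illustration of exploration by action perturbation (as opposed to
    # perturbing weights or single neurons): a linear readout proposes a
    # continuous action, Gaussian noise perturbs the action itself, and the
    # readout moves in the direction of noise that beat a reward baseline.
    # This is a generic perturbation rule, not the paper's spiking model.
    import random

    def reward(action, target=0.7):
        return -(action - target) ** 2          # reward peaks when action == target

    random.seed(0)
    w, eta, sigma = 0.0, 0.5, 0.1               # readout, learning rate, exploration noise
    baseline = reward(w)                        # running average of recent rewards
    for trial in range(500):
        noise = random.gauss(0.0, sigma)
        action = w + noise                      # perturbation applied in action space
        r = reward(action)
        w += eta * (r - baseline) * noise       # reinforce noise directions above baseline
        baseline += 0.1 * (r - baseline)
    print(f"learned action {w:.2f} (target 0.7)")   # typically ends near the target
    ```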

  4. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data is required to satisfy 22 common performance assessment codes. Each of the codes is summarized and a matrix table is provided to allow comparison of the various input required by the codes. This report does not determine or recommend which codes are preferable

  5. Assigning clinical codes with data-driven concept representation on Dutch clinical free text.

    Science.gov (United States)

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter

    2017-05-01

    Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and performed procedures during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (i.c. the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient stay data, in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which introduces a limitation on available concept definitions from expert-based ontologies (e.g. UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence for predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance. Copyright © 2017. Published by Elsevier Inc.
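
    The distributional-semantics step can be sketched with the gensim library (assuming gensim version 4 or later). The toy concept corpus and the concept names below are invented; the study itself trained on a large unannotated Dutch clinical corpus.

    ```python
    # Minimal sketch of the distributional-semantics step (assumes gensim >= 4):
    # train a skip-gram word2vec model on a toy corpus of pre-extracted concepts
    # and query for semantically related ones. The toy lines below are invented.
    from gensim.models import Word2Vec

    concept_corpus = [
        ["acute_cystitis", "urinary_tract_infection", "nitrofurantoin"],
        ["urinary_tract_infection", "dysuria", "escherichia_coli"],
        ["myocardial_infarction", "chest_pain", "troponin"],
        ["chest_pain", "acute_coronary_syndrome", "troponin"],
    ]

    model = Word2Vec(sentences=concept_corpus, vector_size=32, window=2,
                     min_count=1, sg=1, epochs=200, seed=1)   # sg=1 -> skip-gram
    print(model.wv.most_similar("chest_pain", topn=2))
    ```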

  6. A regulatory code for neuron-specific odor receptor expression.

    Directory of Open Access Journals (Sweden)

    Anandasankar Ray

    2008-05-01

    Olfactory receptor neurons (ORNs) must select, from a large repertoire, which odor receptors to express. In Drosophila, most ORNs express one of 60 Or genes, and most Or genes are expressed in a single ORN class in a process that produces a stereotyped receptor-to-neuron map. The construction of this map poses a problem of receptor gene regulation that is remarkable in its dimension and about which little is known. By using a phylogenetic approach and the genome sequences of 12 Drosophila species, we systematically identified regulatory elements that are evolutionarily conserved and specific for individual Or genes of the maxillary palp. Genetic analysis of these elements supports a model in which each receptor gene contains a zip code, consisting of elements that act positively to promote expression in a subset of ORN classes, and elements that restrict expression to a single ORN class. We identified a transcription factor, Scalloped, that mediates repression. Some elements are used in other chemosensory organs, and some are conserved upstream of axon-guidance genes. Surprisingly, the odor response spectra and organization of maxillary palp ORNs have been extremely well-conserved for tens of millions of years, even though the amino acid sequences of the receptors are not highly conserved. These results, taken together, define the logic by which individual ORNs in the maxillary palp select which odor receptors to express.

  7. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
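
    The general idea of generating controller code from a tabular specification can be sketched in a few lines. The table columns, keywords and Structured Text output below are invented for illustration only; they are not the Launch Control System DSL, its tabular spec format, or its ladder-logic generator.

    ```python
    # Hedged sketch of the general code-generation idea only: translate a tiny,
    # hypothetical tabular specification into IEC 61131-3 Structured Text. The
    # column names, conditions and output format are invented, not the LCS tool.

    tabular_spec = [
        # (step,            condition,               action)
        ("OpenVentValve",  "TankPressure > 250.0", "VentValveCmd := TRUE;"),
        ("CloseVentValve", "TankPressure < 180.0", "VentValveCmd := FALSE;"),
    ]

    def generate_structured_text(rows):
        lines = ["(* auto-generated from tabular spec *)"]
        for step, condition, action in rows:
            lines.append(f"(* step: {step} *)")
            lines.append(f"IF {condition} THEN")
            lines.append(f"    {action}")
            lines.append("END_IF;")
        return "\n".join(lines)

    print(generate_structured_text(tabular_spec))
    ```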

  8. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source-terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document is a section describing the near-field environmental conditions for this waste package modeling, description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  9. Design specifications for ASME B and PV Code Section III nuclear class 1 piping

    International Nuclear Information System (INIS)

    Richardson, J.A.

    1978-01-01

    ASME B and PV Code Section III regulations for nuclear piping require that a comprehensive Design Specification be developed to ensure that the design and installation of the piping meet all code requirements. The intent of this paper is to describe the code requirements, discuss the implementation of these requirements in a typical Class 1 piping design specification, and report on recent piping failures in operating light water nuclear power plants in the US. (author)

  10. Code Shift: Grid Specifications and Dynamic Wind Turbine Models

    DEFF Research Database (Denmark)

    Ackermann, Thomas; Ellis, Abraham; Fortmann, Jens

    2013-01-01

    Grid codes (GCs) and dynamic wind turbine (WT) models are key tools to allow increasing renewable energy penetration without challenging security of supply. In this article, the state of the art and the further development of both tools are discussed, focusing on the European and North American e...

  11. Formal Methods for Abstract Specifications – A Comparison of Concepts

    DEFF Research Database (Denmark)

    Instenberg, Martin; Schneider, Axel; Schnetter, Sabine

    2006-01-01

    In industry, formal methods are becoming increasingly important for the verification of hardware and software designs. However, current practice for the specification of system and protocol functionality at a high level of abstraction is textual description. For verification of the system behavior, manual...... inspections and tests are the usual means. To facilitate the introduction of formal methods in the development process of complex systems and protocols, two different tools evolved from research activities – UPPAAL and SpecEdit – have been investigated and compared regarding their concepts and functionality...

  12. Low Specific Activity materials concepts are being reevaluated

    International Nuclear Information System (INIS)

    Rawl, R.R.

    1993-01-01

    Many types of radioactive low-level waste are classified, packaged, and transported as Low-Specific Activity (LSA) material. The transportation regulations allow LSA materials to be shipped in economical packagings and, under certain conditions, waive compliance with other detailed requirements such as labeling. The fundamental concepts which support the LSA category are being thoroughly reevaluated to determine the defensibility of the provisions. A series of national and international events is leading to the development of new dose models which are likely to fundamentally change the ways these materials are defined. Similar basis changes are likely for the packaging requirements applicable to these materials

  13. Concept for Specific Lines of Business, Energy Saving Tourism

    International Nuclear Information System (INIS)

    Jilek, W.

    1998-01-01

    In the spirit of the objectives of the Energy Plan 1995, namely to make more efficient use of energy and thus reduce energy requirements, to promote the use of renewable energies, and to attach maximum importance to the ecological compatibility of energy systems, the provincial government of Styria is, among other projects, pursuing the option of consulting small and medium-sized enterprises in a targeted manner. Three years after being launched, this Ecological Company Consulting scheme for various lines of business is now producing successful results, demonstrating through numerous pilot projects that energy saving, business profit and ecology can go hand in hand. Trade-specific concepts have been elaborated for foodstuffs, carpenters, car repair and sales firms, bakeries and hairdressers and, most recently, for tourist industry businesses (hotels, bars, restaurants, etc.). The province of Styria, represented by the Energy Commissioner and the department of waste management, is co-operating closely in the Ecological Company Consulting scheme with the Styrian Chamber of Commerce and the Economy Promotion Institute (Wirtschaftsfoerderungsinstitut). In several cases, other provinces, the Federal Ministry of Environment, Youth and Family Affairs, and the Federal Chamber of Commerce have adopted the results of this co-operation, while in some cases subsidy schemes are linked to these trade-specific concepts. In the course of the scheme, the aim is to investigate energy requirements, saving potentials and questions of waste management. (author)

  14. Experiment-specific analyses in support of code development

    International Nuclear Information System (INIS)

    Ott, L.J.

    1990-01-01

    Experiment-specific models have been developed since 1986 by Oak Ridge National Laboratory Boiling Water Reactor (BWR) severe accident analysis programs for the purpose of BWR experimental planning and optimum interpretation of experimental results. These experiment-specific models have been applied to large integral tests (ergo, experiments) which start from an initial undamaged core state. The tests performed to date in BWR geometry have had significantly different-from-prototypic boundary and experimental conditions because of either normal facility limitations or specific experimental constraints. These experiments (ACRR: DF-4, NRU: FLHT-6, and CORA) were designed to obtain specific phenomenological information such as the degradation and interaction of prototypic components and the effects on melt progression of control-blade materials and channel boxes. Applications of ORNL models specific to the ACRR DF-4 and KfK CORA-16 experiments are discussed and significant findings from the experimental analyses are presented. 32 refs., 16 figs

  15. Symmetries in Genetic Systems and the Concept of Geno-Logical Coding

    Directory of Open Access Journals (Sweden)

    Sergey V. Petoukhov

    2016-12-01

    The genetic code of amino acid sequences in proteins does not allow understanding and modeling of inherited processes such as inborn coordinated motions of living bodies, innate principles of sensory information processing, quasi-holographic properties, etc. To be able to model these phenomena, the concept of geno-logical coding, which is connected with logical functions and Boolean algebra, is put forward. The article describes basic pieces of evidence in favor of the existence of the geno-logical code, which exists in parallel with the known genetic code of amino acid sequences but which serves for transferring inherited processes along chains of generations. These pieces of evidence have been obtained through the analysis of symmetries in the structures of molecular-genetic systems. The analysis has revealed a close connection of the genetic system with dyadic groups of binary numbers and with other mathematical objects related to dyadic groups: Walsh functions (which are algebraic characters of dyadic groups), bit-reversal permutations, logical holography, etc. These results provide a new approach to mathematical modeling of genetic structures, which uses known mathematical formalisms from the technological fields of noise-immunity coding of information, binary analysis, logical holography, and digital devices of artificial intellect. Some opportunities for the development of algebraic-logical biology are opened.

  16. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  17. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  18. On the concept of elasticity used in some fast reactor accident analysis codes

    International Nuclear Information System (INIS)

    Malmberg, T.

    1975-01-01

    The analysis to be presented will restrict attention to the elastic part of the elastic-plastic constitutive equation used in several Fast Reactor Accident Analysis Codes and originally applied by M.L. Wilkins: Calculation of Elastic-Plastic Flow, UCRL-7322, Rev. 1, Jan. 1969. It is shown that the used elasticity concept is within the frame of hypo-elasticity. On the basis of a test found by Bernstein it is proven that the state of stress generally depends on the path of deformation. Therefore this concept of elasticity is not compatible with finite elasticity. For several simple deformation processes this special hypo-elastic constitutive equation is integrated to give a stress-strain relation. The path-dependence of this relation is demonstrated. Further the phenomenon of hypo-elastic yield under shear deformation is pointed out. The relevance to modelling material behaviour in primary containment analysis is discussed

  19. On the concept of elasticity used in some fast reactor accident analysis codes

    International Nuclear Information System (INIS)

    Malmberg, T.

    1975-01-01

    The analysis presented restricts attention to the elastic part of the elastic-plastic equation used in several Fast Reactor Accident Analysis Codes and originally applied by M.L. Wilkins: Calculation of Elastic-Plastic Flow, UCRL-7322, Rev. 1, Jan 1969. It is shown that the used elasticity concept is within the frame of hypo-elasticity. On the basis of a test found by Bernstein it is proven that the state of stress generally depends on the path of deformation. Therefore this concept of elasticity is not compatible with finite elasticity. For several deformation processes this special hypo-elastic constitutive equation is integrated to give a stress-strain relation. The path-dependence of this relation is demonstrated. Further the phenomenon of hypo-elastic yield under shear deformation is pointed out. The relevance to modelling material behaviour in primary containment analysis is discussed. (Auth.)
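
    Neither record reproduces the constitutive relation itself. As a point of reference only, a representative grade-zero hypo-elastic law of the general type analysed, written here with a Jaumann rate, is shown below; the specific form used in the codes under discussion may differ.

    ```latex
    % A representative grade-zero hypo-elastic law of the type discussed above:
    % an objective (here Jaumann) stress rate is a linear, isotropic function of
    % the rate of deformation D. The codes analysed in the reports may use variants.
    \[
      \overset{\nabla}{\boldsymbol{\sigma}}
      \;=\; \dot{\boldsymbol{\sigma}}
            - \mathbf{W}\boldsymbol{\sigma} + \boldsymbol{\sigma}\mathbf{W}
      \;=\; \lambda\,\operatorname{tr}(\mathbf{D})\,\mathbf{I} + 2\mu\,\mathbf{D},
    \]
    % where D and W are the symmetric and antisymmetric parts of the velocity
    % gradient; path dependence arises because this rate form is in general not
    % integrable to a history-independent stress-strain relation.
    ```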

  20. Contribution to the design and realisation of a specific circuit to code the information coming from the calorimeter of the LHC; Contribution a la conception et a la realisation d'un circuit specifique de codage des informations issues du calorimetre d'une experience aupres du LHC

    Energy Technology Data Exchange (ETDEWEB)

    Chambert-Hermel, V

    1996-09-19

    LHC (Large Hadron Collider) signals require a sampling system with a dynamic range in excess of 16 bits and 8 bit precision. The sampling frequency is 40 MHz. The use of a floating point format which fits the precision of the calorimeter is proposed. The dynamic range is divided into 8 positive sub-ranges and 5 negative ones, and so a conversion into an 8 plus 1 (sign) bit mantissa and a 4 bit exponent is proposed. The design is built around three main blocks: a range converter which computes the three exponent bits and the sign, a set of amplifiers controlled by the range converter, and a classical 8 bit ADC for the mantissa. The main effort was concentrated on the range converter, as this is the most sensitive part of the architecture which sees the whole dynamic range. To minimize the problems of perturbations on the signal and reference lines, we have chosen a fully differential sample and hold, differential latched comparators and coding logic using the AMS BICMOS 1.2 micron technology. We present the floating point format we use, the converter architecture, the design steps of the elementary circuits, the simulation results, the layout and test results on prototypes. (author) 17 refs.
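
    The sign/exponent/mantissa format can be illustrated numerically. The sketch below assumes power-of-two gain steps and an arbitrary full-scale value, and covers only positive-style sub-ranges; the actual range converter of the thesis defines 8 positive and 5 negative sub-ranges with its own thresholds.

    ```python
    # Hedged numerical sketch of the floating-point output format described above
    # (sign bit + exponent + 8-bit mantissa). The gain steps (factors of two) and
    # the full-scale value are assumptions for illustration only.

    FULL_SCALE = 65536.0   # ~16-bit dynamic range in arbitrary units

    def encode_sample(value):
        sign = 0 if value >= 0 else 1
        magnitude = abs(value)
        exponent = 0
        # choose the coarsest gain (largest sub-range) that still contains the sample
        while magnitude >= FULL_SCALE / (2 ** (8 - exponent)) and exponent < 8:
            exponent += 1
        scale = FULL_SCALE / (2 ** (8 - exponent))          # top of the selected sub-range
        mantissa = min(255, int(magnitude / scale * 256))   # classical 8-bit ADC on the scaled signal
        return sign, exponent, mantissa

    def decode_sample(sign, exponent, mantissa):
        scale = FULL_SCALE / (2 ** (8 - exponent))
        return (-1) ** sign * (mantissa + 0.5) / 256 * scale

    for x in (12.0, 900.0, -20000.0):
        s, e, m = encode_sample(x)
        print(x, "->", (s, e, m), "~", round(decode_sample(s, e, m), 1))
    ```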

  1. Disease-Specific Trends of Comorbidity Coding and Implications for Risk Adjustment in Hospital Administrative Data.

    Science.gov (United States)

    Nimptsch, Ulrike

    2016-06-01

    To investigate changes in comorbidity coding after the introduction of diagnosis related groups (DRGs) based prospective payment and whether trends differ regarding specific comorbidities. Nationwide administrative data (DRG statistics) from German acute care hospitals from 2005 to 2012. Observational study to analyze trends in comorbidity coding in patients hospitalized for common primary diseases and the effects on comorbidity-related risk of in-hospital death. Comorbidity coding was operationalized by Elixhauser diagnosis groups. The analyses focused on adult patients hospitalized for the primary diseases of heart failure, stroke, and pneumonia, as well as hip fracture. When focusing on the total frequency of diagnosis groups per record, an increase in depth of coding was observed. Between-hospital variations in depth of coding were present throughout the observation period. Specific comorbidity increases were observed in 15 of the 31 diagnosis groups, and decreases in comorbidity were observed for 11 groups. In patients hospitalized for heart failure, shifts of comorbidity-related risk of in-hospital death occurred in nine diagnosis groups, of which eight were directed toward the null. Comorbidity-adjusted outcomes in longitudinal administrative data analyses may be biased by nonconstant risk over time, changes in completeness of coding, and between-hospital variations in coding. Accounting for such issues is important when the respective observation period coincides with changes in the reimbursement system or other conditions that are likely to alter clinical coding practice. © Health Research and Educational Trust.

  2. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the competent technical standards (e.g. IEC 880) it is necessary to verify each step in the development process of safety critical software. This holds also for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost effort a tool should be used which is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  3. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  4. The Ductile Design Concept for Seismic Actions in Miscellaneous Design Codes

    Directory of Open Access Journals (Sweden)

    M. Budescu

    2009-01-01

    The concept of ductility describes the capacity of the structural system and its components to deform prior to collapse, without a substantial loss of strength, but with an important amount of energy dissipated. Consistent with the Applied Technology Council report ATC-34 from 1995, it was agreed to use a seismic response reduction factor to decrease the design force. The purpose of this factor is to transpose the nonlinear behaviour of the structure and its energy dissipation capacity into a simplified form that can be used in the design stage. Depending on the particular structural model and the design standard, the values used are different. The paper presents the characteristics of the ductility concept for the structural system. Along with this, the general way of computing the reserve factor is described, with the necessary explanations of the parameters that determine the behaviour factor. The purpose of this paper is to make a comparison between different international norms for the values and the distribution of the behaviour factor. The norms of the following countries are taken into consideration: the United States of America, New Zealand, Japan and Romania, as well as the European general seismic code.
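
    Numerically, the behaviour (response reduction) factor simply divides the elastic seismic demand to obtain the design force, on the premise that ductile deformation dissipates the remaining energy. The elastic base shear and the factor values below are assumed for illustration and are not taken from any of the codes compared in the paper.

    ```python
    # Minimal numerical sketch of how the behaviour (response reduction) factor
    # is applied: design base shear = elastic demand / behaviour factor.
    # The demand and factor values are assumptions for illustration only.

    elastic_base_shear_kN = 4800.0     # hypothetical elastic (unreduced) seismic demand
    behaviour_factors = {"low ductility": 1.5, "medium ductility": 3.0, "high ductility": 5.0}

    for ductility_class, q in behaviour_factors.items():
        design_shear = elastic_base_shear_kN / q
        print(f"{ductility_class:>16}: q = {q:.1f} -> design base shear {design_shear:.0f} kN")
    ```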

  5. The key to technical translation, v.1 concept specification

    CERN Document Server

    Hann, Michael

    1992-01-01

    This handbook for German/English/German technical translators at all levels from student to professional covers the root terminologies of the spectrum of scientific and engineering fields. The work is designed to give technical translators direct insight into the main error sources occurring in their profession, especially those resulting from a poor understanding of the subject matter and the usage of particular terms to designate different concepts in different branches of technology. The style is easy to read and suitable for nonnative English speakers and translators with no engineering ex

  6. Domain-Specific Self-Concept in Relation to Traditional and Cyber Peer Aggression

    Science.gov (United States)

    Toledano, Shanee; Werch, Brittany L.; Wiens, Brenda A.

    2015-01-01

    Individuals who aggress against others have been described both as having overall low self-concept and as having high, inflated self-concept. The conceptualization of self-concept as domain specific provides an alternate means to resolving this controversy. In this study, 223 middle school students completed self-report measures assessing…

  7. Restrictive concept of surrogacy in the draft text of the Civil Code of Serbia

    Directory of Open Access Journals (Sweden)

    Bordaš Bernadet I.

    2015-01-01

    The working draft of the Civil Code of Serbia, which was published in June 2015, includes model provisions on surrogate motherhood, which is at present expressly prohibited by law. The paper gives a survey of the proposed provisions and examines particularly those that define which persons can conclude a contract on surrogacy. By limiting this right to persons holding the nationality of Serbia, or to these nationals and persons residing in the territory of Serbia for at least three (five) years, the legislator wishes to avoid reproductive tourism. Surrogate mothering with cross-border effects gives rise to complicated legal problems as regards the definition and recognition of the legal parentage of the intended parents, both in the countries in which the surrogate mother gives birth to the child and in the countries in which the intended parents wish to live with their child. The restrictive concept, which retains surrogate mothering within the borders of the domestic state and between domestic nationals, prevents outgoing cases of surrogate motherhood, but this is not quite true for persons who are not citizens of Serbia but are living on its territory. For these reasons the paper critically examines these limitations in the proposals and indicates that incoming cases of surrogate motherhood cannot be prevented, due to the free movement of people. The paper also provides an analysis of the legal issues of the incoming cases of surrogate motherhood, and suggests solutions for them if the proposed ipso jure legal parenthood of the intended parents is adopted in the future Civil Code. With ipso jure legal parenthood of a child who is born to a surrogate mother abroad, there is no need to restrict surrogacy cases to nationals of Serbia or to foreigners domiciled in Serbia for a minimum of three (five) years.

  8. Details and justifications for the MAP concept specification for acceleration above 63 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Berg, J. Scott [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.]

    2014-02-28

    The Muon Accelerator Program (MAP) requires a concept specification for each of the accelerator systems. The Muon accelerators will bring the beam energy from a total energy of 63 GeV to the maximum energy that will fit on the Fermilab site. Justifications and supporting references are included, providing more detail than will appear in the concept specification itself.

  9. The computer code SEURBNUK/EURDYN (Release 1). Input and output specification

    International Nuclear Information System (INIS)

    Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK/EURDYN is an extension of SEURBNUK-2, a two-dimensional, axisymmetric, Eulerian, finite element containment code in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. These codes are designed to model the hydrodynamic development in time of a hypothetical core disruptive accident (HCDA) in a fast breeder reactor. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations, with information on output facilities and aids to help users avoid some common difficulties. (UK)

  10. How Does Creating a Concept Map Affect Item-Specific Encoding?

    Science.gov (United States)

    Grimaldi, Phillip J.; Poston, Laurel; Karpicke, Jeffrey D.

    2015-01-01

    Concept mapping has become a popular learning tool. However, the processes underlying the task are poorly understood. In the present study, we examined the effect of creating a concept map on the processing of item-specific information. In 2 experiments, subjects learned categorized or ad hoc word lists by making pleasantness ratings, sorting…

  11. Fostering Self-Concept and Interest for Statistics through Specific Learning Environments

    Science.gov (United States)

    Sproesser, Ute; Engel, Joachim; Kuntze, Sebastian

    2016-01-01

    Supporting motivational variables such as self-concept or interest is an important goal of schooling as they relate to learning and achievement. In this study, we investigated whether specific interest and self-concept related to the domains of statistics and mathematics can be fostered through a four-lesson intervention focusing on statistics.…

  12. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account the developer's context, such as the development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and with general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed to support their work-related activities. To investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the standard Google search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation achieved promising initial results on the precision and recall performance of the system.
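
    The implicit relevance feedback idea described in this record can be illustrated with a small sketch (not the authors' prototype): observed retention actions on previously returned code results are accumulated as weights and blended with the engine's keyword score when re-ranking. The action names, weights and blend factor below are illustrative assumptions.

        # Illustrative implicit relevance feedback for code search (not the authors'
        # prototype); action weights and the blend factor are assumed values.
        ACTION_WEIGHTS = {"copied_snippet": 3.0, "opened_file": 1.0, "long_dwell": 2.0}

        feedback_log = {}   # result_id -> accumulated implicit feedback score

        def record_action(result_id, action):
            """Accumulate implicit feedback from an observed retention action."""
            feedback_log[result_id] = feedback_log.get(result_id, 0.0) + ACTION_WEIGHTS.get(action, 0.0)

        def rerank(keyword_scored_results, blend=0.3):
            """Blend the engine's keyword score with the accumulated implicit feedback."""
            def combined(item):
                result_id, keyword_score = item
                return (1 - blend) * keyword_score + blend * feedback_log.get(result_id, 0.0)
            return sorted(keyword_scored_results, key=combined, reverse=True)

        # example usage: a copy action promotes snippet_42 above a higher keyword match
        record_action("snippet_42", "copied_snippet")
        print(rerank([("snippet_42", 0.61), ("snippet_7", 0.80)]))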

  13. A human-specific de novo protein-coding gene associated with human brain functions.

    Directory of Open Access Journals (Sweden)

    Chuan-Yun Li

    2010-03-01

    Full Text Available To understand whether any human-specific new genes may be associated with human brain functions, we computationally screened the genetic vulnerability factors identified through genome-wide association studies and linkage analyses of nicotine addiction and found one human-specific de novo protein-coding gene, FLJ33706 (alternative gene symbol C20orf203). Cross-species analysis revealed an interesting evolutionary path by which this gene originated from noncoding DNA sequences: insertion of repeat elements, especially Alu, contributed to the formation of the first coding exon and six standard splice junctions on the branch leading to humans and chimpanzees, and two subsequent substitutions in the human lineage eliminated two stop codons and created an open reading frame of 194 amino acids. We experimentally verified FLJ33706's mRNA and protein expression in the brain. Real-time PCR in multiple tissues demonstrated that FLJ33706 was most abundantly expressed in brain. Human polymorphism data suggested that FLJ33706 encodes a protein under purifying selection. A specifically designed antibody detected its protein expression across human cortex, cerebellum and midbrain. An immunohistochemistry study in normal human brain cortex revealed the localization of FLJ33706 protein in neurons. Elevated expression of FLJ33706 was detected in Alzheimer's brain samples, suggesting a role for this novel gene in the human-specific pathogenesis of Alzheimer's disease. FLJ33706 provides the strongest evidence so far that human-specific de novo genes can have protein-coding potential and differential protein expression, and be involved in human brain functions.

  14. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, is developed to calculate fusion reaction rates in terms of the specific reactivity ⟨σv⟩, the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions, which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant for plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide interpolated values of cross sections.
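
    For orientation, the quantity being computed can be sketched in a few lines. If both ion species are Maxwellian at temperature kT, the reactivity reduces to ⟨σv⟩ = (8/πμ)^(1/2) (kT)^(-3/2) ∫ σ(E) E exp(-E/kT) dE, with E the centre-of-mass energy and μ the reduced mass. The sketch below uses the dense trapezoidal quadrature and log-log interpolation options mentioned above; the cross-section table values are placeholders, not evaluated data, and the slowing-down corrections and steepest-descent method of STEEP4 are omitted.

        import numpy as np

        # Maxwellian-averaged reactivity from a pointwise cross-section table
        # (a minimal sketch, not STEEP4 itself).
        def log_log_sigma(E_keV, E_tab_keV, sigma_tab_barn):
            """Log-log interpolation of a pointwise cross-section table."""
            return np.exp(np.interp(np.log(E_keV), np.log(E_tab_keV), np.log(sigma_tab_barn)))

        def reactivity(kT_keV, E_tab_keV, sigma_tab_barn, mu_amu):
            keV, amu, barn = 1.602176634e-16, 1.66053906660e-27, 1.0e-28   # SI conversions
            kT, mu = kT_keV * keV, mu_amu * amu
            E = np.linspace(E_tab_keV[0], E_tab_keV[-1], 20000) * keV      # dense energy grid
            sigma = log_log_sigma(E / keV, E_tab_keV, sigma_tab_barn) * barn
            f = sigma * E * np.exp(-E / kT)
            integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))         # dense trapezoidal quadrature
            return np.sqrt(8.0 / (np.pi * mu)) * kT ** -1.5 * integral     # m^3/s

        # Placeholder pointwise table (illustrative numbers only, not evaluated data).
        E_tab = np.array([5.0, 20.0, 64.0, 100.0, 300.0])       # centre-of-mass energy, keV
        sigma_tab = np.array([1e-4, 0.3, 5.0, 3.5, 1.0])        # cross section, barn
        print(reactivity(10.0, E_tab, sigma_tab, mu_amu=1.2))   # reactivity in m^3/s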

  15. A new coding concept for fast ultrasound imaging using pulse trains

    DEFF Research Database (Denmark)

    Misaridis, T.; Jensen, Jørgen Arendt

    2002-01-01

    Frame rate in ultrasound imaging can be increased by simultaneous transmission of multiple beams using coded waveforms. However, the achievable degree of orthogonality among coded waveforms is limited in ultrasound, and the image quality degrades unacceptably due to interbeam interference... In this paper, an alternative combined time-space coding approach is undertaken. In the new method all transducer elements are excited with short pulses and the high time-bandwidth (TB) product waveforms are generated acoustically. Each element transmits a short-pulse spherical wave with a constant transmit delay from element to element, long enough to assure no pulse overlapping for all depths in the image. Frequency shift keying is used for "per element" coding. The received signals from a point scatterer are staggered pulse trains which are beamformed for all beam directions and further processed...
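
    A minimal sketch of the transmit scheme described above: each element fires a short pulse at its own carrier frequency (frequency shift keying per element) with a constant element-to-element delay, producing a staggered pulse train. The pulse shape, carrier frequencies, sampling rate and delay below are illustrative assumptions, not the authors' parameters.

        import numpy as np

        fs = 100e6                       # sampling rate (assumed)
        n_elements = 8
        delay = 3e-6                     # constant transmit delay between elements (assumed)
        pulse_cycles = 2                 # short pulse: two carrier cycles

        def element_pulse(f0, t):
            """Short Gaussian-windowed sinusoid used as the per-element excitation."""
            dur = pulse_cycles / f0
            env = np.exp(-0.5 * ((t - dur / 2) / (dur / 4)) ** 2) * (t >= 0) * (t <= dur)
            return env * np.sin(2 * np.pi * f0 * t)

        t = np.arange(0, n_elements * delay + 5e-6, 1 / fs)
        excitations = np.zeros((n_elements, t.size))
        for e in range(n_elements):
            f0 = 3e6 + e * 0.5e6         # frequency shift keying: one carrier per element
            excitations[e] = element_pulse(f0, t - e * delay)   # staggered firing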

  16. Analysis of genetic code ambiguity arising from nematode-specific misacylated tRNAs.

    Directory of Open Access Journals (Sweden)

    Kiyofumi Hamashima

    Full Text Available The faithful translation of the genetic code requires the highly accurate aminoacylation of transfer RNAs (tRNAs). However, it has been shown that nematode-specific V-arm-containing tRNAs (nev-tRNAs) are misacylated with leucine in vitro in a manner that transgresses the genetic code. nev-tRNA-Gly (CCC) and nev-tRNA-Ile (UAU), which are the major nev-tRNA isotypes, could theoretically decode the glycine (GGG) codon and isoleucine (AUA) codon as leucine, causing GGG and AUA codon ambiguity in nematode cells. To test this hypothesis, we investigated the functionality of nev-tRNAs and their impact on the proteome of Caenorhabditis elegans. Analysis of the nucleotide sequences in the 3' end regions of the nev-tRNAs showed that they had matured correctly, with the addition of CCA, which is a crucial posttranscriptional modification required for tRNA aminoacylation. The nuclear export of nev-tRNAs was confirmed with an analysis of their subcellular localization. These results show that nev-tRNAs are processed to their mature forms like common tRNAs and are available for translation. However, a whole-cell proteome analysis found no detectable level of nev-tRNA-induced mistranslation in C. elegans cells, suggesting that the genetic code is not ambiguous, at least under normal growth conditions. Our findings indicate that the translational fidelity of the nematode genetic code is strictly maintained, contrary to our expectations, although deviant tRNAs with misacylation properties are highly conserved in the nematode genome.

  17. The computer code SEURBNUK/EURDYN (release 1). Input and output specifications

    International Nuclear Information System (INIS)

    Smith, B.L.; Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK-2 is a two-dimensional, axisymmetric, Eulerian, finite difference containment code developed initially by AWRE Aldermaston, AEE Winfrith and JRC-Ispra, and more recently by AEEW, JRC and EIR Wuerenlingen. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method, which is itself an extension of the MAC algorithm. SEURBNUK has a finite difference thin shell treatment for vessels and internal structures of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. SEURBNUK/EURDYN is an extension of SEURBNUK-2 in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. This has been achieved by coupling the finite element code EURDYN with SEURBNUK-2, allowing the use of conical shell elements and axisymmetric triangular elements. Within the code, the equations of motion for the structures are solved quite separately from those for the fluid, and the timestep for the fluid can be an integer multiple of that for the structures. The interaction of the structures with the fluid is then considered as a modification to the coefficients in the pressure equations, the modifications naturally depending on the behaviour of the structures within the fluid cell. The code is limited to dealing with a single fluid, the coolant, and the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations. After explaining the output facilities, information is included to help users avoid some common pitfalls. (author)

  18. Correlated sampling added to the specific purpose Monte Carlo code McPNL for neutron lifetime log responses

    International Nuclear Information System (INIS)

    Mickael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    The specific purpose neutron lifetime oil well logging simulation code, McPNL, has been rewritten for greater user-friendliness and faster execution. Correlated sampling has been added to the code to enable studies of relative changes in the tool response caused by environmental changes. The absolute responses calculated by the code have been benchmarked against laboratory test pit data. The relative responses from correlated sampling are not directly benchmarked, but they are validated using experimental and theoretical results
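
    The benefit of correlated sampling can be shown with a toy example (none of the McPNL physics is reproduced here): the reference and perturbed cases reuse the same random-number stream, so the estimated relative change in the response carries far less statistical noise than it would with independent runs.

        import numpy as np

        # Correlated sampling sketch: estimate the relative change in a toy "tool
        # response" when an absorption coefficient is perturbed, reusing the same
        # random-number stream for the reference and perturbed cases.
        def response(mu, rng):
            path = rng.exponential(scale=1.0, size=200_000)   # toy particle path lengths
            return np.mean(np.exp(-mu * path))                # surviving fraction

        seed = 12345
        ref = response(0.50, np.random.default_rng(seed))         # reference environment
        pert = response(0.55, np.random.default_rng(seed))        # perturbed, same stream
        uncorr = response(0.55, np.random.default_rng(seed + 1))  # perturbed, independent stream

        print("correlated relative change:  ", (pert - ref) / ref)
        print("uncorrelated relative change:", (uncorr - ref) / ref)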

  19. The Twofold Multidimensionality of Academic Self-Concept: Domain Specificity and Separation between Competence and Affect Components

    Science.gov (United States)

    Arens, A. Katrin; Yeung, Alexander Seeshing; Craven, Rhonda G.; Hasselhorn, Marcus

    2011-01-01

    Academic self-concept is consistently proven to be multidimensional rather than unidimensional as it is domain specific in nature. However, each specific self-concept domain may be further separated into competence and affect components. This study examines the twofold multidimensionality of academic self-concept (i.e., its domain specificity and…

  20. The PP1 binding code: a molecular-lego strategy that governs specificity.

    Science.gov (United States)

    Heroes, Ewald; Lesage, Bart; Görnemann, Janina; Beullens, Monique; Van Meervelt, Luc; Bollen, Mathieu

    2013-01-01

    Ser/Thr protein phosphatase 1 (PP1) is a single-domain hub protein with nearly 200 validated interactors in vertebrates. PP1-interacting proteins (PIPs) are ubiquitously expressed but show an exceptional diversity in brain, testis and white blood cells. The binding of PIPs is mainly mediated by short motifs that dock to surface grooves of PP1. Although PIPs often contain variants of the same PP1 binding motifs, they differ in the number and combination of docking sites. This molecular-lego strategy for binding to PP1 creates holoenzymes with unique properties. The PP1 binding code can be described as specific, universal, degenerate, nonexclusive and dynamic. PIPs control associated PP1 by interference with substrate recruitment or access to the active site. In addition, some PIPs have a subcellular targeting domain that promotes dephosphorylation by increasing the local concentration of PP1. The diversity of the PP1 interactome and the properties of the PP1 binding code account for the exquisite specificity of PP1 in vivo. © 2012 The Authors Journal compilation © 2012 FEBS.

  1. VERBAL REPRESENTATION OF THE CONCEPT OF POLITENESS AND ITS SPECIFICS IN RUSSIAN AND MODERN HEBREW

    Directory of Open Access Journals (Sweden)

    Pishchalnikova Vera Anatolyevna

    2015-03-01

    Full Text Available The authors compare the content of the universal ethical notion of politeness in Russian and Modern Hebrew on the basis of structural research into the words' lexical meanings. Politeness is investigated as a basic ethical value that is included in the nucleus of ethnic culture and has a certain influence on the content of the ethnic worldview. The verbal representation of politeness as an ethical category differs significantly between the two languages, revealing different structures and a specific correlation of semantic components. The authors emphasize that the national and cultural specifics of communicative behavior are defined by the different content of the concept of politeness in the communicative consciousness of representatives of different linguocultures. In the process of communication the members of an ethnos act according to the different operational understandings of politeness assimilated during the process of social adaptation. The authors underline that the notions (concepts) described by the same word often differ in psychological meaning within the conceptual spheres of different nations. These divergent parts of ethnos-specific concepts contain significant information and the national specifics of communicative behavior, and that is why they are pragmatically relevant. The study of the specifics of the meaning of the politeness concept is therefore a theoretical and pragmatic topic of current importance. The authors describe the first stage of modelling the politeness concept on the basis of lexicographic data. By comparing the components of meaning of the concepts denoted by comparable words in the Russian and Jewish linguocultures, the authors demonstrate the stability of the concept's structure as a specific mental unit. The structural and semantic models of the Hebrew politeness term and of вежливость contain discrete components of meaning that are opposed gradually. The authors believe that this could be an indication of certain universal types of relations between the components of a word's meaning.

  2. Erasmus MC at CLEF eHealth 2016: Concept recognition and coding in French texts

    NARCIS (Netherlands)

    E.M. Van Mulligen (Erik M.); Z. Afzal (Zubair); S.A. Akhondi (Saber); D. Vo (Dang); J.A. Kors (Jan)

    2016-01-01

    We participated in task 2 of the CLEF eHealth 2016 challenge. Two subtasks were addressed: entity recognition and normalization in a corpus of French drug labels and Medline titles, and ICD-10 coding of French death certificates. For both subtasks we used a dictionary-based approach.
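
    As a rough illustration of a dictionary-based approach of this kind (not the Erasmus MC system), terms found in a certificate line can be normalized and looked up in a term-to-code dictionary. The dictionary entries, normalization steps and codes below are illustrative examples only.

        import unicodedata

        # Toy dictionary-based ICD-10 coding sketch; entries and codes are
        # illustrative examples, not the challenge resources.
        ICD10_DICT = {
            "infarctus du myocarde": "I21",
            "pneumonie": "J18",
            "insuffisance cardiaque": "I50",
        }

        def normalize(text):
            """Lowercase and strip accents so dictionary lookup is robust."""
            text = unicodedata.normalize("NFD", text.lower())
            return "".join(c for c in text if unicodedata.category(c) != "Mn")

        NORMALIZED = {normalize(term): code for term, code in ICD10_DICT.items()}

        def code_certificate_line(line):
            """Return all dictionary terms (and their codes) found in one line."""
            norm = normalize(line)
            return [(term, code) for term, code in NORMALIZED.items() if term in norm]

        print(code_certificate_line("Décès par infarctus du myocarde aigu"))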

  3. Reliability in the performance-based concept of fib Model Code 2010

    NARCIS (Netherlands)

    Bigaj-van Vliet, A.; Vrouwenvelder, T.

    2013-01-01

    The design philosophy of the new fib Model Code for Concrete Structures 2010 represents the state of the art with regard to performance-based approach to the design and assessment of concrete structures. Given the random nature of quantities determining structural behaviour, the assessment of

  4. Concept - or no concept

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1999-01-01

    A discussion of the notion of concept in industrial companies. A method for mapping the managerial concept in a specific area is shown.

  5. Inheritance-mode specific pathogenicity prioritization (ISPP) for human protein coding genes.

    Science.gov (United States)

    Hsu, Jacob Shujui; Kwan, Johnny S H; Pan, Zhicheng; Garcia-Barcelo, Maria-Mercè; Sham, Pak Chung; Li, Miaoxin

    2016-10-15

    Exome sequencing studies have facilitated the detection of causal genetic variants in as-yet-unsolved Mendelian diseases. However, the identification of disease-causal genes among a list of candidates in an exome sequencing study is still not fully settled, and it is often difficult to prioritize candidate genes for follow-up studies. The inheritance mode provides crucial information for understanding Mendelian diseases, but none of the existing gene prioritization tools fully utilize this information. We examined the characteristics of Mendelian disease genes under different inheritance modes. The results suggest that Mendelian disease genes with an autosomal dominant (AD) inheritance mode are more sensitive to haploinsufficiency and de novo mutations, whereas autosomal recessive (AR) genes have significantly more non-synonymous variants and regulatory transcript isoforms. In addition, X-linked (XL) Mendelian disease genes have fewer non-synonymous and synonymous variants. As a result, we derived a new scoring system for prioritizing candidate genes for Mendelian diseases according to the inheritance mode. Our scoring system assigns each annotated protein-coding gene (N = 18,859) three pathogenicity scores according to the inheritance mode (AD, AR and XL). This inheritance-mode-specific framework achieved higher accuracy (area under the curve = 0.84) in XL mode. The inheritance-mode specific pathogenicity prioritization (ISPP) outperformed other well-known methods, including the Haploinsufficiency, Recessive, Network centrality, Genic Intolerance, Gene Damage Index and Gene Constraint scores. This systematic study suggests that genes manifesting disease under particular inheritance modes tend to have unique characteristics. ISPP is included in KGGSeq v1.0 (http://grass.cgs.hku.hk/limx/kggseq/), and source code is available from https://github.com/jacobhsu35/ISPP.git. Contact: mxli@hku.hk. Supplementary information: Supplementary data are available at Bioinformatics online.

  6. Conceptions of the Nature of Science--Are They General or Context Specific?

    Science.gov (United States)

    Urhahne, Detlef; Kremer, Kerstin; Mayer, Juergen

    2011-01-01

    The study investigates the relationship between general and context-specific conceptions of the nature of science (NOS). The categorization scheme by Osborne et al. (J Res Sci Teach 40:692-720, "2003") served as the theoretical framework of the study. In the category "nature of scientific knowledge", the certainty, development, simplicity,…

  7. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)]; Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)]; Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States)]; Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)]

    2013-10-15

    /validation of MS/MP capabilities because these advanced tools have not yet reached sufficient maturity to support such an investigation. In an earlier paper (Unal et al., 2011), we proposed a methodology that potentially can be used to address these new challenges in the design and licensing of evolving nuclear technology. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept was introduced and is accomplished through data assimilation. Since advanced MS/MP codes have not yet reached the level of maturity required for a comprehensive validation and calibration exercise, we considered two legacy fuel codes and apply parts of our methodology to these codes to demonstrate the benefits of the new calibration capabilities we recently developed as a part of the proposed framework. This effort does not directly support “born-assessed” validation for advanced MS/MP codes, but is useful to gain insight on legacy modeling deficiencies and to guide and develop recommendations on high and low priority directions for development of advanced codes and advanced experiments, so as to maximize the benefits of advanced validation and uncertainty quantification (VU) efforts involving the next generation of MS/MP code capabilities. This paper discusses the application of advanced validation techniques (sensitivity, calibration, and prediction) to nuclear fuel performance codes FRAPCON (Geelhood et al., 2011a,b) and LIFE-4 (Boltax et al., 1990). FRAPCON is used to predict oxide fuel behavior in light water reactors. LIFE-4 was developed in the 1980s to predict oxide fuel behavior in fast reactors. We introduce a sensitivity ranking methodology to narrow down the selected parameters for follow-up sensitivity and calibration analyses. We use screening methods with both codes and discuss the results. The number of selected modeling parameters was 61 for FRAPCON and 69 for LIFE-4. The screening
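
    The screening step described in this record, ranking a large set of modeling parameters before calibration, can be sketched generically with a one-at-a-time elementary-effects measure. The toy response function, parameter count and ranges below are assumptions for illustration, not FRAPCON or LIFE-4 models.

        import numpy as np

        # Generic one-at-a-time screening sketch: rank parameters by the mean
        # absolute elementary effect of a small perturbation.  The response is a
        # toy stand-in for a fuel performance code run.
        def toy_response(p):
            # hypothetical surrogate response; not a real fuel model
            return p[0] ** 2 + 0.1 * p[1] + 5.0 * np.sin(p[2]) + 0.01 * p[3] * p[1]

        rng = np.random.default_rng(1)
        n_params, n_repeats, step = 4, 50, 0.05
        effects = np.zeros(n_params)

        for _ in range(n_repeats):
            base = rng.uniform(0.0, 1.0, n_params)        # nominal parameter sample
            y0 = toy_response(base)
            for i in range(n_params):
                pert = base.copy()
                pert[i] += step
                effects[i] += abs(toy_response(pert) - y0) / step
        effects /= n_repeats

        ranking = np.argsort(effects)[::-1]
        print("parameters, most to least influential:", ranking, effects[ranking])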

  8. A GFR benchmark comparison of transient analysis codes based on the ETDR concept

    International Nuclear Information System (INIS)

    Bubelis, E.; Coddington, P.; Castelliti, D.; Dor, I.; Fouillet, C.; Geus, E. de; Marshall, T.D.; Van Rooijen, W.; Schikorr, M.; Stainsby, R.

    2007-01-01

    A GFR (Gas-cooled Fast Reactor) transient benchmark study was performed to investigate the ability of different code systems to calculate the transition of core heat removal from forced flow in the main circuit to natural circulation cooling using the Decay Heat Removal (DHR) system. The benchmark is based on a main blower failure in the Experimental Technology Demonstration Reactor (ETDR) with reactor scram. The codes taking part in the benchmark are RELAP5, TRAC/AAA, CATHARE, SIM-ADS, MANTA and SPECTRA. For comparison purposes the benchmark was divided into several stages: the initial steady-state solution, the main blower flow run-down, the opening of the DHR loop and the transition to natural circulation, and finally the 'quasi' steady heat removal from the core by the DHR system. The results submitted by the participants showed that all the codes gave consistent results for all four stages of the benchmark. In the steady state the calculations revealed some differences in the clad and fuel temperatures, the core and main loop pressure drops, and the total helium mass inventory. Some disagreements were also observed in the helium and water flow rates in the DHR loop during the final natural circulation stage. Good agreement was observed for the total main blower flow rate and the helium temperature rise in the core, as well as for the helium inlet temperature into the core. In order to understand the reason for the differences in the initial 'blind' calculations, a second round of calculations was performed using a more precise set of boundary conditions.

  9. Geometrical modification transfer between specific meshes of each coupled physical codes. Application to the Jules Horowitz research reactor experimental devices

    International Nuclear Information System (INIS)

    Duplex, B.

    2011-01-01

    The CEA develops and uses scientific software, called physical codes, in various physical disciplines to optimize installation and experimentation costs. During a study, several physical phenomena interact, so a code coupling and data exchanges between the different physical codes are required. Each physical code computes on a particular geometry, usually represented by a mesh composed of thousands to millions of elements. This PhD thesis focuses on the transfer of geometrical modifications between the specific meshes of each coupled physical code. First, it presents a physical code coupling method in which deformations are computed by one of the codes. Next, it discusses the establishment of a model, common to the different physical codes, grouping all the shared data. Finally, it covers the transfer of deformations between meshes of the same geometry or of adjacent geometries. Geometrical modifications are discrete data because they are based on a mesh. To permit every code to access the deformations and to transfer them, a continuous representation is computed. Two functions are developed, one with a global support and the other with a local support. Both functions combine a simplification method and a radial basis function network. A complete use case is dedicated to the Jules Horowitz reactor: the effect of differential dilatations on experimental device cooling is studied. (author)
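
    A minimal sketch of the kind of continuous representation described here: a Gaussian radial basis function (RBF) network is fitted to discrete nodal displacements of one mesh and then evaluated on another code's mesh. The kernel choice, shape parameter, toy data and the omission of the thesis's simplification step are assumptions of the example.

        import numpy as np

        # RBF network sketch (not the thesis implementation) for transferring a
        # discrete deformation field from one mesh to another.
        def rbf(r, eps=3.0):
            return np.exp(-(eps * r) ** 2)            # Gaussian kernel (one possible choice)

        def pairwise(a, b):
            return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)

        def fit_rbf(nodes, displacements, eps=3.0):
            phi = rbf(pairwise(nodes, nodes), eps) + 1e-9 * np.eye(len(nodes))  # tiny ridge for conditioning
            return np.linalg.solve(phi, displacements)   # one weight column per displacement component

        def evaluate_rbf(weights, source_nodes, target_nodes, eps=3.0):
            return rbf(pairwise(target_nodes, source_nodes), eps) @ weights

        rng = np.random.default_rng(0)
        src = rng.random((50, 3))                  # nodes of the code that computed the deformation
        disp = 0.01 * np.sin(src)                  # toy stand-in for computed displacements
        tgt = rng.random((200, 3))                 # nodes of the receiving code's mesh
        tgt_disp = evaluate_rbf(fit_rbf(src, disp), src, tgt)
        print(tgt_disp.shape)                      # (200, 3)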

  10. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    International Nuclear Information System (INIS)

    Fletcher, C.D.

    1986-01-01

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes. 6 refs., 17 figs., 1 tab

  11. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
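
    To give a flavour of the source-coding material covered, the sketch below builds a Huffman code for a toy source and compares its average codeword length with the source entropy, the lower bound that well-designed codes approach. It is an illustration, not an excerpt from the book.

        import heapq
        from collections import Counter
        from math import log2

        # Huffman code-length construction for a toy source, compared with entropy.
        def huffman_lengths(freqs):
            # heap items: (weight, tie_breaker, {symbol: code_length_so_far})
            heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
            heapq.heapify(heap)
            counter = len(heap)
            while len(heap) > 1:
                w1, _, a = heapq.heappop(heap)
                w2, _, b = heapq.heappop(heap)
                merged = {s: l + 1 for s, l in {**a, **b}.items()}   # deepen both subtrees
                heapq.heappush(heap, (w1 + w2, counter, merged))
                counter += 1
            return heap[0][2]

        text = "this is an example of a huffman tree"
        freqs = Counter(text)
        total = sum(freqs.values())
        lengths = huffman_lengths(freqs)
        entropy = -sum(w / total * log2(w / total) for w in freqs.values())
        avg_len = sum(freqs[s] * lengths[s] for s in freqs) / total
        print(f"entropy = {entropy:.3f} bits/symbol, Huffman average = {avg_len:.3f}")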

  12. Evidence for gene-specific rather than transcription rate-dependent histone H3 exchange in yeast coding regions.

    Science.gov (United States)

    Gat-Viks, Irit; Vingron, Martin

    2009-02-01

    In eukaryotic organisms, histones are dynamically exchanged independently of DNA replication. Recent reports show that different coding regions differ in their amount of replication-independent histone H3 exchange. The current paradigm is that this variability in histone exchange among coding regions is a consequence of transcription rate. Here we put forward the idea that this variability might also be modulated in a gene-specific manner independently of transcription rate. To that end, we study transcription rate-independent, replication-independent coding region histone H3 exchange. We term such events relative exchange. Our genome-wide analysis shows conclusively that in yeast, relative exchange is a novel consistent feature of coding regions. Outside of replication, each coding region has a characteristic pattern of histone H3 exchange that is either higher or lower than what would be expected from its RNAPII transcription rate alone. Histone H3 exchange in coding regions might be a way to add or remove certain histone modifications that are important for transcription elongation. Therefore, our results that gene-specific coding region histone H3 exchange is decoupled from transcription rate might hint at a new epigenetic mechanism of transcription regulation.

  13. Conception and development of an adaptive energy mesher for multigroup library generation of the transport codes

    International Nuclear Information System (INIS)

    Mosca, P.

    2009-12-01

    Deterministic transport codes solve the stationary Boltzmann equation in a discretized energy formalism called multigroup. The transformation of continuous data into multigroup form is obtained by averaging the highly variable cross sections of the resonant isotopes with the solution of the self-shielding models, and the remaining ones with the coarse energy spectrum of the reactor type. So far the error of such an approach could only be evaluated retrospectively. To remedy this, in this thesis we studied a set of methods to control a priori the accuracy and the cost of the multigroup transport computation. The energy mesh optimisation is achieved using a two-step process: the creation of a reference mesh and its optimized condensation. In the first stage, by refining the energy mesh locally and globally, we seek, on a fine energy mesh with subgroup self-shielding, a solution equivalent to that of a reference solver (Monte Carlo or pointwise deterministic solver). In the second step, once the number of groups is fixed according to the acceptable computational cost and the self-shielding models most appropriate to the reactor type are chosen, we search for the bounds of the reference mesh that minimize reaction-rate errors, using the particle swarm optimization algorithm. This new approach allows us to define new meshes for fast reactors that are as accurate as the currently used ones, but with fewer groups. (author)
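
    The second step can be illustrated with a generic particle swarm optimization loop. The objective below is only a stand-in for the reaction-rate error functional of the thesis (a real driver would run the transport calculation for each candidate set of group boundaries), and the swarm coefficients are common textbook values.

        import numpy as np

        # Generic PSO loop for choosing group boundaries; the objective is a toy
        # stand-in for the reaction-rate error minimised in the thesis.
        def objective(bounds):
            widths = np.diff(np.sort(bounds))
            return np.var(widths) + 1e-3 / (np.min(widths) + 1e-9)

        rng = np.random.default_rng(0)
        n_particles, n_bounds, n_iter = 30, 8, 200
        x = rng.uniform(0.1, 20.0, size=(n_particles, n_bounds))   # candidate bounds (lethargy)
        v = np.zeros_like(x)
        p_best, p_val = x.copy(), np.array([objective(p) for p in x])
        g_best = p_best[np.argmin(p_val)]

        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)   # inertia + cognitive + social terms
            x = np.clip(x + v, 0.1, 20.0)
            vals = np.array([objective(p) for p in x])
            better = vals < p_val
            p_best[better], p_val[better] = x[better], vals[better]
            g_best = p_best[np.argmin(p_val)]

        print("best group boundaries (lethargy):", np.sort(g_best))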

  14. Cerebellum-specific and age-dependent expression of an endogenous retrovirus with intact coding potential

    Directory of Open Access Journals (Sweden)

    Itoh Takayuki

    2011-10-01

    Full Text Available Abstract Background Endogenous retroviruses (ERVs), including murine leukemia virus (MuLV) type ERVs (MuLV-ERVs), are presumed to occupy ~10% of the mouse genome. In this study, following the identification of a full-length MuLV-ERV by an in silico survey of the C57BL/6J mouse genome, its distribution in different mouse strains and its expression characteristics were investigated. Results Application of a set of ERV mining protocols identified a MuLV-ERV locus with full coding potential on chromosome 8 (named ERVmch8). It appears that ERVmch8 shares the same genomic locus with a replication-incompetent MuLV-ERV called Emv2; however, this could not be confirmed due to a lack of relevant annotation and Emv2 sequence information. The ERVmch8 sequence was more prevalent in laboratory strains than in wild-derived strains. Among 16 different tissues of ~12 week-old female C57BL/6J mice, brain homogenate was the only tissue with evident expression of ERVmch8. Further ERVmch8 expression analysis in six different brain compartments and four peripheral neuronal tissues of C57BL/6J mice revealed no significant expression except in the cerebellum, in which the low methylation status of the ERVmch8 locus was unique compared to the other brain compartments. The ERVmch8 locus was found to be surrounded by genes associated with neuronal development and/or inflammation. Interestingly, cerebellum-specific ERVmch8 expression was age-dependent, with almost no expression at 2 weeks and a plateau at 6 weeks. Conclusions The ecotropic ERVmch8 locus on the C57BL/6J mouse genome was relatively undermethylated in the cerebellum, and its expression was cerebellum-specific and age-dependent.

  15. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  16. Lightweight Detection of Android-specific Code Smells : The aDoctor Project

    NARCIS (Netherlands)

    Palomba, F.; Di Nucci, D.; Panichella, A.; Zaidman, A.E.; De Lucia, Andrea; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Code smells are symptoms of poor design solutions applied by programmers during the development of software systems. While the research community devoted a lot of effort to studying and devising approaches for detecting the traditional code smells defined by Fowler, little knowledge and support

  17. Technical Specifications of Structural Health Monitoring for Highway Bridges: New Chinese Structural Health Monitoring Code

    Directory of Open Access Journals (Sweden)

    Fernando Moreu

    2018-03-01

    Full Text Available Governments and professional groups related to civil engineering write and publish standards and codes to protect the safety of critical infrastructure. In recent decades, countries have developed codes and standards for structural health monitoring (SHM). During this same period, rapid growth in the Chinese economy has led to massive development of civil engineering infrastructure design and construction projects. In 2016, the Ministry of Transportation of the People's Republic of China published a new design code for SHM systems for large highway bridges. This document is the first technical SHM code by a national government that mandates sensor installation on highway bridges. This paper summarizes the existing international technical SHM codes of various countries and compares them with the new SHM code required by the Chinese Ministry of Transportation. The paper outlines the contents of the new Chinese SHM code and explains its relevance for the safety and management of large bridges in China, introducing key definitions of the Chinese–United States SHM vocabulary and their technical significance. Finally, the paper discusses the implications for the design and implementation of future SHM codes, with suggestions for similar efforts in the United States and other countries.

  18. The angiosome concept in clinical practice: implications for patient-specific recanalization procedures.

    Science.gov (United States)

    Brodmann, M

    2013-10-01

    Below-the-knee (BTK) disease with the clinical presentation of critical limb ischemia is associated with a high rate of limb loss due to minor and major amputations. The main problem is to find a way to optimize blood flow to the critical limb area. Below the knee joint, the downstream arterial tree diverges into three arms, which supply different areas. Different concepts exist for how optimal blood flow to the critical ischemic BTK areas can be achieved: treating as many vessels as can be reopened by an endovascular procedure, targeting the two main BTK vessels, or, in exceptional situations, also treating the inflow of collaterals to achieve as much blood flow down to the foot as possible. The angiosome concept was developed in plastic surgery for the purpose of flap healing. An angiosome is an anatomic unit of tissue (consisting of skin, subcutaneous tissue, fascia, muscle and bone) fed by a source artery and drained by specific veins. From that point of view it can be presumed that revascularization of the source artery of the angiosome might result in better wound healing and limb salvage rates. The angiosome treatment concept for BTK disease is analogous to the concept in cardiology, where areas of reversible ischemia are identified and the vessels leading to those areas are treated in a targeted way.

  19. Primate-specific spliced PMCHL RNAs are non-protein coding in human and macaque tissues

    Directory of Open Access Journals (Sweden)

    Delerue-Audegond Audrey

    2008-12-01

    Full Text Available Abstract Background Brain-expressed genes that were created in the primate lineage represent obvious candidates for investigating the molecular mechanisms that contributed to neural reorganization and the emergence of new behavioural functions in Homo sapiens. PMCHL1 arose from retroposition of a pro-melanin-concentrating hormone (PMCH) antisense mRNA on the ancestral human chromosome 5p14 when platyrrhines and catarrhines diverged. Mutations before the divergence of Hylobatidae led to the creation of new exons, and finally PMCHL1 duplicated in an ancestor of hominids to generate PMCHL2 at human chromosome 5q13. A complex pattern of spliced and unspliced PMCHL RNAs was found in human brain and testis. Results Several novel spliced PMCHL transcripts have been characterized in human testis and fetal brain, identifying an additional exon and novel splice sites. Sequencing of the PMCHL genes in several non-human primates allowed phylogenetic analyses to be carried out, revealing that the initial retroposition event took place within an intron of the brain cadherin (CDH12) gene, soon after the platyrrhine/catarrhine divergence, i.e. 30–35 Mya, and was concomitant with the insertion of an AluSg element. Sequence analysis of the spliced PMCHL transcripts identified only short ORFs of less than 300 bp, with low (VMCH-p8 and protein variants) or no evolutionary conservation. Western blot analyses of human and macaque tissues expressing PMCHL RNA failed to reveal any protein corresponding to VMCH-p8 and the protein variants encoded by the spliced transcripts. Conclusion Our present results improve our knowledge of the gene structure and the evolutionary history of the primate-specific chimeric PMCHL genes. These genes produce multiple spliced transcripts bearing short, non-conserved and apparently non-translated ORFs that may function as mRNA-like non-coding RNAs.

  20. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
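
    The linking pattern described above (write an input file from a list of inputs, run the external application, read its outputs back) is sketched below in Python rather than as the actual C/C++ DLL; the executable name, file names and formats are placeholders, not the GoldSim interface itself.

        import subprocess
        from pathlib import Path

        # Sketch of the wrapper pattern: write inputs, run an external code,
        # parse its output file.  "external_app" and the file formats are
        # hypothetical placeholders.
        def run_external(inputs, workdir="run", exe="external_app", outfile="results.out"):
            wd = Path(workdir)
            wd.mkdir(exist_ok=True)
            # 1. create the input file from the caller-supplied values
            (wd / "case.inp").write_text("\n".join(f"{k} = {v}" for k, v in inputs.items()))
            # 2. run the external code
            subprocess.run([exe, "case.inp"], cwd=wd, check=True)
            # 3. parse the output file written by the external application
            return [float(line.split()[-1]) for line in (wd / outfile).read_text().splitlines()]

        # example call (requires the hypothetical external_app to exist):
        # outputs = run_external({"flow_rate": 2.5, "temperature": 340.0})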

  1. Methodology and resources for the evaluation of the construction of knowledge about the concept of density and specific mass

    OpenAIRE

    Tânia Inácio de Oliveira; Nádia Vilela Pereira; Cláudio Boghi; Juliano Schimiguel; Dorlivete Moreira Shitsuka

    2017-01-01

    Abstract: The teaching of physics concepts involves the construction of knowledge in the students' minds. The aim of this article is to present a case report of teaching the concepts of density and specific mass in high school technical education classes. The study analyzes the results of the construction of a methodology and the development of a product that enables physics teachers to guide their students in constructing the concept of the density of objects and the specific mass of substances and per...

  2. The preliminary thermal-hydraulic design of one superheated steam water cooled blanket concept based on RELAP5 and MELCOR codes - 15147

    International Nuclear Information System (INIS)

    Guo, Y.; Wang, G.; Cheng, Y.; Peng, C.

    2015-01-01

    The Water Cooled Blanket (WCB) is very important in the concept design and energy transfer of a future fusion power plant. One concept design of a WCB is under computational testing. The RELAP5 and MELCOR codes, which are mature and often used in nuclear engineering, were selected as the simulation tools. The complex inner flow channels and heat sources are simplified according to their thermal-hydraulic characteristics. Nodal models for RELAP5 and MELCOR are then built to approximate the concept design. The superheated steam scheme is analyzed by the two codes separately at different power levels. After some adjustment of the inlet flow resistance coefficients of certain flow channels, reasonable steady conditions can be obtained. The steady fluid and wall temperature distributions and pressure drops are studied. The results of the two codes are compared and some recommendations are given. (authors)

  3. Specifications for a two-dimensional multigroup diffusion code: ALCI

    Energy Technology Data Exchange (ETDEWEB)

    Bayard, J P; Guillou, A; Lago, B; Bureau du Colombier, M J; Guillou, G; Vasseur, Ch [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires]

    1965-02-01

    This report describes the specifications of the ALCI programme. The programme solves the system of difference equations approximating the homogeneous multigroup neutron diffusion problem, in two space dimensions, in the three geometries XY, RZ and R{theta}. It allows geometric and composition criticality calculations and, on demand, computes the adjoint problem. The maximum number of points dealt with is 6000. The maximum permissible number of groups is 12. The inner iterations are treated by the method of alternating directions. The outer iterations are accelerated using the Chebyshev extrapolation method. (authors)
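
    The inner-iteration scheme mentioned above (alternating directions) can be sketched for a one-group, fixed-source analogue of the problem. ALCI itself is multigroup with Chebyshev-accelerated outer iterations; the grid size, cross sections, boundary treatment and single acceleration parameter below are assumptions of the example.

        import numpy as np
        from scipy.linalg import solve_banded

        # One-group, fixed-source diffusion analogue on an XY grid, solved with
        # alternating-direction (Peaceman-Rachford) inner iterations.  Assumed
        # data: 40x40 grid, D = 1.2, Sigma_a = 0.05, unit source, phi = 0 outside.
        n, h, D, Sa = 40, 1.0, 1.2, 0.05
        r = D / h ** 2
        S = np.ones((n, n))
        phi = np.zeros((n, n))

        def apply_1d(f):
            """Apply the 1D operator (Sa/2 + 2r) f_i - r (f_{i-1} + f_{i+1}) along axis 0."""
            out = (Sa / 2 + 2 * r) * f
            out[1:] -= r * f[:-1]
            out[:-1] -= r * f[1:]
            return out

        def solve_1d(rhs, rho):
            """Solve the shifted tridiagonal system (A_1d + rho I) x = rhs along axis 0."""
            ab = np.zeros((3, n))
            ab[0, 1:] = -r
            ab[1, :] = Sa / 2 + 2 * r + rho
            ab[2, :-1] = -r
            return solve_banded((1, 1), ab, rhs)

        rho = 2 * r * np.sin(np.pi / (n + 1))     # single acceleration parameter (a choice)
        for sweep in range(300):
            # x-direction implicit, y-direction taken from the previous iterate
            half = solve_1d(S - apply_1d(phi.T).T + rho * phi, rho)
            # y-direction implicit, x-direction taken from the half-step iterate
            phi = solve_1d((S - apply_1d(half) + rho * half).T, rho).T

        print("peak flux:", phi.max())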

  4. Specificity Protein (Sp) Transcription Factors and Metformin Regulate Expression of the Long Non-coding RNA HULC

    Science.gov (United States)

    There is evidence that specificity protein 1 (Sp1) transcription factor (TF) regulates expression of long non-coding RNAs (lncRNAs) in hepatocellular carcinoma (HCC) cells. RNA interference (RNAi) studies showed that among several lncRNAs expressed in HepG2, SNU-449 and SK-Hep-1...

  5. THE SPECIFIC CHARACTERISTICS OF PROMOTIONAL JOURNALISM – A COMPARATIVE ANALYSIS WITH OTHER RELATED CONCEPTS

    Directory of Open Access Journals (Sweden)

    CRINA ANIŞOARA TRIFAN (LICA)

    2013-05-01

    Full Text Available Purpose statement – This paper's purpose is to contribute to the development of specific know-how through the establishment of a theoretical framework of reference. This can facilitate the research steps that follow: the identification, analysis and interpretation of a new marketing instrument – promotional journalism. The objectives of this study relate not only to the theoretical but also to the practical characteristics of promotional journalism. These are based on a comparative analysis with other related concepts and on a qualitative analysis of the content of the specific materials. Design – The research problem imposes an interdisciplinary methodological approach; this enables the identification, systematization, analysis and theoretical interpretation of the fundamental concepts, theories and ideas from the specialized literature to be oriented towards studies and articles from separate fields. In line with the research objectives and this interdisciplinary study of the specialized literature, a qualitative analysis was added of the content of materials specific to promotional journalism published between 2002 and 2006 in the fashion magazine Elle Romania. Overview – The specialized literature presented key concepts, different terminologies and apparently different meanings for the same studied "reality". This made the process of conceptual delimitation even more difficult, to the extent that promotional journalism is situated at the intersection of various sciences and the accepted meanings of its related terms were either similar or stated vaguely. Originality – This paper's originality stems from the interdisciplinary perspective of its approach to the problem and from the complexity of the analysis and interpretation of the research results. These might prove useful both for further studies in the field and for the practitioners and

  6. Negative cancer stereotypes and disease-specific self-concept in head and neck cancer.

    Science.gov (United States)

    Wong, Janice C; Payne, Ada Y M; Mah, Kenneth; Lebel, Sophie; Lee, Ruth N F; Irish, Jonathan; Rodin, Gary; Devins, Gerald M

    2013-05-01

    Life-threatening diseases, such as head and neck cancer (HNCa), can stimulate the emergence of a new disease-specific self-concept. We hypothesized that (i) negative cancer-stereotypes invoke distancing, which inhibits the adoption of a disease-specific self-concept and (ii) patient characteristics, disease and treatment factors, and cancer-related stressors moderate the phenomenon. Head and neck cancer outpatients (N = 522) completed a semantic-differential measure of disease-specific self-concept (perceived similarity to the 'cancer patient') and other self-report measures in structured interviews. Negative cancer-stereotypes were represented by the number of semantic-differential dimensions (0-3) along which respondents evaluated the stereotypic 'cancer patient' negatively (i.e., negative valence). We tested the two-way interactions between negative valence and hypothesized moderator variables. We observed significant negative valence × moderator interactions for the following: (i) patient characteristics (education, employment, social networks); (ii) disease and treatment factors (cancer-symptom burden); and (iii) cancer-related stressors (uncertainty, lack of information, and existential threats). Negative cancer stereotypes were consistently associated with distancing of self from the stereotypic 'cancer patient,' but the effect varied across moderator variables. All significant moderators (except employment and social networks) were associated with increasing perceived similarity to the 'cancer patient' when respondents maintained negative stereotypes; perceived similarity decreased when people were employed or had extensive social networks. Moderator effects were less pronounced when respondents did not endorse negative cancer stereotypes. When they hold negative stereotypes, people with HNCa distance themselves from a 'cancer patient' identity to preserve self-esteem or social status, but exposure to cancer-related stressors and adaptive demands may

  7. THE CONCEPT “LONDON” AS A TEMPORAL CODE OF LINGUOCULTURE IN THE LITERARY AND REGIONAL WORK OF PETER ACKROYD “LONDON: THE BIOGRAPHY”

    OpenAIRE

    Kaliev, Sultan; Zhumagulova, Batima

    2018-01-01

    This article analyzes the spatial-temporal code of lingua-culture as one of the components of the general cognitive-matrix model of the structure of the concept "London" in the literary and regional work of Peter Ackroyd "London: The Biography". This approach implements integration of cognitive-matrix modeling of the structure of the concept and the system of codes of lingua-culture (anthropomorphic, temporal, vegetative, spiritual, social, chemical, etc.). The space-time code of ...

  8. Formulation of Policy for Cyber Crime in Criminal Law Revision Concept of Bill Book of Criminal Law (A New Penal Code)

    Science.gov (United States)

    Soponyono, Eko; Deva Bernadhi, Brav

    2017-04-01

    The development of national legal systems is aimed at establishing public welfare and the protection of the public. Many attempts have been made to renew substantive criminal law, and those efforts have resulted in the formulation of the draft Book of Criminal Law in the form of a draft criminal code concept. The basic ideas in drafting rules and regulations, based on the values of the ideology of Pancasila, are balance among the various norms and rules in society. The design concept of the new Criminal Code Act is anticipatory and proactive in formulating provisions on crime in cyberspace and crime involving information and electronic transactions. The issues addressed in this paper are whether the policy on the formulation of cyber crime is embodied in the provisions of the current legislation, and how cyber crime is formulated in the recent concept of the draft book of criminal law (the new penal code).

  9. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  10. RASCAL : a domain specific language for source code analysis and manipulationa

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  11. Specific structural probing of plasmid-coded ribosomal RNAs from Escherichia coli

    DEFF Research Database (Denmark)

    Aagaard, C; Rosendahl, G; Dam, M

    1991-01-01

    The preferred method for construction and in vivo expression of mutagenised Escherichia coli ribosomal RNAs (rRNAs) is via high copy number plasmids. Transcription of wild-type rRNA from the seven chromosomal rrn operons in strains harbouring plasmid-coded mutant rRNAs leads to a heterogeneous...

  12. Self-concept of children with cerebral palsy measured using the population-specific myTREEHOUSE Self-Concept Assessment.

    Science.gov (United States)

    Cheong, Sau Kuan; Lang, Cathryne P; Johnston, Leanne M

    2018-02-01

    Self-concept is an individual's perception of him/herself. Research into the self-concept of children with cerebral palsy (CP) has been sparse due to the lack of a population-specific self-concept instrument. Using the new myTREEHOUSE Self-Concept Assessment, this study investigated the self-concept of children with CP in relation to age, gender, motor, communication and cognitive function. Children with CP aged 8-12 years (n = 50; 29 males; mean 10 years 2 months; GMFCS-E&R I = 36, II = 8, III = 5, IV = 1) completed myTREEHOUSE and a standardised intelligence measure. Most children reported positive self-concept from all three myTREEHOUSE Performance Perspectives and over half (60%) fell within the Low range for the Personal Concern Score. Self-concept was not associated with age, gender, motor function, or communication function. However, for cognitive function, associations were observed for Social Skills (Below Average > Average cognitive function; Cohen's d = 1.07) and Learning Skills (Above Average > Average cognitive function; Cohen's d = 0.95) domains when rated from a Personal Performance Perspective. As the first study of the self-concept of children with CP using a CP-specific assessment, this study offers important insights into what children with CP think about themselves. Generally, the self-concept of children with CP was sound. Future research on environmental facilitators and barriers to robust self-concept development is recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    Science.gov (United States)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology on the basis of informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on Earth 3-4 billion years ago, when life originated as a result of chemical and, later, biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and the signals as means for purposive informational control programs. Social and technical-technological systems, as informational control systems, are a later phenomenon engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  14. Three-dimensional numerical investigation of a Molten Salt reactor concept with the code CFX-5.5

    International Nuclear Information System (INIS)

    Yamaji, B.; Csom, G.; Aszodi, A.

    2002-01-01

    Partitioning and transmutation of actinides and long-lived fission products is a promising option to extend the possibilities and enhance the environmentally acceptable capabilities of nuclear energy. The possible implementation of the thorium cycle is also considered as a way to reduce the problem of energy resources in the future. For both objectives, different molten salt reactor concepts have been proposed, mainly based on the Molten Salt Reactor Experiment of the Oak Ridge National Laboratory. Not only critical reactors but also accelerator-driven subcritical systems (ADSs) have advantages worth considering for those aims, especially those with liquid fuel, such as molten salts. With a liquid fuel that is also the coolant medium, a fundamentally different thermal-hydraulic behavior is expected than in the case of solid fuel and water coolant. In this work our purpose is to present the possible use of Computational Fluid Dynamics (CFD) technology in molten salt thermal hydraulics. The simulations were performed with the three-dimensional code CFX-5.5. (author)

  15. Developmental Change and Time-Specific Variation in Global and Specific Aspects of Self-Concept in Adolescence and Association with Depressive Symptoms

    Science.gov (United States)

    Kuzucu, Yasar; Bontempo, Daniel E.; Hofer, Scott M.; Stallings, Michael C.; Piccinin, Andrea M.

    2014-01-01

    Previous research has demonstrated that adolescents make differential self-evaluations in multiple domains that include physical appearance, academic competence, and peer acceptance. We report growth curve analyses over a seven-year period from age 9 to age 16 on the six domains of the Harter Self-Perception Profile for Children. In general, we find little change in self-concept, on average, but do find substantial individual differences in level, rate of change, and time-specific variation in these self-evaluations. The results suggest that sex differences and adoptive status were related to only certain aspects of the participants’ self-concept. Depressive symptoms were found to have significant effects on individual differences in rate of change and on time-specific variation in general self-concept, as well as on some of the specific domains of self-concept. PMID:25143664

  16. The Context-Specific Conceptions of Learning in Case-Based Accounting Assignments, Students' Characteristics and Performance

    Science.gov (United States)

    Moilanen, Sinikka

    2017-01-01

    The present study contributes to accounting education literature by describing context-specific conceptions of learning related to case assignments, and by exploring the associations between the conceptions of learning, students' characteristics and performance. The data analysed consist of 1320 learning diaries of 336 students, connected with…

  17. Relations between Young Students' Strategic Behaviours, Domain-Specific Self-Concept, and Performance in a Problem-Solving Situation

    Science.gov (United States)

    Dermitzaki, Irini; Leondari, Angeliki; Goudas, Marios

    2009-01-01

    This study aimed at investigating the relations between students' strategic behaviour during problem solving, task performance and domain-specific self-concept. A total of 167 first- and second-graders were individually examined in tasks involving cubes assembly and in academic self-concept in mathematics. Students' cognitive, metacognitive, and…

  18. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance

    Science.gov (United States)

    Hamilton, M.

    1973-01-01

    The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.

  19. Impact of constraints and rules of user-involvement methods for IS concept creation and specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Ohno, Takehiko; Nakatani, Momoko

    2015-01-01

    ideas. In this paper, by exemplifying our user-involvement method with game elements, ICT Service Design Game, in comparison with conventional brainstorming, we show the impact of constraints and rules in user-involvement methods when creating service concepts and specifications for information systems....... The analysis is based on a comparative experiment on two design methods and shows that the constraints and rules of our game approach fostered innovative idea generation in spite of participants’ limited knowledge of and experience with design processes. Although our analysis is still in a preliminary stage......, it indicates some positive impact of constraints and rules in design methods, especially when the methods are used by non-design professionals....

  20. Key concepts in glioblastoma therapy

    DEFF Research Database (Denmark)

    Bartek, Jiri; Ng, Kimberly; Bartek, Jiri

    2012-01-01

    principles that drive the formulation of therapeutic strategies in glioblastoma. Specifically, the concepts of tumour heterogeneity, oncogene addiction, non-oncogene addiction, tumour initiating cells, tumour microenvironment, non-coding sequences and DNA damage response will be reviewed....

  1. Effects of pathogen-specific clinical mastitis on probability of conception in Holstein dairy cows.

    Science.gov (United States)

    Hertl, J A; Schukken, Y H; Welcome, F L; Tauer, L W; Gröhn, Y T

    2014-11-01

    The objective of this study was to estimate the effects of pathogen-specific clinical mastitis (CM), occurring in different weekly intervals before or after artificial insemination (AI), on the probability of conception in Holstein cows. Clinical mastitis occurring in weekly intervals from 6 wk before until 6 wk after AI was modeled. The first 4 AI in a cow's lactation were included. The following categories of pathogens were studied: Streptococcus spp. (comprising Streptococcus dysgalactiae, Streptococcus uberis, and other Streptococcus spp.); Staphylococcus aureus; coagulase-negative staphylococci (CNS); Escherichia coli; Klebsiella spp.; cases with CM signs but no bacterial growth (above the level that can be detected from our microbiological procedures) observed in the culture sample and cases with contamination (≥ 3 pathogens in the sample); and other pathogens [including Citrobacter, yeasts, Trueperella pyogenes, gram-negative bacilli (i.e., gram-negative organisms other than E. coli, Klebsiella spp., Enterobacter, and Citrobacter), Corynebacterium bovis, Corynebacterium spp., Pasteurella, Enterococcus, Pseudomonas, Mycoplasma, Prototheca, and others]. Other factors included in the model were parity (1, 2, 3, 4 and higher), season of AI (winter, spring, summer, autumn), day in lactation of first AI, farm, and other non-CM diseases (retained placenta, metritis, ketosis, displaced abomasum). Data from 90,271 AI in 39,361 lactations in 20,328 cows collected from 2003/2004 to 2011 from 5 New York State dairy farms were analyzed in a generalized linear mixed model with a Poisson distribution. The largest reductions in probability of conception were associated with CM occurring in the week before AI or in the 2 wk following AI. Escherichia coli and Klebsiella spp. had the greatest adverse effects on probability of conception. The probability of conception for a cow with any combination of characteristics may be calculated based on the parameter estimates. These
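
    As a hedged illustration of the final point above (that a predicted conception probability can be computed from the parameter estimates), the sketch below assumes the usual log link of a Poisson regression and uses entirely hypothetical coefficient values; it is not the study's fitted model.

      import math

      # Hypothetical coefficients on the log scale (illustration only; the real
      # estimates are in the paper's tables).
      coeff = {
          "intercept": -1.2,          # baseline log-probability of conception
          "cm_week_before_ai": -0.5,  # clinical mastitis in the week before AI
          "cm_2wk_after_ai": -0.4,    # clinical mastitis within 2 wk after AI
          "season_summer": -0.2,      # AI performed in summer
          "parity_2": 0.1,            # second-parity cow
      }

      def predicted_conception(features):
          """Back-transform the linear predictor through the Poisson log link."""
          eta = coeff["intercept"] + sum(coeff[f] for f in features)
          return math.exp(eta)

      # e.g. a second-parity cow with mastitis in the week before a summer AI
      print(round(predicted_conception(["cm_week_before_ai", "season_summer", "parity_2"]), 3))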

  2. Non coding RNA: sequence-specific guide for chromatin modification and DNA damage signaling

    Directory of Open Access Journals (Sweden)

    Sofia eFrancia

    2015-11-01

    Full Text Available Chromatin conformation shapes the environment in which our genome is transcribed into RNA. Transcription is a source of DNA damage, and thus it often occurs concomitantly with DNA damage signaling. A growing body of evidence suggests that different types of RNAs can, independently from their protein-coding properties, directly affect chromatin conformation, transcription and splicing, as well as promote the activation of the DNA damage response (DDR) and DNA repair. Therefore, transcription paradoxically functions to both threaten and safeguard genome integrity. On the other hand, DNA damage signaling is known to modulate chromatin to suppress transcription of the surrounding genetic unit. It is thus intriguing to understand how transcription can modulate DDR signaling while, in turn, DDR signaling represses transcription of chromatin around the DNA lesion. An unexpected player in this field is the RNA interference (RNAi) machinery, which plays roles in transcription, splicing and chromatin modulation in several organisms. Non-coding RNAs (ncRNAs) and several protein factors involved in the RNAi pathway are well-known master regulators of chromatin, while only recent reports suggest that ncRNAs are involved in DDR signaling and homology-mediated DNA repair. Here, we discuss the experimental evidence supporting the idea that ncRNAs act at the genomic loci from which they are transcribed to modulate chromatin, DDR signaling and DNA repair.

  3. Working memory templates are maintained as feature-specific perceptual codes.

    Science.gov (United States)

    Sreenivasan, Kartik K; Sambhara, Deepak; Jha, Amishi P

    2011-07-01

    Working memory (WM) representations serve as templates that guide behavior, but the neural basis of these templates remains elusive. We tested the hypothesis that WM templates are maintained by biasing activity in sensoriperceptual neurons that code for features of items being held in memory. Neural activity was recorded using event-related potentials (ERPs) as participants viewed a series of faces and responded when a face matched a target face held in WM. Our prediction was that if activity in neurons coding for the features of the target is preferentially weighted during maintenance of the target, then ERP activity evoked by a nontarget probe face should be commensurate with the visual similarity between target and probe. Visual similarity was operationalized as the degree of overlap in visual features between target and probe. A face-sensitive ERP response was modulated by target-probe similarity. Amplitude was largest for probes that were similar to the target, and decreased monotonically as a function of decreasing target-probe similarity. These results indicate that neural activity is weighted in favor of visual features that comprise an actively held memory representation. As such, our findings support the notion that WM templates rely on neural populations involved in forming percepts of memory items.

  4. Identity-specific coding of future rewards in the human orbitofrontal cortex.

    Science.gov (United States)

    Howard, James D; Gottfried, Jay A; Tobler, Philippe N; Kahnt, Thorsten

    2015-04-21

    Nervous systems must encode information about the identity of expected outcomes to make adaptive decisions. However, the neural mechanisms underlying identity-specific value signaling remain poorly understood. By manipulating the value and identity of appetizing food odors in a pattern-based imaging paradigm of human classical conditioning, we were able to identify dissociable predictive representations of identity-specific reward in orbitofrontal cortex (OFC) and identity-general reward in ventromedial prefrontal cortex (vmPFC). Reward-related functional coupling between OFC and olfactory (piriform) cortex and between vmPFC and amygdala revealed parallel pathways that support identity-specific and -general predictive signaling. The demonstration of identity-specific value representations in OFC highlights a role for this region in model-based behavior and reveals mechanisms by which appetitive behavior can go awry.

  5. Specific long non-coding RNAs response to occupational PAHs exposure in coke oven workers

    Directory of Open Access Journals (Sweden)

    Chen Gao

    Full Text Available To explore whether the alteration of lncRNA expression is correlated with polycyclic aromatic hydrocarbons (PAHs) exposure and DNA damage, we examined PAHs external and internal exposure, DNA damage and lncRNA (HOTAIR, MALAT1, TUG1 and GAS5) expression in peripheral blood lymphocytes (PBLCs) of 150 male coke oven workers and 60 non-PAHs exposure workers. We found the expression of HOTAIR, MALAT1, and TUG1 was enhanced in PBLCs of coke oven workers and positively correlated with the levels of external PAHs exposure (adjusted Ptrend < 0.001 for HOTAIR and MALAT1, adjusted Ptrend = 0.006 for TUG1). However, only HOTAIR and MALAT1 were significantly associated with the level of internal PAHs exposure (urinary 1-hydroxypyrene), with adjusted β = 0.298, P = 0.024 for HOTAIR and β = 0.090, P = 0.034 for MALAT1. In addition, the degree of DNA damage was positively associated with MALAT1 and HOTAIR expression in PBLCs of all subjects (adjusted β = 0.024, P = 0.002 for HOTAIR and β = 0.007, P = 0.003 for MALAT1). Moreover, we revealed that the global histone 3 lysine 27 trimethylation (H3K27me3) modification was positively associated with the degree of genetic damage (β = 0.061, P < 0.001) and the increase of HOTAIR expression (β = 0.385, P = 0.018). Taken together, our findings suggest that altered HOTAIR and MALAT1 expression might be involved in the response to PAHs-induced DNA damage. Keywords: Polycyclic aromatic hydrocarbons, Long non-coding RNA, Peripheral blood lymphocytes, DNA damage response, HOTAIR, MALAT

  6. Academic Self-Concepts in Ability Streams: Considering Domain Specificity and Same-Stream Peers

    Science.gov (United States)

    Liem, Gregory Arief D.; McInerney, Dennis M.; Yeung, Alexander S.

    2015-01-01

    The study examined the relations between academic achievement and self-concepts in a sample of 1,067 seventh-grade students from 3 core ability streams in Singapore secondary education. Although between-stream differences in achievement were large, between-stream differences in academic self-concepts were negligible. Within each stream, levels of…

  7. Specifications for a two-dimensional multi-group scattering code: ALCI

    International Nuclear Information System (INIS)

    Bayard, J.P.; Guillou, A.; Lago, B.; Bureau du Colombier, M.J.; Guillou, G.; Vasseur, Ch.

    1965-02-01

    This report describes the specifications of the ALCI programme. The programme solves the system of difference equations corresponding to the homogeneous multigroup neutron scattering problem in two spatial dimensions, in the three geometries XY, RZ and RΘ. The method makes it possible to calculate geometric and composition criticalities, and the accessory problem can also be calculated on demand. The maximum number of mesh points handled is 6000. The maximum permissible number of groups is 12. The inner iterations are treated by the method of alternating directions. The outer iterations are accelerated using the Chebyshev extrapolation method. (authors) [fr
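
    To make the alternating-direction idea named above concrete, the following is a minimal sketch (not the ALCI code) of Peaceman-Rachford ADI sweeps applied to a small one-group, fixed-source diffusion problem on an XY mesh. The mesh size, cross-sections, source and acceleration parameter are made-up illustration values, and the Chebyshev extrapolation of the outer iterations is not shown.

      import numpy as np

      # Illustrative values only: mesh, diffusion coefficient, removal
      # cross-section, uniform fixed source, ADI acceleration parameter.
      n, h = 16, 1.0
      D, sig = 1.0, 0.05
      S = np.ones((n, n))
      phi = np.zeros((n, n))
      r = 1.0

      # 1-D second-difference operator (phi = 0 on the boundary) plus half the
      # removal term; it is reused for both the x- and y-part of the 2-D operator.
      A1 = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) * D / h**2
      I = np.eye(n)
      Ax = A1 + 0.5 * sig * I
      Ay = A1 + 0.5 * sig * I

      for sweep in range(200):
          # x-sweep: implicit in x, explicit in y
          rhs = S + r * phi - phi @ Ay
          phi = np.linalg.solve(Ax + r * I, rhs)
          # y-sweep: implicit in y, explicit in x (Ay is symmetric, so the
          # transpose trick applies the operator along the y index)
          rhs = S + r * phi - Ax @ phi
          phi = np.linalg.solve(Ay + r * I, rhs.T).T

      residual = np.abs(Ax @ phi + phi @ Ay - S).max()
      print(f"max residual after ADI sweeps: {residual:.2e}")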

  8. Satellite III non-coding RNAs show distinct and stress-specific patterns of induction

    International Nuclear Information System (INIS)

    Sengupta, Sonali; Parihar, Rashmi; Ganesh, Subramaniam

    2009-01-01

    The heat shock response in human cells is associated with the transcription of satellite III repeats (SatIII) located in the 9q12 locus. Upon induction, the SatIII transcripts remain associated with the locus and recruit several transcription and splicing factors to form the nuclear stress bodies (nSBs). The nSBs are thought to modulate epigenetic changes during the heat shock response. We demonstrate here that the nSBs are induced by a variety of stressors and show stress-specific patterns of induction. While the transcription factor HSF1 is required for the induction of SatIII locus by the stressors tested, its specific role in the transcriptional process appears to be stress dependent. Our results suggest the existence of multiple transcriptional loci for the SatIII transcripts and that their activation might depend upon the type of stressors. Thus, induction of SatIII transcripts appears to be a generic response to a variety of stress conditions.

  9. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 main frames as well as on a MicroVAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  10. Academic Self-Concept and Achievement in Polish Primary Schools: Cross-Lagged Modelling and Gender-Specific Effects

    Science.gov (United States)

    Grygiel, Pawel; Modzelewski, Michal; Pisarek, Jolanta

    2017-01-01

    This study reports relationships between general academic self-concept and achievement in grade 3 and grade 5. Gender-specific effects were investigated using a longitudinal, two-cycle, 3-year autoregressive cross-lagged panel design in a large, representative sample of Polish primary school pupils (N = 4,226). Analysis revealed (a) reciprocal…

  11. The influence of role-specific self-concept and sex-role identity on career choices in science

    Science.gov (United States)

    Baker, Dale R.

    Despite much effort on the part of educators the number of females who choose science careers remains low. This research focuses on two factors which may be influencing females in their choice of careers. These factors are role-specific self-concept in science and self perception in terms of stereotypical masculine and feminine characteristics. In addition logical ability and mathematics and science courses were also examined as factors in career choice. Females preferring science related careers and females preferring nontraditional careers such as police, military and trades were found to have a positive role-specific self-concept and a masculine perception of themselves. Females preferring traditional careers such as teacher or hairdresser had a poor role-specific self-concept and a more feminine perception of themselves. Males as a group were found to have a more positive role-specific self-concept than females. Logical ability was also related to a science career preference for both males and females. Males expected to take more higher level math courses than females, while females preferring science careers expected to take the most higher level science courses.

  12. Modern concepts of cost accounting: A review of the ABC method specific features

    Directory of Open Access Journals (Sweden)

    Trklja Radmila

    2014-01-01

    Full Text Available New business conditions, marked by turbulent changes in the environment, demand, more than ever before, relevant and reliable information that provides essential support for management at all stages of the decision-making process. In countries with developed, competitive market economies, new approaches, philosophies, concepts and techniques have appeared in the field of cost accounting. The development of high-technology businesses and the globalisation of business raise questions about the quality of accounting information obtained with traditional cost accounting methods, making it necessary to change the concept of establishing product costs. Accordingly, management accounting should provide informational support for managing businesses based on customers' demands, internal processes, continuous business improvement, etc. This is only possible with the application of modern concepts of cost accounting, which ensure efficient cost management and business management in modern business conditions.

  13. Learning Concepts, Language, and Literacy in Hybrid Linguistic Codes: The Multilingual Maze of Urban Grade 1 Classrooms in South Africa

    Science.gov (United States)

    Henning, Elizabeth

    2012-01-01

    From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…

  14. Benchmarking Reactor Systems Studies by Comparison of EU and Japanese System Code Results for Different DEMO Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Kemp, R.; Ward, D.J., E-mail: richard.kemp@ccfe.ac.uk [EURATOM/CCFE Association, Culham Centre for Fusion Energy, Abingdon (United Kingdom); Nakamura, M.; Tobita, K. [Japan Atomic Energy Agency, Rokkasho (Japan); Federici, G. [EFDA Garching, Max Plank Institut fur Plasmaphysik, Garching (Germany)

    2012-09-15

    Full text: Recent systems studies work within the Broader Approach framework has focussed on benchmarking the EU systems code PROCESS against the Japanese code TPC for conceptual DEMO designs. This paper describes benchmarking work for a conservative, pulsed DEMO and an advanced, steady-state, high-bootstrap fraction DEMO. The resulting former machine is an R_0 = 10 m, a = 2.5 m, β_N < 2.0 device with no enhancement in energy confinement over IPB98. The latter machine is smaller (R_0 = 8 m, a = 2.7 m), with β_N = 3.0, enhanced confinement, and high bootstrap fraction f_BS = 0.8. These options were chosen to test the codes across a wide range of parameter space. While generally in good agreement, some of the code outputs differ. In particular, differences have been identified in the impurity radiation models and flux swing calculations. The global effects of these differences are described and approaches to identifying the best models, including future experiments, are discussed. Results of varying some of the assumptions underlying the modelling are also presented, demonstrating the sensitivity of the solutions to technological limitations and providing guidance for where further research could be focussed. (author)

  15. Specific model for a gas distribution analysis in the containment at Almaraz NPP using GOTHIC computer code

    International Nuclear Information System (INIS)

    García González, M.; García Jiménez, P.; Martínez Domínguez, F.

    2016-01-01

    To carry out an analysis of the distribution of gases within the containment building at the CN Almaraz site, a simulation model with the thermohydraulic GOTHIC [1] code has been used. This has been assessed with a gas control system based on passive autocatalytic recombiners (PARs). The model is used to test the effectiveness of the control systems for gases to be used in the Almaraz Nuclear Power Plant, Units I&II (Caceres, Spain, 1,035 MW and 1,044 MW). The model must confirm the location and number of the recombiners proposed to be installed. It is an essential function of the gas control system to avoid any formation of explosive atmospheres by reducing and limiting the concentration of combustible gases during an accident, thus maintaining the integrity of the containment. The model considers severe accident scenarios with specific conditions that produce the most onerous generation of combustible gases.

  16. Anthropophagy: a singular concept to understand Brazilian culture and psychology as specific knowledge.

    Science.gov (United States)

    Ferreira, Arthur Arruda Leal

    2015-11-01

    The aim of this work is to present the singularity of the concept of anthropophagy in Brazilian culture. This article examines its use in the Modernist Movement of the 1920s and explores the possibilities it creates for thinking about Brazilian culture in nonidentitarian terms. We then use the concept of anthropophagy in a broader, practical sense to understand psychology as a kind of anthropophagical knowledge. We do so because in many ways the discipline of psychology is similar to Brazilian culture in its plurality and complexity. (c) 2015 APA, all rights reserved.

  17. Proof of Concept Coded Aperture Miniature Mass Spectrometer Using a Cycloidal Sector Mass Analyzer, a Carbon Nanotube (CNT) Field Emission Electron Ionization Source, and an Array Detector

    Science.gov (United States)

    Amsden, Jason J.; Herr, Philip J.; Landry, David M. W.; Kim, William; Vyas, Raul; Parker, Charles B.; Kirley, Matthew P.; Keil, Adam D.; Gilchrist, Kristin H.; Radauscher, Erich J.; Hall, Stephen D.; Carlson, James B.; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T.; Russell, Zachary E.; Grego, Sonia; Edwards, Steven J.; Sperline, Roger P.; Denton, M. Bonner; Stoner, Brian R.; Gehm, Michael E.; Glass, Jeffrey T.

    2018-02-01

    Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.

  18. Physical self-concept of normal-weight and overweight adolescents: Gender specificities

    Directory of Open Access Journals (Sweden)

    Lazarević Dušanka

    2011-01-01

    Full Text Available Previous researchers have described the relation between physical self-concept and body mass in adolescents, but those relationships have not been clearly specified by gender. The purpose of this study is to explore the physical self-concepts of normal-weight and overweight Serbian adolescents with respect to gender. The sample consisted of 417 primary school students (229 boys and 188 girls) with an average age of 13.6 (SD = 0.73) years, who were divided into normal-weight and overweight groups according to body mass index. To assess the multidimensional physical self-concept, the Physical Self-Description Questionnaire (PSDQ) was administered. Results showed that overweight adolescents had significantly lower scores than normal-weight adolescents on all PSDQ scales except Health and Strength. Differences were greater among girls than boys. Discriminant analysis showed that the scales Body Fat, Endurance and Sports Competence best differentiated normal-weight boys from other students. Also, discriminant analysis showed that, besides the scale Body Fat, the scales Flexibility, Self-Esteem, and Coordination best differentiated normal-weight girls from other students. Results indicate that for a better understanding of the relationship between adolescents’ physical self-concept and body mass one must take gender into account. Results are potentially valuable for preventing overweight through physical education.

  19. Transference and countertransference: two concepts specific to psychoanalytic theory and practice.

    Science.gov (United States)

    Ladame, F

    1999-12-01

    The development of the theory of transference and countertransference from Freud to post-Freudian authors is described. It is concluded that the concepts of transference and countertransference are pertinent only within a definite psychoanalytic setting. They cannot be applied in every therapeutic situation.

  20. Divergent evolutionary rates in vertebrate and mammalian specific conserved non-coding elements (CNEs) in echolocating mammals.

    Science.gov (United States)

    Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J

    2014-12-19

    The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically within one family of echolocating bats that utilise

  1. Retrieval system for emplaced spent unreprocessed fuel (SURF) in salt bed depository. Baseline concept criteria specifications and mechanical failure probabilities

    International Nuclear Information System (INIS)

    Hudson, E.E.; McCleery, J.E.

    1979-05-01

    One of the integral elements of the Nuclear Waste Management Program is the material handling task of retrieving Canisters containing spent unreprocessed fuel from their emplacement in a deep geologic salt bed Depository. A study of the retrieval concept data base predicated this report. In this report, alternative concepts for the tasks are illustrated and critiqued, a baseline concept in scenario form is derived and basic retrieval subsystem specifications are presented with cyclic failure probabilities predicted. The report is based on the following assumptions: (a) during retrieval, a temporary radiation seal is placed over each Canister emplacement; (b) a sleeve, surrounding the Canister, was initially installed during the original emplacement; (c) the emplacement room's physical and environmental conditions established in this report are maintained while the task is performed

  2. Key Concepts of Real Estate Market Analysis and Valuation with Specific Application to Residential Apartment Investments.

    Science.gov (United States)

    1986-01-01

    but is intended to provide a basic understanding of essential real estate investment concepts. [OCR-damaged excerpt of a sample apartment operating statement follows in the source: utility expenses (electric, water, gas/oil), Total Expenses $137,700, Net Operating Income $273,343.]

  3. Two Concepts Of Place Competition And Specificity Of Targeting In Place Marketing

    OpenAIRE

    Kirill Rozhkov

    2013-01-01

    This paper demonstrates opportunities for the development of place marketing theory offered by the pure model of local expenditures (Tiebout 1956) and the concepts of the creative class (Florida 2004) and the creative city (Bianchini and Landry 1995). Rethinking them in marketing terms, we then analyze their limitations and show why re-examining them can support competition analysis, targeting, and the marketing policy of places. In the discussion section, main directions of theoretical research in place ...

  4. The preliminary thermal–hydraulic analysis of a water cooled blanket concept design based on RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Guanghuai; Peng, Changhong; Guo, Yun, E-mail: guoyun79@ustc.edu.cn

    2016-11-01

    Highlights: • The superheated steam and PWR schemes are analyzed with the RELAP5 code. • The influence of non-uniform heating sources is included. • A postulated slow flow decrease is discussed, and the PWR scheme performs better. - Abstract: The water cooled blanket (WCB) is very important in the conceptual design and energy transfer of a future fusion power plant. One conceptual design of a WCB is under computational testing. The RELAP5 code, which is mature and often used in transient analysis of pressurized water reactors (PWRs), is selected as the simulation tool. The complex inner flow channels and heat sources are simplified according to their thermal–hydraulic characteristics. A nodal model for RELAP5 is then built to approximate the conceptual design. Two typical operating plans, the superheated steam scheme and the PWR scheme, are analyzed. After some adjustments of the inlet flow resistance coefficients of some flow channels, reasonable stable conditions can be obtained for both operating plans. The stable fluid and wall temperature distributions and pressure drops are studied. Finally, a postulated slow flow decrease is discussed under the two operating conditions separately. According to the present results, the superheated steam scheme still needs further optimization, while the PWR scheme shows a very good safety feature.

  5. The preliminary thermal–hydraulic analysis of a water cooled blanket concept design based on RELAP5 code

    International Nuclear Information System (INIS)

    Wang, Guanghuai; Peng, Changhong; Guo, Yun

    2016-01-01

    Highlights: • The superheated steam and PWR schemes are analyzed with the RELAP5 code. • The influence of non-uniform heating sources is included. • A postulated slow flow decrease is discussed, and the PWR scheme performs better. - Abstract: The water cooled blanket (WCB) is very important in the conceptual design and energy transfer of a future fusion power plant. One conceptual design of a WCB is under computational testing. The RELAP5 code, which is mature and often used in transient analysis of pressurized water reactors (PWRs), is selected as the simulation tool. The complex inner flow channels and heat sources are simplified according to their thermal–hydraulic characteristics. A nodal model for RELAP5 is then built to approximate the conceptual design. Two typical operating plans, the superheated steam scheme and the PWR scheme, are analyzed. After some adjustments of the inlet flow resistance coefficients of some flow channels, reasonable stable conditions can be obtained for both operating plans. The stable fluid and wall temperature distributions and pressure drops are studied. Finally, a postulated slow flow decrease is discussed under the two operating conditions separately. According to the present results, the superheated steam scheme still needs further optimization, while the PWR scheme shows a very good safety feature.

  6. ALERT. Adverse late effects of cancer treatment. Vol. 1. General concepts and specific precepts

    Energy Technology Data Exchange (ETDEWEB)

    Rubin, Philip; Constine, Louis S. [Univ. Rochester Medical Center, NY (United States). Dept. of Radiation Oncology; Marks, Lawrence B. (ed.) [Univ. North Carolina and Lineberger, Comprehensive Cancer Center, Chapel Hill, NC (United States). Dept. of Radiation Oncology

    2014-09-01

    Considers in detail the general concepts and principles relevant to the adverse late effects of cancer treatment. Explains the molecular, cytologic and histopathologic events that lead to altered physiologic and metabolic functions and their clinical manifestations. Includes chapters on legal issues, economic aspects, nursing, psychological issues and quality of life. The literature on the late effects of cancer treatment is widely scattered in different journals since all major organ systems are affected and management is based on a variety of medical and surgical treatments. The aim of ALERT - Adverse Late Effects of Cancer Treatment is to offer a coherent multidisciplinary approach to the care of cancer survivors. The central paradigm is that cytotoxic multimodal therapy results in a perpetual cascade of events that affects each major organ system differently and is expressed continually over time. Essentially, radiation and chemotherapy are intense biologic modifiers that allow for cancer cure and cancer survivorship but accelerate senescence of normal tissues and increase the incidence of age-related diseases and second malignant tumors. Volume 1 of this two-volume work focuses on the general concepts and principles relevant to late effects and on the dynamic interplay of molecular, cytologic and histopathologic events that lead to altered physiologic and metabolic functions and their clinical manifestations. Chapters are also included on legal issues, economic aspects, nursing, psychological issues and quality of life.

  7. ALERT. Adverse late effects of cancer treatment. Vol. 1. General concepts and specific precepts

    International Nuclear Information System (INIS)

    Rubin, Philip; Constine, Louis S.; Marks, Lawrence B.

    2014-01-01

    Considers in detail the general concepts and principles relevant to the adverse late effects of cancer treatment. Explains the molecular, cytologic and histopathologic events that lead to altered physiologic and metabolic functions and their clinical manifestations. Includes chapters on legal issues, economic aspects, nursing, psychological issues and quality of life. The literature on the late effects of cancer treatment is widely scattered in different journals since all major organ systems are affected and management is based on a variety of medical and surgical treatments. The aim of ALERT - Adverse Late Effects of Cancer Treatment is to offer a coherent multidisciplinary approach to the care of cancer survivors. The central paradigm is that cytotoxic multimodal therapy results in a perpetual cascade of events that affects each major organ system differently and is expressed continually over time. Essentially, radiation and chemotherapy are intense biologic modifiers that allow for cancer cure and cancer survivorship but accelerate senescence of normal tissues and increase the incidence of age-related diseases and second malignant tumors. Volume 1 of this two-volume work focuses on the general concepts and principles relevant to late effects and on the dynamic interplay of molecular, cytologic and histopathologic events that lead to altered physiologic and metabolic functions and their clinical manifestations. Chapters are also included on legal issues, economic aspects, nursing, psychological issues and quality of life.

  8. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...
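
    A minimal sketch of the message-passing idea in its simplest (bit-flipping) form on a bipartite variable/check-node graph. The tiny parity-check matrix is an arbitrary illustration, not one of the expander-graph constructions studied in the paper.

      import numpy as np

      # Checks x variables: each row is one check node, each 1 an edge of the
      # bipartite graph.
      H = np.array([[1, 1, 0, 1, 0, 0],
                    [0, 1, 1, 0, 1, 0],
                    [1, 0, 1, 0, 0, 1]])

      def bit_flip_decode(word, H, max_iters=20):
          """Flip the variable involved in the most unsatisfied checks until all pass."""
          word = word.copy()
          for _ in range(max_iters):
              syndrome = H @ word % 2
              if not syndrome.any():
                  break                          # every check node is satisfied
              unsatisfied = H.T @ syndrome       # per-variable count of failed checks
              word[np.argmax(unsatisfied)] ^= 1  # flip the worst variable node
          return word

      received = np.array([1, 1, 0, 0, 0, 0])    # toy received word
      print(bit_flip_decode(received, H))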

  9. An ancient neurotrophin receptor code; a single Runx/Cbfβ complex determines somatosensory neuron fate specification in zebrafish.

    Science.gov (United States)

    Gau, Philia; Curtright, Andrew; Condon, Logan; Raible, David W; Dhaka, Ajay

    2017-07-01

    In terrestrial vertebrates such as birds and mammals, neurotrophin receptor expression is considered fundamental for the specification of distinct somatosensory neuron types where TrkA, TrkB and TrkC specify nociceptors, mechanoceptors and proprioceptors/mechanoceptors, respectively. In turn, Runx transcription factors promote neuronal fate specification by regulating neurotrophin receptor and sensory receptor expression where Runx1 mediates TrkA+ nociceptor diversification while Runx3 promotes a TrkC+ proprioceptive/mechanoceptive fate. Here, we report in zebrafish larvae that orthologs of the neurotrophin receptors in contrast to terrestrial vertebrates mark overlapping and distinct subsets of nociceptors suggesting that TrkA, TrkB and TrkC do not intrinsically promote nociceptor, mechanoceptor and proprioceptor/mechanoceptor neuronal fates, respectively. While we find that zebrafish Runx3 regulates nociceptors in contrast to terrestrial vertebrates, it shares a conserved regulatory mechanism found in terrestrial vertebrate proprioceptors/mechanoceptors in which it promotes TrkC expression and suppresses TrkB expression. We find that Cbfβ, which enhances Runx protein stability and affinity for DNA, serves as an obligate cofactor for Runx in neuronal fate determination. High levels of Runx can compensate for the loss of Cbfβ, indicating that in this context Cbfβ serves solely as a signal amplifier of Runx activity. Our data suggests an alteration/expansion of the neurotrophin receptor code of sensory neurons between larval teleost fish and terrestrial vertebrates, while the essential roles of Runx/Cbfβ in sensory neuron cell fate determination while also expanded are conserved.

  10. Mortality from circulatory diseases by specific country of birth across six European countries: test of concept

    NARCIS (Netherlands)

    Bhopal, Raj S.; Rafnsson, Snorri B.; Agyemang, Charles; Fagot-Campagna, Anne; Giampaoli, Simona; Hammar, Niklas; Harding, Seeromanie; Hedlund, Ebba; Juel, Knud; Mackenbach, Johan P.; Primatesta, Paola; Rey, Gregoire; Rosato, Michael; Wild, Sarah; Kunst, Anton E.

    2012-01-01

    Background: Important differences in cardiovascular disease (CVD) mortality by country of birth have been shown within European countries. We now focus on CVD mortality by specific country of birth across European countries. Methods: For Denmark, England and Wales, France, The Netherlands, Scotland

  11. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  12. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....
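
    A minimal sketch of plain LT-style rateless encoding to make the setting concrete; the feedback-dependent degree distributions the paper introduces are not reproduced here, and the packet values and distribution below are made-up illustration values.

      import random

      def lt_encode_symbol(source_packets, degree_dist, rng=random):
          """One rateless output symbol: XOR of a randomly chosen set of source packets."""
          degrees, probs = zip(*degree_dist)
          d = rng.choices(degrees, weights=probs, k=1)[0]
          neighbours = rng.sample(range(len(source_packets)), d)
          value = 0
          for i in neighbours:
              value ^= source_packets[i]
          return neighbours, value

      # Toy input: 8 one-byte packets and an arbitrary degree distribution.
      packets = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88]
      dist = [(1, 0.1), (2, 0.5), (3, 0.3), (8, 0.1)]
      stream = [lt_encode_symbol(packets, dist) for _ in range(12)]
      # A receiver would peel degree-1 symbols to recover packets and, in the
      # feedback variant, report its progress so the sender can adapt the distribution.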

  13. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
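
    A minimal structural sketch of concatenation as described above: an outer code whose output symbols are re-encoded by an inner block code. The outer encoder below is a toy single-parity stand-in for the interleaved RS(255,223) code, and the inner encoder is a trivial repetition code standing in for the modulation block codes under study; neither is the actual scheme.

      def outer_encode(message):
          """Toy stand-in for the RS(255,223) outer code: append one parity symbol."""
          return message + [sum(message) % 256]

      def inner_encode(symbol):
          """Toy stand-in for the inner modulation block code: 3x repetition."""
          return [symbol] * 3

      def concatenated_encode(message):
          """Outer code first, then every outer symbol through the inner code."""
          return [s for sym in outer_encode(message) for s in inner_encode(sym)]

      print(concatenated_encode([10, 20, 30]))
      # -> [10, 10, 10, 20, 20, 20, 30, 30, 30, 60, 60, 60]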

  14. On the concepts of carrier and specific activity in nuclear chemistry, radioanalytical chemistry and radiopharmaceutical chemistry

    International Nuclear Information System (INIS)

    Bonardi, Mauro L.

    2011-01-01

    At present, an IUPAC Project regarding 'Terminology, Quantities and Units concerning Production and Applications of Radionuclides in Radiopharmaceutical and Radioanalytical Chemistry' states that: 'CARRIER is a chemical species - already present in the preparation or intentionally added - which will carry a given radionuclide in its associated species through the radiochemical procedure and/or prevents the radionuclide in its associated species from undergoing non-specific processes due to its low concentration'

  15. Comparison of the calculations of the stability properties of a specific stellarator equilibrium with different MHD stability codes

    International Nuclear Information System (INIS)

    Nakamura, Y.; Matsumoto, T.; Wakatani, M.; Ichiguchi, K.; Garcia, L.; Carreras, B.A.

    1995-04-01

    A particular configuration of the LHD stellarator with an unusually flat pressure profile has been chosen to be a test case for comparison of the MHD stability property predictions of different three-dimensional and averaged codes for the purpose of code comparison and validation. In particular, two relatively localized instabilities, the fastest growing modes with toroidal mode number n = 2 and n = 3 were studied using several different codes, with the good agreement that has been found providing justification for the use of any of them for equilibria of the type considered

  16. Validation of coupled Relap5-3D code in the analysis of RBMK-1500 specific transients

    International Nuclear Information System (INIS)

    Evaldas, Bubelis; Algirdas, Kaliatka; Eugenijus, Uspuras

    2003-01-01

    This paper deals with the modelling of RBMK-1500-specific transients taking place at the Ignalina NPP. These transients include measurements of the void and fast power reactivity coefficients, a change of the graphite cooling conditions, and reactor power reduction transients. The transients were simulated using the RELAP5-3D model of the RBMK-1500 reactor. At the Ignalina NPP, the void and fast power reactivity coefficients are measured on a regular basis; based on the total reactor power, the reactivity, the control and protection system control rod positions, and the changes in main circulation circuit parameters during the experiments, the actual values of these reactivity coefficients are determined. The graphite temperature reactivity coefficient at the plant is determined by changing the graphite cooling conditions in the reactor cavity. This type of transient is unique and important for validating the model of the gap between the fuel channel and the graphite bricks. The measurement results obtained during this transient made it possible to determine the thermal conductivity coefficient for this gap and to validate the graphite temperature reactivity feedback model. Reactor power reduction is a regular operational procedure during the entire lifetime of the reactor. In all cases it starts with either a scram or a power reduction signal activated by the reactor control and protection system or by an operator. The calculation results demonstrate reasonable agreement with Ignalina NPP measured data. The behaviour of the individual MCC thermal-hydraulic parameters, as well as the physical processes, is predicted in reasonable agreement with the real processes occurring in the primary circuit of the RBMK-1500 reactor. Reasonable agreement between the measured and calculated change of total reactor power in time demonstrates correct modelling of the neutronic processes taking place in the RBMK-1500 reactor core. And finally, the performed validation of the RELAP5-3D model of Ignalina NPP RBMK-1500

  17. Pregnancy-specific anxiety, ART conception and infant temperament at 4 months post-partum.

    Science.gov (United States)

    McMahon, C A; Boivin, J; Gibson, F L; Hammarberg, K; Wynter, K; Saunders, D; Fisher, J

    2013-04-01

    Is anxiety focused on the pregnancy outcome, known to be particularly salient in women conceiving through assisted reproductive technology (ART), related to difficult infant temperament? While trait anxiety predicts infant temperament, pregnancy-focused anxiety is not associated with more difficult infant temperament. A large body of research has provided convincing evidence that fetal exposure to maternal anxiety and stress in pregnancy has adverse consequences for child neurodevelopmental, behavioural and cognitive development, and that pregnancy-specific anxiety (concerns related to the pregnancy outcome and birth) may be of particular significance. Women conceiving through ART are of particular interest in this regard. Research over more than 20 years has consistently demonstrated that while they do not differ from spontaneously conceiving (SC) women with respect to general (state and trait) anxiety, they typically report higher pregnancy-specific anxiety. While research suggests normal behavioural and developmental outcomes for children conceived through ART, there is some evidence of more unsettled infant behaviour during the first post-natal year. The longitudinal cohort design followed 562 nulliparous women over a 7-month period, during the third trimester of pregnancy and at 4 months after birth. Approximately equal numbers of nulliparous women conceiving through ART (n = 250) and spontaneously (SC: n = 262) were recruited through ART clinics and nearby hospitals in Melbourne and Sydney, Australia. Participants completed three anxiety measures (state, trait, pregnancy specific) at time 1 in the third trimester of pregnancy and a measure of infant temperament at time 2, 4 months after birth. At time 1, relevant socio-demographic, pregnancy (maternal age, smoking, alcohol, medications, medical complications) information was recorded and at time 2, information regarding childbirth (gestation, infant birthweight, mode of delivery) and post-natal (concurrent

  18. Neuronal codes for visual perception and memory.

    Science.gov (United States)

    Quian Quiroga, Rodrigo

    2016-03-01

    In this review, I describe and contrast the representation of stimuli in visual cortical areas and in the medial temporal lobe (MTL). While cortex is characterized by a distributed and implicit coding that is optimal for recognition and storage of semantic information, the MTL shows a much sparser and explicit coding of specific concepts that is ideal for episodic memory. I will describe the main characteristics of the coding in the MTL by the so-called concept cells and will then propose a model of the formation and recall of episodic memory based on partially overlapping assemblies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Concept Specifications/Prerequisites for DeepWind Deliverable D8.1

    DEFF Research Database (Denmark)

    Schmidt Paulsen, Uwe; Schløer, Signe; Larsén, Xiaoli Guo

    The work is a result of the contributions within the DeepWind project which is supported by the European Commission, Grant 256769 FP7 Energy 2010 - Future emerging technologies, and by the DeepWind beneficiaries: DTU(DK), AAU(DK), TUDELFT(NL), TUTRENTO(I), DHI(DK), SINTEF(N), MARINTEK(N), MARIN(NL), NREL(USA), STATOIL(N), VESTAS(DK) and NENUPHAR(F). The report discusses the design considerations for offshore wind turbines, both in general and specifically for Darrieus-type floating turbines, as is the focus of the DeepWind project. The project is considered in a North Sea environment, notably close......

  20. Developmental Dynamics of General and School-Subject-Specific Components of Academic Self-Concept, Academic Interest, and Academic Anxiety.

    Science.gov (United States)

    Gogol, Katarzyna; Brunner, Martin; Preckel, Franzis; Goetz, Thomas; Martin, Romain

    2016-01-01

    The present study investigated the developmental dynamics of general and subject-specific (i.e., mathematics, French, and German) components of students' academic self-concept, anxiety, and interest. To this end, the authors integrated three lines of research: (a) hierarchical and multidimensional approaches to the conceptualization of each construct, (b) longitudinal analyses of bottom-up and top-down developmental processes across hierarchical levels, and (c) developmental processes across subjects. The data stemmed from two longitudinal large-scale samples (N = 3498 and N = 3863) of students attending Grades 7 and 9 in Luxembourgish schools. Nested-factor models were applied to represent each construct at each grade level. The analyses demonstrated that several characteristics were shared across constructs. All constructs were multidimensional in nature with respect to the different subjects, showed a hierarchical organization with a general component at the apex of the hierarchy, and had a strong separation between the subject-specific components at both grade levels. Further, all constructs showed moderate differential stabilities at both the general (0.42 < r < 0.55) and subject-specific levels (0.45 < r < 0.73). Further, little evidence was found for top-down or bottom-up developmental processes. Rather, general and subject-specific components in Grade 9 proved to be primarily a function of the corresponding components in Grade 7. Finally, change in several subject-specific components could be explained by negative effects across subjects.

  1. Developmental Dynamics of General and School-Subject-Specific Components of Academic Self-Concept, Academic Interest, and Academic Anxiety

    Directory of Open Access Journals (Sweden)

    Katarzyna eGogol

    2016-03-01

    Full Text Available The present study investigated the developmental dynamics of general and subject-specific (i.e., mathematics, French, and German) components of students’ academic self-concept, anxiety, and interest. To this end, the authors integrated three lines of research: (a) hierarchical and multidimensional approaches to the conceptualization of each construct, (b) longitudinal analyses of bottom-up and top-down developmental processes across hierarchical levels, and (c) ipsative developmental processes across subjects. The data stemmed from two longitudinal large-scale samples (N = 3,498 and N = 3,863) of students attending Grades 7 and 9 in Luxembourgish schools. Nested-factor models were applied to represent each construct at each grade level. The analyses demonstrated that several characteristics were shared across constructs. All constructs were multidimensional in nature with respect to the different subjects, showed a hierarchical organization with a general component at the apex of the hierarchy, and had a strong separation between the subject-specific components at both grade levels. Further, all constructs showed moderate differential stabilities at both the general (.42 < r < .55) and subject-specific levels (.45 < r < .73). Further, little evidence was found for top-down or bottom-up developmental processes. Rather, general and subject-specific components in Grade 9 proved to be primarily a function of the corresponding components in Grade 7. Finally, change in several subject-specific components could be explained by negative, ipsative effects across subjects.

  2. Developmental Dynamics of General and School-Subject-Specific Components of Academic Self-Concept, Academic Interest, and Academic Anxiety

    Science.gov (United States)

    Gogol, Katarzyna; Brunner, Martin; Preckel, Franzis; Goetz, Thomas; Martin, Romain

    2016-01-01

    The present study investigated the developmental dynamics of general and subject-specific (i.e., mathematics, French, and German) components of students' academic self-concept, anxiety, and interest. To this end, the authors integrated three lines of research: (a) hierarchical and multidimensional approaches to the conceptualization of each construct, (b) longitudinal analyses of bottom-up and top-down developmental processes across hierarchical levels, and (c) developmental processes across subjects. The data stemmed from two longitudinal large-scale samples (N = 3498 and N = 3863) of students attending Grades 7 and 9 in Luxembourgish schools. Nested-factor models were applied to represent each construct at each grade level. The analyses demonstrated that several characteristics were shared across constructs. All constructs were multidimensional in nature with respect to the different subjects, showed a hierarchical organization with a general component at the apex of the hierarchy, and had a strong separation between the subject-specific components at both grade levels. Further, all constructs showed moderate differential stabilities at both the general (0.42 < r < 0.55) and subject-specific levels (0.45 < r < 0.73). Further, little evidence was found for top-down or bottom-up developmental processes. Rather, general and subject-specific components in Grade 9 proved to be primarily a function of the corresponding components in Grade 7. Finally, change in several subject-specific components could be explained by negative effects across subjects. PMID:27014162

  3. Concept-guided development of classroom use of ICT : Concept-specific types of ICT use and their integration into teachers’ practices

    NARCIS (Netherlands)

    de Koster, S.

    2017-01-01

    Does a concept-guided approach in schools with either a ‘traditional’ or an ‘innovative’ educational concept contribute to the development of ICT use that becomes integrated in the teachers’ classroom practices? In order to answer this question we performed four, mainly qualitative, studies in five

  4. Friendship Predictors of Global Self-Worth and Domain-Specific Self-Concepts in University Students with and without Learning Disability

    Science.gov (United States)

    Shany, Michal; Wiener, Judith; Assido, Michal

    2013-01-01

    This study investigated the association among friendship, global self-worth, and domain-specific self-concepts in 102 university students with and without learning disabilities (LD). Students with LD reported lower global self-worth and academic self-concept than students without LD, and this difference was greater for women. Students with LD also…

  5. Development of carbon/carbon composite control rod for HTTR. 2. Concept, specifications and mechanical test of materials

    International Nuclear Information System (INIS)

    Eto, Motokuni; Ishiyama, Shintaro; Fukaya, Kiyoshi; Saito, Tamotsu; Ishihara, Masahiro; Hanawa, Satoshi.

    1998-01-01

    A concept and specifications of a carbon/carbon composite (C/C) control rod were proposed, aiming at the application of the material to the HTTR. The outer diameter and length of the control rod were kept the same as those of the present control rod, i.e., 113 mm and 3094 mm, respectively. According to the concept, the rod consists of ten units which are connected in series using bolts. The stresses generated by dead loads in the control rod elements were then estimated and compared with the design strengths, which were derived from the results of measurements of tensile, compressive, bending and shear strengths of two candidate materials, AC250 (Across Co.) and CX-270 (Toyo Tanso Co.). The design strength was preliminarily taken as one-third or one-fifth of the mean strength. The ratios of design strength to generated stress for AC250 (2D) were: 66 for tensile stress in the outer sleeve tube, 8.8 and 8.5 for tensile and shear stresses in the M16 bolt, and 2.43 for shear stress in the M8 plug support bolt. These results are believed to indicate the mechanical integrity of the control rod structure. Data available on the candidate materials were also compiled in the Appendix. (author)
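    A minimal Python sketch of the safety-margin arithmetic described above, using placeholder strength and stress values; only the reduction factors of one-third or one-fifth of the mean strength come from the abstract.

        # Safety-margin check in the spirit of the C/C control rod evaluation.
        # The numbers below are illustrative placeholders, not measured data.
        def design_strength(mean_strength_mpa, reduction_factor=3):
            """Design strength preliminarily taken as mean strength / 3 (or / 5)."""
            return mean_strength_mpa / reduction_factor

        def margin(mean_strength_mpa, generated_stress_mpa, reduction_factor=3):
            """Ratio of design strength to generated stress; > 1 indicates integrity."""
            return design_strength(mean_strength_mpa, reduction_factor) / generated_stress_mpa

        # Hypothetical 90 MPa mean tensile strength and 0.45 MPa dead-load stress
        # give a ratio of the order quoted for the outer sleeve tube.
        print(f"margin = {margin(90.0, 0.45):.1f}")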

  6. Insights into inner ear-specific gene regulation: epigenetics and non-coding RNAs in inner ear development and regeneration

    Science.gov (United States)

    Avraham, Karen B.

    2016-01-01

    The vertebrate inner ear houses highly specialized sensory organs, tuned to detect and encode sound, head motion and gravity. Gene expression programs under the control of transcription factors orchestrate the formation and specialization of the non-sensory inner ear labyrinth and its sensory constituents. More recently, epigenetic factors and non-coding RNAs emerged as an additional layer of gene regulation, both in inner ear development and disease. In this review, we provide an overview on how epigenetic modifications and non-coding RNAs, in particular microRNAs (miRNAs), influence gene expression and summarize recent discoveries that highlight their critical role in the proper formation of the inner ear labyrinth and its sensory organs. In contrast to non-mammalian vertebrates, adult mammals lack the ability to regenerate inner ear mechano-sensory hair cells. Finally, we discuss recent insights into how epigenetic factors and miRNAs may facilitate, or in the case of mammals, restrict sensory hair cell regeneration. PMID:27836639

  7. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  8. Non-coding changes cause sex-specific wing size differences between closely related species of Nasonia

    NARCIS (Netherlands)

    Loehlin, David W.; Oliveira, Deodoro C. S. G.; Edwards, Rachel; Giebel, Jonathan D.; Clark, Michael E.; Cattani, M. Victoria; van de Zande, Louis; Verhulst, Eveline C.; Beukeboom, Leo W.; Munoz-Torres, Monica; Werren, John H.

    The genetic basis of morphological differences among species is still poorly understood. We investigated the genetic basis of sex-specific differences in wing size between two closely related species of Nasonia by positionally cloning a major male-specific locus, wing-size1 (ws1). Male wing size

  9. The one-dimensional transport code CHET2, taking into account nonlinear, element-specific equilibrium sorption

    International Nuclear Information System (INIS)

    Luehrmann, L.; Noseck, U.

    1996-03-01

    While the verification report on CHET1 primarily focused on aspects such as the correctness of the algorithms used to model advection, dispersion and diffusion, the report in hand deals primarily with nonlinear sorption and its numerical modelling. Another aspect discussed is the correct treatment of decay within established radioactive decay chains. First, the physical fundamentals of the processes that determine radionuclide transport in the cap rock, and that therefore form the basis of the program, are explained. The numerical algorithms on which the CHET2 code is based are then described, showing the details of their implementation and the function of the various defaults and corrections. The iterative coupling of the transport and sorption computations is illustrated by means of a program flowchart. Furthermore, the activities for verification of the program are explained, as well as qualitative effects of computations assuming concentration-dependent sorption. The computation of decay within decay chains is verified, and example applications using nonlinear sorption isotherms, as well as the entire process of a transport calculation with CHET2, are shown. (orig./DG) [de
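    As a reminder of the general form of such a transport model, a one-dimensional advection-dispersion equation with nonlinear (e.g. Freundlich) equilibrium sorption and chain decay is commonly written as below; this is a generic textbook form, not necessarily the exact CHET2 formulation:

        \[ \theta R(c_i)\,\frac{\partial c_i}{\partial t} = \theta D\,\frac{\partial^2 c_i}{\partial x^2} - q\,\frac{\partial c_i}{\partial x} - \lambda_i\,\theta R(c_i)\,c_i + \lambda_{i-1}\,\theta R(c_{i-1})\,c_{i-1}, \qquad R(c_i) = 1 + \frac{\rho_b}{\theta}\,K_i\,n_i\,c_i^{\,n_i-1}, \]

    where c_i is the concentration of chain member i, θ the porosity, q the Darcy flux, D the dispersion coefficient, ρ_b the bulk density, λ_i the decay constant, and the retardation factor R follows from the Freundlich isotherm s_i = K_i c_i^{n_i}.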

  10. Colon specific CODES based Piroxicam tablet for colon targeting: statistical optimization, in vivo roentgenography and stability assessment.

    Science.gov (United States)

    Singh, Amit Kumar; Pathak, Kamla

    2015-03-01

    This study aimed to statistically optimize a CODES™ based Piroxicam (PXM) tablet for colon targeting. A 3² full factorial design was used for preparation of the core tablet, which was subsequently coated to obtain the CODES™ based tablet. The experimental design of the core tablets comprised two independent variables, the amounts of lactulose and PEG 6000, each at three levels, with the dependent variable being %CDR at 12 h. The core tablets were evaluated by pharmacopoeial and non-pharmacopoeial tests and coated with optimized levels of Eudragit E100, followed by HPMC K15 and finally Eudragit S100. The in vitro drug release study of F1-F9 was carried out by a change-over media method (0.1 N HCl buffer, pH 1.2; phosphate buffer, pH 7.4; and phosphate buffer, pH 6.8 with the enzyme β-galactosidase, 120 IU) to select the optimized formulation F9, which was subjected to in vivo roentgenography. The roentgenography study corroborated the in vitro performance, thus providing proof of concept. The experimental design was validated by an extra check-point formulation, and Diffuse Reflectance Spectroscopy revealed the absence of any interaction between the drug and formulation excipients. The shelf life of F9 was deduced as 12 months. Conclusively, colon-targeted CODES™ technology based PXM tablets were successfully optimized and their potential for colon targeting was validated by roentgenography.
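    As an illustration of the 3² design mentioned above, the short Python sketch below enumerates the nine factor-level combinations for the two independent variables; the coded levels are generic placeholders, not the actual amounts of lactulose and PEG 6000 used in the study.

        from itertools import product

        # Coded levels (-1, 0, +1) for the two factors of a 3^2 full factorial design.
        levels = {"lactulose": (-1, 0, 1), "PEG6000": (-1, 0, 1)}

        runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
        for i, run in enumerate(runs, start=1):   # nine runs, labelled F1 ... F9
            print(f"F{i}: lactulose={run['lactulose']:+d}, PEG6000={run['PEG6000']:+d}")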

  11. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  12. Histone modification profiles are predictive for tissue/cell-type specific expression of both protein-coding and microRNA genes

    Directory of Open Access Journals (Sweden)

    Zhang Michael Q

    2011-05-01

    Full Text Available Background: Gene expression is regulated at both the DNA sequence level and through modification of chromatin. However, the effect of chromatin on tissue/cell-type specific gene regulation (TCSR) is largely unknown. In this paper, we present a method to elucidate the relationship between histone modification/variation (HMV) and TCSR. Results: A classifier for differentiating CD4+ T cell-specific genes from housekeeping genes using HMV data was built. We found HMV in both promoter and gene body regions to be predictive of genes which are targets of TCSR. For example, the histone modification types H3K4me3 and H3K27ac were identified as the most predictive for CpG-related promoters, whereas H3K4me3 and H3K79me3 were the most predictive for nonCpG-related promoters. However, genes targeted by TCSR can be predicted using other types of HMVs as well. Such redundancy implies that multiple types of underlying regulatory elements, such as enhancers or intragenic alternative promoters, which can regulate gene expression in a tissue/cell-type specific fashion, may be marked by the HMVs. Finally, we show that the predictive power of HMV for TCSR is not limited to protein-coding genes in CD4+ T cells, as we successfully predicted TCSR targeted genes in muscle cells, as well as microRNA genes with expression specific to CD4+ T cells, by the same classifier which was trained on HMV data of protein-coding genes in CD4+ T cells. Conclusion: We have begun to understand the HMV patterns that guide gene expression in both a tissue/cell-type specific and a ubiquitous manner.
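    A minimal Python sketch of the kind of classifier described above, trained on a histone-modification feature matrix to separate tissue/cell-type specific from housekeeping genes; the data are randomly generated stand-ins and the mark names are illustrative only.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        marks = ["H3K4me3", "H3K27ac", "H3K79me3", "H3K36me3"]
        X = rng.normal(size=(500, len(marks)))     # stand-in HMV signal per gene and mark
        y = rng.integers(0, 2, size=500)           # 1 = tissue-specific, 0 = housekeeping

        clf = LogisticRegression(max_iter=1000)
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"cross-validated AUC on random stand-in data: {auc:.2f}")   # ~0.5 by construction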

  13. Ca(2+) coding and decoding strategies for the specification of neural and renal precursor cells during development.

    Science.gov (United States)

    Moreau, Marc; Néant, Isabelle; Webb, Sarah E; Miller, Andrew L; Riou, Jean-François; Leclerc, Catherine

    2016-03-01

    During embryogenesis, a rise in intracellular Ca(2+) is known to be a widespread trigger for directing stem cells towards a specific tissue fate, but the precise Ca(2+) signalling mechanisms involved in achieving these pleiotropic effects are still poorly understood. In this review, we compare the Ca(2+) signalling events that appear to be one of the first steps in initiating and regulating both neural determination (neural induction) and kidney development (nephrogenesis). We have highlighted the necessary and sufficient role played by Ca(2+) influx and by Ca(2+) transients in the determination and differentiation of pools of neural or renal precursors. We have identified new Ca(2+) target genes involved in neural induction and we showed that the same Ca(2+) early target genes studied are not restricted to neural tissue but are also present in other tissues, principally in the pronephros. In this review, we also described a mechanism whereby the transcriptional control of gene expression during neurogenesis and nephrogenesis might be directly controlled by Ca(2+) signalling. This mechanism involves members of the Kcnip family such that a change in their binding properties to specific DNA sites is a result of Ca(2+) binding to EF-hand motifs. The different functions of Ca(2+) signalling during these two events illustrate the versatility of Ca(2+) as a second messenger. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. The Influencing Factors of Cultural Knowledge in Translating Cultural Specific Concepts from Arabic into the English at Jazan University in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Amin Ali Almubark

    2017-01-01

    Full Text Available This study set out to explore and evaluate the importance of mastering cultural knowledge in the process of rendering culture-specific concepts between two languages, namely Arabic and English. The participants sampled in this study were a group of final-year Bachelor's degree students majoring in Translation at the ALAradha College of Jazan University. The findings statistically confirmed that the Translation students at the ALAradha College, Saudi Arabia, faced considerable difficulties in translating cultural concepts owing to inadequate mastery of knowledge of the cultures involved. Among the measures that can be taken to address these issues is training the learners through exposure to real cases involving culture-specific concepts, which may help them deal with such difficulties in the translation process.

  15. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flowrate distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure drops or flowrates, which may or may not vary with time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a one-channel, two-dimensional code. (authors)
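    The flow split among parallel channels coupled by a common pressure drop can be illustrated with a highly simplified Python sketch: assuming each channel obeys dP = k_i * Q_i**2, equal pressure drops and a fixed total flowrate give Q_i proportional to 1/sqrt(k_i). This is only a schematic of the balance such codes solve, not the CACTUS algorithm itself.

        import math

        def split_flow(total_flow, loss_coefficients):
            """Distribute total_flow over parallel channels with dP = k_i * Q_i**2,
            enforcing the same pressure drop across every channel."""
            weights = [1.0 / math.sqrt(k) for k in loss_coefficients]
            total_weight = sum(weights)
            flows = [total_flow * w / total_weight for w in weights]
            dp = loss_coefficients[0] * flows[0] ** 2   # identical for all channels
            return flows, dp

        flows, dp = split_flow(10.0, [2.0, 4.0, 8.0])   # made-up loss coefficients
        print([round(q, 3) for q in flows], round(dp, 2))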

  16. Comparison of 2015 Medicare relative value units for gender-specific procedures: Gynecologic and gynecologic-oncologic versus urologic CPT coding. Has time healed gender-worth?

    Science.gov (United States)

    Benoit, M F; Ma, J F; Upperman, B A

    2017-02-01

    In 1992, Congress implemented a relative value unit (RVU) payment system to set reimbursement for all procedures covered by Medicare. In 1997, data showed that a significant gender bias existed in reimbursement for gynecologic compared to urologic procedures. The present study was performed to compare work and total RVUs for gender-specific procedures effective January 2015 and to evaluate whether time has healed the gender-based RVU worth. Using the 2015 CPT codes, we compared work and total RVUs for 50 pairs of gender-specific procedures. We also evaluated 2015 procedure-related provider compensation. The groups were matched so that the procedures were anatomically similar. We also compared the 2015 RVU and fee schedules to those of 1997. Evaluation of work RVUs for the paired procedures revealed that in 36 cases (72%), the male procedure had a higher wRVU and tRVU than the paired female procedure. For total fee/reimbursement, 42 (84%) of the male-based procedures were compensated at a higher rate than the paired female procedures. On average, male-specific surgeries were reimbursed at a rate 27.67% higher than female-specific surgeries. Work RVUs for female procedures have increased only minimally from 1997 to 2015. Time and effort have trended towards resolution of some gender-related procedure-worth discrepancies, but there are still significant RVU and compensation differences that should be further reviewed and modified, as surgical time and effort are highly correlated. Copyright © 2016. Published by Elsevier Inc.

  17. Codes in the codons: construction of a codon/amino acid periodic table and a study of the nature of specific nucleic acid-protein interactions.

    Science.gov (United States)

    Benyo, B; Biro, J C; Benyo, Z

    2004-01-01

    The theory of "codon-amino acid coevolution" was first proposed by Woese in 1967. It suggests that there is a stereochemical matching - that is, affinity - between amino acids and certain of the base triplet sequences that code for those amino acids. We have constructed a common periodic table of codons and amino acids, where the nucleic acid table showed perfect axial symmetry for codons and the corresponding amino acid table also displayed periodicity regarding the biochemical properties (charge and hydrophobicity) of the 20 amino acids and the position of the stop signals. The table indicates that the middle (2/sup nd/) amino acid in the codon has a prominent role in determining some of the structural features of the amino acids. The possibility that physical contact between codons and amino acids might exist was tested on restriction enzymes. Many recognition site-like sequences were found in the coding sequences of these enzymes and as many as 73 examples of codon-amino acid co-location were observed in the 7 known 3D structures (December 2003) of endonuclease-nucleic acid complexes. These results indicate that the smallest possible units of specific nucleic acid-protein interaction are indeed the stereochemically compatible codons and amino acids.

  18. Contribution to the design and realisation of a specific circuit to code the information coming from the calorimeter of the LHC

    International Nuclear Information System (INIS)

    Chambert-Hermel, V.

    1996-01-01

    LHC (Large Hadron Collider) signals require a sampling system with a dynamic range in excess of 16 bits and 8-bit precision. The sampling frequency is 40 MHz. The use of a floating-point format that fits the precision of the calorimeter is proposed. The dynamic range is divided into 8 positive sub-ranges and 5 negative ones, and a conversion into an 8-plus-1 (sign) bit mantissa and a 4-bit exponent is therefore proposed. The design is built around three main blocks: a range converter which computes the three exponent bits and the sign, a set of amplifiers controlled by the range converter, and a classical 8-bit ADC for the mantissa. The main effort was concentrated on the range converter, as this is the most sensitive part of the architecture, which sees the whole dynamic range. To minimize the problems of perturbations on the signal and reference lines, we have chosen a fully differential sample-and-hold, differential latched comparators, and coding logic implemented in the AMS BiCMOS 1.2 micron technology. We present the floating-point format we use, the converter architecture, the elementary circuits and their design steps, the simulation results, the layout, and test results on prototypes. (author)
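    The floating-point compression principle can be sketched in software as follows: a wide-dynamic-range sample is mapped to a sign bit, a small exponent selecting a gain sub-range, and an 8-bit mantissa. This Python sketch is a purely illustrative model of the idea, not of the actual ASIC, and the handling of sub-ranges is simplified.

        def encode_float(sample, mantissa_bits=8, exponent_bits=4):
            """Map a signed integer sample to (sign, exponent, mantissa)."""
            sign = 0 if sample >= 0 else 1
            magnitude = abs(int(sample))
            exponent = 0
            # Right-shift until the magnitude fits in the mantissa width.
            while magnitude >= (1 << mantissa_bits) and exponent < (1 << exponent_bits) - 1:
                magnitude >>= 1
                exponent += 1
            return sign, exponent, magnitude

        def decode_float(sign, exponent, mantissa):
            value = mantissa << exponent
            return -value if sign else value

        s, e, m = encode_float(53_000)               # a 17-bit magnitude
        print(s, e, m, decode_float(s, e, m))        # decoded value keeps ~8-bit relative precision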

  19. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four…

  20. Concentration of acrylamide in a polyacrylamide gel affects VP4 gene coding assignment of group A equine rotavirus strains with P[12] specificity

    Science.gov (United States)

    2010-01-01

    Background: It is universally acknowledged that genome segment 4 of group A rotavirus, the major etiologic agent of severe diarrhea in infants and neonatal farm animals, encodes outer capsid neutralization and protective antigen VP4. Results: To determine which genome segment of three group A equine rotavirus strains (H-2, FI-14 and FI-23) with P[12] specificity encodes the VP4, we analyzed dsRNAs of strains H-2, FI-14 and FI-23 as well as their reassortants by polyacrylamide gel electrophoresis (PAGE) at varying concentrations of acrylamide. The relative position of the VP4 gene of the three equine P[12] strains varied (either genome segment 3 or 4) depending upon the concentration of acrylamide. The VP4 gene bearing P[3], P[4], P[6], P[7], P[8] or P[18] specificity did not exhibit this phenomenon when the PAGE running conditions were varied. Conclusions: The concentration of acrylamide in a PAGE gel affected VP4 gene coding assignment of equine rotavirus strains bearing P[12] specificity. PMID:20573245

  1. Self-Concept Predicts Academic Achievement Across Levels of the Achievement Distribution: Domain Specificity for Math and Reading.

    Science.gov (United States)

    Susperreguy, Maria Ines; Davis-Kean, Pamela E; Duckworth, Kathryn; Chen, Meichu

    2017-09-18

    This study examines whether self-concept of ability in math and reading predicts later math and reading attainment across different levels of achievement. Data from three large-scale longitudinal data sets, the Avon Longitudinal Study of Parents and Children, National Institute of Child Health and Human Development-Study of Early Child Care and Youth Development, and Panel Study of Income Dynamics-Child Development Supplement, were used to answer this question by employing quantile regression analyses. After controlling for demographic variables, child characteristics, and early ability, the findings indicate that self-concept of ability in math and reading predicts later achievement in each respective domain across all quantile levels of achievement. These results were replicated across the three data sets representing different populations and provide robust evidence for the role of self-concept of ability in understanding achievement from early childhood to adolescence across the spectrum of performance (low to high). © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.
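    Quantile regression of the kind used in this study can be run with statsmodels; the Python sketch below uses synthetic data and hypothetical variable names, simply to show how a self-concept coefficient can be estimated at several quantiles of the later-achievement distribution.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 1000
        df = pd.DataFrame({"self_concept": rng.normal(size=n),
                           "prior_math": rng.normal(size=n)})
        df["later_math"] = 0.3 * df.self_concept + 0.6 * df.prior_math + rng.normal(size=n)

        model = smf.quantreg("later_math ~ self_concept + prior_math", df)
        for q in (0.10, 0.50, 0.90):                 # low, median and high achievers
            fit = model.fit(q=q)
            print(f"q={q:.2f}  self-concept coefficient = {fit.params['self_concept']:.3f}")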

  2. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industry, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents of the two countries.

  3. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industry, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents of the two countries.
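    Once numerical boundaries exist, a classification rule of the kind derived in this work reduces to a simple threshold lookup; the Python sketch below uses entirely hypothetical activity-concentration limits rather than the study's RESRAD-derived values.

        # Hypothetical per-class activity-concentration limits in Bq/g.
        # These numbers are placeholders, NOT the criteria derived in the study.
        LIMITS = {"VSLW": 1e2, "LLW": 1e4, "ILW": 1e6}   # above the ILW limit -> HLW

        def classify(activity_bq_per_g):
            for waste_class, limit in LIMITS.items():
                if activity_bq_per_g <= limit:
                    return waste_class
            return "HLW"

        for activity in (50, 5e3, 5e5, 5e7):
            print(f"{activity:>12.0f} Bq/g -> {classify(activity)}")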

  4. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    International Nuclear Information System (INIS)

    Feinendegen, L.E.; Sondhaus, C.A.; Altman, K.I.

    1998-01-01

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive

  5. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    Energy Technology Data Exchange (ETDEWEB)

    Feinendegen, L.E. [Brookhaven National Lab., Upton, NY (United States). Medical Dept.; Bond, V.P. [Washington State Univ., Richland, WA (United States); Sondhaus, C.A. [Univ. of Arizona, Tucson, AZ (United States). Dept. of Radiology and Radiation Control Office; Altman, K.I. [Univ. of Rochester Medical Center, NY (United States). Dept. of Biochemistry and Biophysics

    1998-12-31

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive.
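    The microdosimetric bookkeeping invoked here relates the macroscopic absorbed dose to discrete energy-deposition events in individual cells; in standard notation (a generic summary, not the authors' derivation):

        \[ z = \frac{\varepsilon}{m}, \qquad \bar{n} = \frac{D}{\bar{z}_1}, \qquad P(n) = \frac{\bar{n}^{\,n}\,e^{-\bar{n}}}{n!}, \]

    where z is the specific energy imparted to a cell of mass m, \bar{z}_1 the mean specific energy per single event ('hit'), D the absorbed dose, and P(n) the Poisson probability that a given cell experiences n hits; at low doses \bar{n} is much smaller than 1, so most cells receive either no hit or exactly one.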

  6. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical…

  7. The complete mitochondrial genome of the land snail Cornu aspersum (Helicidae: Mollusca): intra-specific divergence of protein-coding genes and phylogenetic considerations within Euthyneura.

    Directory of Open Access Journals (Sweden)

    Juan Diego Gaitán-Espitia

    Full Text Available The complete sequences of three mitochondrial genomes from the land snail Cornu aspersum were determined. The mitogenome has a length of 14050 bp, and it encodes 13 protein-coding genes, 22 transfer RNA genes and two ribosomal RNA genes. It also includes nine small intergene spacers and a large AT-rich intergenic spacer. The intra-specific divergence analysis revealed that COX1 has the lowest genetic differentiation, while the most divergent genes were NADH1, NADH3 and NADH4. With the exception of Euhadra herklotsi, the structural comparisons showed the same gene order within the family Helicidae, and nearly identical gene organization to that found in the order Pulmonata. Phylogenetic reconstruction recovered Basommatophora as a polyphyletic group, and Eupulmonata and Pulmonata as paraphyletic groups. Bayesian and Maximum Likelihood analyses showed that C. aspersum is a close relative of Cepaea nemoralis and that, together with the other Helicidae species, they form a sister group to Albinaria caerulea, supporting the monophyly of the Stylommatophora clade.

  8. The Arabidopsis TOR Kinase Specifically Regulates the Expression of Nuclear Genes Coding for Plastidic Ribosomal Proteins and the Phosphorylation of the Cytosolic Ribosomal Protein S6.

    Science.gov (United States)

    Dobrenel, Thomas; Mancera-Martínez, Eder; Forzani, Céline; Azzopardi, Marianne; Davanture, Marlène; Moreau, Manon; Schepetilnikov, Mikhail; Chicher, Johana; Langella, Olivier; Zivy, Michel; Robaglia, Christophe; Ryabova, Lyubov A; Hanson, Johannes; Meyer, Christian

    2016-01-01

    Protein translation is an energy consuming process that has to be fine-tuned at both the cell and organism levels to match the availability of resources. The target of rapamycin kinase (TOR) is a key regulator of a large range of biological processes in response to environmental cues. In this study, we have investigated the effects of TOR inactivation on the expression and regulation of Arabidopsis ribosomal proteins at different levels of analysis, namely from transcriptomic to phosphoproteomic. TOR inactivation resulted in a coordinated down-regulation of the transcription and translation of nuclear-encoded mRNAs coding for plastidic ribosomal proteins, which could explain the chlorotic phenotype of the TOR silenced plants. We have identified in the 5' untranslated regions (UTRs) of this set of genes a conserved sequence related to the 5' terminal oligopyrimidine motif, which is known to confer translational regulation by the TOR kinase in other eukaryotes. Furthermore, the phosphoproteomic analysis of the ribosomal fraction following TOR inactivation revealed a lower phosphorylation of the conserved Ser240 residue in the C-terminal region of the 40S ribosomal protein S6 (RPS6). These results were confirmed by Western blot analysis using an antibody that specifically recognizes phosphorylated Ser240 in RPS6. Finally, this antibody was used to follow TOR activity in plants. Our results thus uncover a multi-level regulation of plant ribosomal genes and proteins by the TOR kinase.

  9. Farm-specific economic value of automatic lameness detection systems in dairy cattle: From concepts to operational simulations.

    Science.gov (United States)

    Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig

    2018-01-01

    Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of the on-farm prevalence. The simulations provide first estimates of the upper limits on purchase prices of automatic detection systems. The framework also allowed knowledge gaps obstructing more accurate estimation of economic value to be identified. These include insights into cost reductions due to early detection and treatment, and links between specific lameness causes and their related losses. Because this model provides insight into the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
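    The operational model couples detection performance to avoided lameness losses and discounts them over the system's lifespan. The Python sketch below shows only the net-present-value backbone of such a calculation, with invented numbers for herd size, benefits and prices.

        def npv_detection_system(purchase_price, annual_net_benefit, lifespan_years, discount_rate):
            """Net present value of an automatic lameness detection system;
            annual_net_benefit = avoided lameness losses minus running costs."""
            discounted = sum(annual_net_benefit / (1.0 + discount_rate) ** t
                             for t in range(1, lifespan_years + 1))
            return discounted - purchase_price

        # Invented example: 150-cow herd, 40 euro net benefit per cow-year,
        # 10-year lifespan, 5% discount rate, 25 000 euro purchase price.
        print(round(npv_detection_system(25_000, 150 * 40, 10, 0.05), 2))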

  10. New concepts of fluorescent probes for specific detection of DNA sequences: bis-modified oligonucleotides in excimer and exciplex detection.

    Science.gov (United States)

    Gbaj, A; Bichenkova, Ev; Walsh, L; Savage, He; Sardarian, Ar; Etchells, Ll; Gulati, A; Hawisa, S; Douglas, Kt

    2009-12-01

    The detection of single base mismatches in DNA is important for diagnostics, treatment of genetic diseases, and identification of single nucleotide polymorphisms. Highly sensitive, specific assays are needed to investigate genetic samples from patients. The use of a simple fluorescent nucleoside analogue in detection of DNA sequence and point mutations by hybridisation in solution is described in this study. The 5'-bispyrene and 3'-naphthalene oligonucleotide probes form an exciplex on hybridisation to target in water, and the 5'-bispyrene oligonucleotide alone is an adequate probe to determine the concentration of target present. It was also indicated that this system has the potential to identify mismatches and insertions. The aim of this work was to investigate experimental structures and conditions that permit strong exciplex emission for nucleic acid detectors, and to show how such exciplexes can register the presence of mismatches as required in SNP analysis. This study revealed that the hybridisation of the 5'-bispyrenyl fluorophore to a DNA target results in the formation of a fluorescent probe with a high signal intensity change and specificity for detecting a complementary target in a homogeneous system. Detection of SNP mutations using this split-probe system is a highly specific, simple, and accessible method that meets the rigorous requirements of pharmacogenomic studies. Thus, it is possible for the system to act as an SNP detector, and it shows promise for future applications in genetic testing.

  11. Designing Of The Concept Of Criminal Income Of Other Persons Legalization (Art. 174 Of The Criminal Code Of The Russian Federation) In The Context Of The Systems Theory

    Directory of Open Access Journals (Sweden)

    Veronika A. Abakanova

    2015-03-01

    Full Text Available In the present article, the assumptions of the forensic theoretical concept of crime, which were developed in forensic science, are analyzed. The study shows that systems theory can be used for the construction of a forensic concept of crime, and the author extends this approach to the laundering of criminal income by other persons (alius). The forensic structure of money laundering by other persons is examined, and the elements of that structure, their internal and external relationships and patterns, and the mechanisms that ensure its integrity are described. The following set of elements is proposed: the object of the direct attack, the subject of the attack, the physical activity of the subject of the laundering, the mental activity of the subject, the facts and consequences of the laundering, its time and place, and the public danger and wrongfulness of the act. The author argues that these starting positions allow money laundering to be considered a complex system with a 'mechanism' ensuring its integrity and diverse types of bonds, and notes that the absence of even one of these elements destroys the crime as a system. In conclusion, the author proposes a forensic concept of money laundering by other persons; the presented structure and the description of its elements, together with their internal and external relationships and patterns, give practitioners a full picture of the crime and thus enable them to navigate freely in the initial information about the crime and to understand it correctly.

  12. New Concepts of Fluorescent Probes for Specific Detection of DNA Sequences: Bis-Modified Oligonucleotides in Excimer and Exciplex Detection

    Directory of Open Access Journals (Sweden)

    Gbaj A

    2009-01-01

    Full Text Available The detection of single base mismatches in DNA is important for diagnostics, treatment of genetic diseases, and identification of single nucleotide polymorphisms. Highly sensitive, specific assays are needed to investigate genetic samples from patients. The use of a simple fluorescent nucleoside analogue in detection of DNA sequence and point mutations by hybridisation in solution is described in this study. The 5’-bispyrene and 3’-naphthalene oligonucleotide probes form an exciplex on hybridisation to target in water and the 5’-bispyrene oligonucleotide alone is an adequate probe to determine concentration of target present. It was also indicated that this system has a potential to identify mismatches and insertions. The aim of this work was to investigate experimental structures and conditions that permit strong exciplex emission for nucleic acid detectors, and show how such exciplexes can register the presence of mismatches as required in SNP analysis. This study revealed that the hybridisation of 5'-bispyrenyl fluorophore to a DNA target results in formation of a fluorescent probe with high signal intensity change and specificity for detecting a complementary target in a homogeneous system. Detection of SNP mutations using this split-probe system is a highly specific, simple, and accessible method to meet the rigorous requirements of pharmacogenomic studies. Thus, it is possible for the system to act as SNP detectors and it shows promise for future applications in genetic testing.

  13. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images

    Science.gov (United States)

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed to the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10⁸ primary particles a 2% average difference with respect to the kernel convolution method was achieved; such difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image
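    Voxel-level comparison of two absorbed-dose maps, as performed here between FLUKA and the reference tools, reduces to an element-wise relative difference over a region of interest; a minimal numpy sketch with random stand-in arrays is:

        import numpy as np

        def mean_relative_difference(dose_test, dose_ref, threshold_fraction=0.01):
            """Average voxel-wise |relative difference| in %, restricted to voxels whose
            reference dose exceeds a small fraction of the maximum (to avoid division noise)."""
            mask = dose_ref > threshold_fraction * dose_ref.max()
            rel = (dose_test[mask] - dose_ref[mask]) / dose_ref[mask]
            return 100.0 * np.abs(rel).mean()

        rng = np.random.default_rng(0)
        reference = rng.random((64, 64, 64)) + 0.1                   # stand-in dose map
        test = reference * rng.normal(1.0, 0.02, reference.shape)    # map with ~2% noise
        print(f"mean |relative difference| = {mean_relative_difference(test, reference):.2f}%")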

  14. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct
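    The prime codes at the heart of the book are built from prime sequences over GF(p): for a prime p, codeword j places a single pulse in each of p blocks of length p, at position (i*j mod p) within block i. The Python sketch below follows this textbook construction rather than any specific variant treated in the book.

        def prime_code_family(p):
            """Binary prime codes of length p*p and weight p, one codeword per j in GF(p)."""
            family = []
            for j in range(p):
                word = [0] * (p * p)
                for i in range(p):
                    word[i * p + (i * j) % p] = 1    # one pulse per block of length p
                family.append(word)
            return family

        for word in prime_code_family(5):            # p = 5: five codewords, length 25, weight 5
            print("".join(map(str, word)))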

  15. Comprehensive search for intra- and inter-specific sequence polymorphisms among coding envelope genes of retroviral origin found in the human genome: genes and pseudogenes

    Directory of Open Access Journals (Sweden)

    Vasilescu Alexandre

    2005-09-01

    Full Text Available Background: The human genome carries a high load of proviral-like sequences, called Human Endogenous Retroviruses (HERVs), which are the genomic traces of ancient infections by active retroviruses. These elements are in most cases defective, but open reading frames can still be found for the retroviral envelope gene, with sixteen such genes identified so far. Several of them are conserved during primate evolution, having possibly been co-opted by their host for a physiological role. Results: To characterize further their status, we presently sequenced 12 of these genes from a panel of 91 Caucasian individuals. Genomic analyses reveal strong sequence conservation (only two non-synonymous Single Nucleotide Polymorphisms [SNPs]) for the two HERV-W and HERV-FRD envelope genes, i.e. for the two genes specifically expressed in the placenta and possibly involved in syncytiotrophoblast formation. We further show – using an ex vivo fusion assay for each allelic form – that none of these SNPs impairs the fusogenic function. The other envelope proteins disclose variable polymorphisms, with the occurrence of a stop codon and/or frameshift for most – but not all – of them. Moreover, the sequence conservation analysis of the orthologous genes that can be found in primates shows that three env genes have been maintained in a fully coding state throughout evolution including envW and envFRD. Conclusion: Altogether, the present study strongly suggests that some but not all envelope encoding sequences are bona fide genes. It also provides new tools to elucidate the possible role of endogenous envelope proteins as susceptibility factors in a number of pathologies where HERVs have been suspected to be involved.

  16. Lateral Concepts

    DEFF Research Database (Denmark)

    Gad, Christopher; Bruun Jensen, casper

    2016-01-01

    This essay discusses the complex relation between the knowledges and practices of the researcher and his/her informants in terms of lateral concepts. The starting point is that it is not the prerogative of the (STS) scholar to conceptualize the world; all our "informants" do it too. This creates the possibility of enriching our own conceptual repertoires by letting them be inflected by the concepts of those we study. In a broad sense, the lateral means that there is a many-to-many relation between domains of knowledge and practice. However, each specific case of the lateral is necessarily immanent to a particular empirical setting and form of inquiry. In this sense lateral concepts are radically empirical, since they locate concepts within the field. To clarify the meaning and stakes of lateral concepts, we first make a contrast between lateral anthropology and Latour’s notion of infra-reflexivity. We end…

  17. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
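    As a toy instance of the block codes treated in the early chapters, the Python sketch below encodes 4 data bits with a Hamming(7,4) code and corrects a single bit error by syndrome decoding.

        import numpy as np

        G = np.array([[1, 0, 0, 0, 1, 1, 0],       # generator matrix (systematic form)
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])
        H = np.array([[1, 1, 0, 1, 1, 0, 0],       # parity-check matrix, H @ G.T = 0 (mod 2)
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])

        data = np.array([1, 0, 1, 1])
        codeword = data @ G % 2
        received = codeword.copy()
        received[2] ^= 1                            # flip one bit

        syndrome = H @ received % 2
        # The syndrome equals the column of H at the flipped position.
        error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
        received[error_pos] ^= 1
        print(np.array_equal(received, codeword))   # True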

  18. Outcomes important to burns patients during scar management and how they compare to the concepts captured in burn-specific patient reported outcome measures.

    Science.gov (United States)

    Jones, Laura L; Calvert, Melanie; Moiemen, Naiem; Deeks, Jonathan J; Bishop, Jonathan; Kinghorn, Philip; Mathers, Jonathan

    2017-12-01

    Pressure garment therapy (PGT) is an established treatment for the prevention and treatment of hypertrophic scarring; however, there is limited evidence for its effectiveness. Burn survivors often experience multiple issues many of which are not adequately captured in current PGT trial measures. To assess the effectiveness of PGT it is important to understand what outcomes matter to patients and to consider whether patient-reported outcome measures (PROMs) can be used to ascertain the effect of treatments on patients' health-related quality of life. This study aimed to (a) understand the priorities and perspectives of adult burns patients and the parents of burns patients who have experienced PGT via in-depth qualitative data, and (b) compare these with the concepts captured within burn-specific PROMs. We undertook 40 semi-structured interviews with adults and parents of paediatric and adolescent burns patients who had experienced PGT to explore their priorities and perspectives on scar management. Interviews were audio-recorded, transcribed and thematically analysed. The outcomes interpreted within the interview data were then mapped against the concepts captured within burn-specific PROMs currently in the literature. Eight core outcome domains were identified as important to adult patients and parents: (1) scar characteristics and appearance, (2) movement and function, (3) scar sensation, (4) psychological distress, adjustments and a sense of normality, (5) body image and confidence, (6) engagement in activities, (7) impact on relationships, and (8) treatment burden. The outcome domains presented reflect a complex holistic patient experience of scar management and treatments such as PGT. Some currently available PROMs do capture the concepts described here, although none assess psychological adjustments and attainment of a sense of normality following burn injury. The routine use of PROMs that represent patient experience and their relative contribution to trial

  19. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  20. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
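    Unique decipherability, the baseline condition that coding partitions weaken, can be tested mechanically with the Sardinas-Patterson algorithm; a compact Python sketch follows.

        def is_uniquely_decipherable(code):
            """Sardinas-Patterson test: the code is UD iff no codeword ever appears
            among the sets of dangling suffixes."""
            code = set(code)

            def dangling(left, right):
                return {b[len(a):] for a in left for b in right
                        if b.startswith(a) and len(b) > len(a)}

            current = dangling(code, code)
            seen = set()
            while current:
                if current & code:
                    return False                    # a codeword is a dangling suffix
                if current <= seen:
                    return True                     # no new suffixes can appear
                seen |= current
                current = dangling(current, code) | dangling(code, current)
            return True

        print(is_uniquely_decipherable({"0", "01", "11"}))   # True (its reversal is prefix-free)
        print(is_uniquely_decipherable({"0", "01", "10"}))   # False ("010" has two parsings)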

  1. Social phobia, anxiety, oppositional behavior, social skills, and self-concept in children with specific selective mutism, generalized selective mutism, and community controls.

    Science.gov (United States)

    Cunningham, Charles E; McHolm, Angela E; Boyle, Michael H

    2006-08-01

    We compared social phobia, anxiety, oppositional behavior, social skills, and self-concept in three groups: (1) 28 children with specific mutism (who did not speak to teachers but were more likely to speak to parents and peers at home and school); (2) 30 children with generalized mutism (whose speaking was restricted primarily to their homes); and (3) 52 community controls. Children with generalized mutism evidenced higher anxiety at school, and more separation anxiety, OCD, and depressive symptoms at home. Parents and teachers reported that the social phobia and anxiety scores of children in both the specific and generalized mutism subgroups were higher than controls. Children in both the specific and generalized mutism groups evidenced greater deficits in verbal and nonverbal social skills at home and school than controls. Teachers and parents did not report differences in nonverbal measures of social cooperation and conflict resolution and we found no evidence that selective mutism was linked to an increase in externalizing problems such as oppositional behavior or ADHD. Although children with specific mutism speak in a wider range of situations and appear less anxious to their teachers than children with generalized mutism, significant socially phobic behavior and social skills deficits are present in both groups.

  2. The materials programme for the high-temperature gas-cooled reactor in the Federal Republic of Germany: Status of the development of high-temperature materials, integrity concept, and design codes

    International Nuclear Information System (INIS)

    Nickel, H.; Bodmann, E.; Seehafer, H.J.

    1990-01-01

    During the last 15 years, the research and development of materials for high temperature gas-cooled reactor (HTGR) applications in the Federal Republic of Germany have been concentrated on the qualification of high-temperature structural alloys. Such materials are required for heat exchanger components of advanced HTGRs supplying nuclear process heat in the temperature range between 750 deg. and 950 deg. C. The suitability of the candidate alloys for service in the HTGR has been established, and continuing research is aimed at verification of the integrity of components over the envisaged service lifetimes. The special features of the HTGR which provide a high degree of safety are the use of ceramics for the core construction and the low power density of the core. The reactor integrity concept which has been developed is based on these two characteristics. Previously, technical guidelines and design codes for nuclear plants were tailored exclusively to light water reactor systems. An extensive research project was therefore initiated which led to the formulation of the basic principles on which a high temperature design code can be based. (author)

  3. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  4. Specification of a test problem for HYDROCOIN [Hydrologic Code Intercomparison] Level 3 Case 2: Sensitivity analysis for deep disposal in partially saturated, fractured tuff

    International Nuclear Information System (INIS)

    Prindle, R.W.

    1987-08-01

    The international Hydrologic Code Intercomparison Project (HYDROCOIN) was formed to evaluate hydrogeologic models and computer codes and their use in performance assessment for high-level radioactive waste repositories. Three principal activities in the HYDROCOIN Project are Level 1, verification and benchmarking of hydrologic codes; Level 2, validation of hydrologic models; and Level 3, sensitivity and uncertainty analyses of the models and codes. This report presents a test case defined for the HYDROCOIN Level 3 activity to explore the feasibility of applying various sensitivity-analysis methodologies to a highly nonlinear model of isothermal, partially saturated flow through fractured tuff, and to develop modeling approaches to implement the methodologies for sensitivity analysis. These analyses involve an idealized representation of a repository sited above the water table in a layered sequence of welded and nonwelded, fractured, volcanic tuffs. The analyses suggested here include one-dimensional, steady flow; one-dimensional, nonsteady flow; and two-dimensional, steady flow. Performance measures to be used to evaluate model sensitivities are also defined; the measures are related to regulatory criteria for containment of high-level radioactive waste. 14 refs., 5 figs., 4 tabs

  5. The Drosophila genes CG14593 and CG30106 code for G-protein-coupled receptors specifically activated by the neuropeptides CCHamide-1 and CCHamide-2

    DEFF Research Database (Denmark)

    Hansen, Karina K; Hauser, Frank; Williamson, Michael

    2011-01-01

    Recently, a novel neuropeptide, CCHamide, was discovered in the silkworm Bombyx mori (L. Roller et al., Insect Biochem. Mol. Biol. 38 (2008) 1147-1157). We have now found that all insects with a sequenced genome have two genes, each coding for a different CCHamide, CCHamide-1 and -2. We have also...

  6. Subject-specific cardiovascular system model-based identification and diagnosis of septic shock with a minimally invasive data set: animal experiments and proof of concept

    International Nuclear Information System (INIS)

    Geoffrey Chase, J; Starfinger, Christina; Hann, Christopher E; Lambermont, Bernard; Ghuysen, Alexandre; Kolh, Philippe; Dauby, Pierre C; Desaive, Thomas; Shaw, Geoffrey M

    2011-01-01

    A cardiovascular system (CVS) model and parameter identification method have previously been validated for identifying different cardiac and circulatory dysfunctions in simulation and using porcine models of pulmonary embolism, hypovolemia with PEEP titrations and induced endotoxic shock. However, these studies required both left and right heart catheters to collect the data required for subject-specific monitoring and diagnosis—a maximally invasive data set in a critical care setting, although it does occur in practice. Hence, use of this model-based diagnostic would require significant additional invasive sensors for some subjects, which is unacceptable in some, if not all, cases. The main goal of this study is to prove the concept of using only measurements from one side of the heart (right) in a 'minimal' data set to identify an effective patient-specific model that can capture key clinical trends in endotoxic shock. This research extends existing methods to a reduced and minimal data set requiring only a single catheter and reducing the risk of infection and other complications—a very common, typical situation in critical care patients, particularly after cardiac surgery. The extended methods, and the assumptions on which they are founded, are developed and presented in a case study of the identification of pig-specific parameters in an animal model of induced endotoxic shock. This case study is used to define the impact of this minimal data set on the quality and accuracy of the model application for monitoring, detecting and diagnosing septic shock. Six anesthetized healthy pigs weighing 20–30 kg received a 0.5 mg kg⁻¹ endotoxin infusion over a period of 30 min from T0 to T30. For this research, only right heart measurements were obtained. Errors for the identified model are within 8% when the model is identified from data, re-simulated and then compared to the experimentally measured data, including measurements not used in the
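
    The core step of such a study is fitting model parameters to a subject's measurements. The sketch below illustrates that general idea only, using a made-up two-parameter exponential model and SciPy's nonlinear least squares; it is not the CVS model or the identification method of the paper, and all numbers are invented.

    # Generic illustration of model-based parameter identification: fit the two
    # parameters of a toy exponential-decay model to noisy synthetic "measurements".
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, amplitude, rate):
        return amplitude * np.exp(-rate * t)

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 50)
    true_params = (2.0, 0.35)
    measured = model(t, *true_params) + rng.normal(0.0, 0.02, t.size)

    estimated, _ = curve_fit(model, t, measured, p0=(1.0, 0.1))
    errors = 100.0 * np.abs(estimated - np.array(true_params)) / np.array(true_params)
    print("identified parameters:", estimated, "percent errors:", errors)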

  7. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  8. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)

  9. Ethical codes in business practice

    OpenAIRE

    Kobrlová, Marie

    2013-01-01

    The diploma thesis discusses the issues of ethics and codes of ethics in business. The theoretical part defines basic concepts of ethics, presents its historical development and the methods and tools of business ethics. It also focuses on ethical codes and the area of law and ethics. The practical part consists of a quantitative survey, which provides views of selected business entities of business ethics and the use of codes of ethics in practice.

  10. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  11. QR CODE IN LIBRARY PRACTICE SOME EXAMPLES

    OpenAIRE

    Ajay Shanker Mishra*, Sachin Kumar Umre, Pavan Kumar Gupta

    2017-01-01

    Quick Response (QR) code is one such technology which can cater to the user demand for access to resources through mobile devices. The main objective of this article is to review the concept of the Quick Response code (QR code) and describe the practice of reading and generating QR codes. The paper covers the basic concept, structure, and technological pros and cons of the QR code. The literature is filled with potential uses for Quick Response (QR) codes in library practices like e-resour...
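
    Generating a QR code for a catalogue record takes only a few lines in most languages. The sketch below uses the third-party Python package qrcode (installable as "qrcode[pil]"); the catalogue URL is purely hypothetical.

    # Minimal QR code generation for a (hypothetical) library catalogue record.
    import qrcode

    url = "https://library.example.org/catalogue/record/12345"  # illustrative URL
    img = qrcode.make(url)            # returns a PIL image encoding the URL
    img.save("record_12345_qr.png")   # print the image on a shelf label or poster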

  12. Preliminary Safety Analysis of the Gorleben Site: Safety Concept and Application to Scenario Development Based on a Site-Specific Features, Events and Processes (FEP) Database - 13304

    Energy Technology Data Exchange (ETDEWEB)

    Moenig, Joerg; Beuth, Thomas; Wolf, Jens [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Theodor-Heuss-Str. 4, D-38122 Braunschweig (Germany); Lommerzheim, Andre [DBE TECHNOLOGY GmbH, Eschenstr. 55, D-31224 Peine (Germany); Mrugalla, Sabine [Federal Institute for Geosciences and Natural Resources (BGR), Stilleweg 2, D-30655 Hannover (Germany)

    2013-07-01

    Based upon the German safety criteria, released in 2010 by the Federal Ministry of the Environment (BMU), a safety concept and a safety assessment concept for the disposal of heat-generating high-level waste have both been developed in the framework of the preliminary safety case for the Gorleben site (Project VSG). The main objective of the disposal is to contain the radioactive waste inside a defined rock zone, which is called the containment-providing rock zone. The radionuclides shall remain essentially at the emplacement site, and at the most, a small defined quantity of material shall be able to leave this rock zone. This shall be accomplished by the geological barrier and a technical barrier system, which is required to seal the inevitable penetration of the geological barrier by the construction of the mine. The safe containment has to be demonstrated for probable and less probable evolutions of the site, while evolutions with very low probability (less than 1 % over the demonstration period of 1 million years) need not be considered. Owing to the uncertainty in predicting the real evolution of the site, plausible scenarios have been derived in a systematic manner. Therefore, a comprehensive site-specific features, events and processes (FEP) data base for the Gorleben site has been developed. The safety concept was directly taken into account, e.g. by identification of FEP with direct influence on the barriers that provide the containment. No effort was spared to identify the interactions of the FEP, their probabilities of occurrence, and their characteristics (values). The information stored in the data base provided the basis for the development of scenarios. The scenario development methodology is based on FEP related to an impairment of the functionality of a subset of barriers, called initial barriers. By taking these FEP into account in their probable characteristics, the reference scenario is derived. Thus, the reference scenario describes a

  13. Computer codes for ventilation in nuclear facilities

    International Nuclear Information System (INIS)

    Mulcey, P.

    1987-01-01

    In this paper the authors present some computer codes, developed in recent years, for ventilation and radiation protection. These codes are used for safety analysis in the design, operation and dismantling of nuclear facilities. The authors present in particular: the DACC1 code, used for aerosol deposition in the sampling circuits of radiation monitors; the PIAF code, used for modelling complex ventilation systems; and the CLIMAT 6 code, used for the optimization of air conditioning systems [fr

  14. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  15. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.
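
    To make the "local code" idea concrete, the sketch below performs hard-decision syndrome decoding of the (7,4) Hamming code, one of the component codes named above. It is only a toy single-error corrector, not the Ashikhmin-Lytsin MAP decoder or the GLDPC construction proposed in the paper.

    # Hard-decision syndrome decoding of the (7,4) Hamming code, a typical
    # component ("local") code in GLDPC constructions.
    import numpy as np

    # Parity-check matrix whose i-th column is the binary representation of i+1.
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def decode(word):
        """Correct at most one bit error in a length-7 binary word."""
        word = np.array(word) % 2
        syndrome = H @ word % 2
        pos = int("".join(map(str, syndrome[::-1])), 2)  # syndrome read as a position
        if pos:                                          # non-zero: flip that bit
            word[pos - 1] ^= 1
        return word

    codeword = np.array([0, 1, 1, 0, 0, 1, 1])           # a valid Hamming codeword
    received = codeword.copy()
    received[4] ^= 1                                     # introduce one bit error
    print(decode(received))                              # recovers [0 1 1 0 0 1 1]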

  16. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  17. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  18. Coding interview questions concepts, problems, interview questions

    CERN Document Server

    Karumanchi, Narasimha

    2016-01-01

    Peeling Data Structures and Algorithms: * Programming puzzles for interviews * Campus Preparation * Degree/Masters Course Preparation * Instructor’s * GATE Preparation * Big job hunters: Microsoft, Google, Amazon, Yahoo, Flip Kart, Adobe, IBM Labs, Citrix, Mentor Graphics, NetApp, Oracle, Webaroo, De-Shaw, Success Factors, Face book, McAfee and many more * Reference Manual for working people

  19. Concept Mapping zur Unterstützung der differentialdiagnostischen Hypothesenbildung im fallbasierten Online-Lernsystem CASUS: Qualitative Verbesserung der Diagnosefindung durch ICD-10 Kodierung [Concept mapping for supporting the differential diagnostic generation of hypotheses in the case-based online learning system CASUS: Qualitative improvement of diagnostic performance through ICD-10 coding]

    Directory of Open Access Journals (Sweden)

    Kernt, Marcus

    2008-08-01

    Full Text Available [english] Introduction: Concept mapping tools have long been established in medical education as an aid for visualizing learning processes in computer-based programs. The case-based learning system CASUS with its mapping tool for visualizing the differential diagnostic reasoning process is an example. It was shown that such tools are well accepted by users and lead to an increased number of diagnostic hypotheses being visualized as maps. However, there is scarce evidence on the quality of user-generated diagnostic hypotheses. This study examines the quality of diagnostic hypotheses obtained with CASUS and whether the quality can be improved through ICD-10 coding as compared with an expert's solution. Methods: We randomized 192 third-year medical students at the University of Munich into two groups. The students worked in groups of two on one computer. Group A was asked to code their diagnostic hypotheses with an ICD-10 coding browser before entering them into the mapping tool. Group B generated their hypotheses without prior ICD-10 coding. The differential diagnostic reasoning visualizations were analyzed quantitatively and qualitatively. An expert solution was used as reference. Results: Eighty-seven differential diagnoses were evaluated. Group A, using ICD-10 coding, made the correct and precise diagnosis of malaria tropica significantly more often than Group B (p < 0.05). For additional alternative diagnostic hypotheses, no quantitative or qualitative differences were detected. Conclusions: ICD-10 coding in connection with a mapping tool supporting the diagnostic reasoning process improved the accuracy of diagnostic performance in third-year medical students in the case of malaria tropica. [german] Introduction: The use of concept mapping tools in computer-based learning programs is well established in medical education: it has been shown that these tools for visualizing differential diagnoses are well accepted by users

  20. Time-of-flight data acquisition unit (DAU) for neutron scattering experiments. Specification of the requirements and design concept. Version 3.1

    International Nuclear Information System (INIS)

    Herdam, G.; Klessmann, H.; Wawer, W.; Adebayo, J.; David, G.; Szatmari, F.

    1989-12-01

    This specification describes the requirements for the Data Acquisition Unit (DAU) and defines the design concept for the functional units involved. The Data Acquisition Unit will be used in the following neutron scattering experiments: Time-of-Flight Spectrometer NEAT, Time-of-Flight Spectrometer SPAN. In addition, the data of the SPAN spectrometer in Spin Echo experiments will be accumulated. The Data Acquisition Unit can be characterised by the following requirements: Time-of-flight measurement with high time resolution (125 ns), sorting the time-of-flight into up to 4096 time channels (channel width ≥ 1 μs), selection of different time channel widths for peak and background, on-line time-of-flight correction for neutron flight paths of different lengths, sorting the detector position information into up to 4096 position channels, accumulation of two-dimensional spectra in a 32 Mbyte RAM memory (4 K time channels*4 K position channels*16 bits). Because of the stringent timing requirements the functional units of the DAU are hardware controlled via tables. The DAU is part of a process control system which has access to the functional units via the VMEbus in order to initialise, to load tables and control information, and to read status information and spectra. (orig.) With 18 figs
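
    The sorting task described above (binning events into time channels with an on-line flight-path correction) can be sketched in a few lines of NumPy. In the DAU itself this is done in hardware via pre-loaded tables; the reference path length and the event values below are illustrative assumptions.

    # Software sketch of the DAU sorting step: correct each event's time of
    # flight to a common flight path, then accumulate a 4096 x 4096 spectrum
    # of 16-bit counters (4 K x 4 K x 16 bits = 32 Mbyte, as in the specification).
    import numpy as np

    N_TIME_CHANNELS = 4096
    N_POSITION_CHANNELS = 4096
    CHANNEL_WIDTH_US = 1.0            # channel width >= 1 microsecond
    REFERENCE_PATH_M = 2.50           # nominal flight path (assumed value)

    def sort_events(tof_us, path_m, position_channel):
        """Return a 2-D (position x time) spectrum from raw event lists."""
        corrected = np.asarray(tof_us) * (REFERENCE_PATH_M / np.asarray(path_m))
        t_bins = np.clip((corrected / CHANNEL_WIDTH_US).astype(int),
                         0, N_TIME_CHANNELS - 1)
        spectrum = np.zeros((N_POSITION_CHANNELS, N_TIME_CHANNELS), dtype=np.uint16)
        np.add.at(spectrum, (np.asarray(position_channel), t_bins), 1)
        return spectrum

    spec = sort_events(tof_us=[812.3, 640.0, 655.1],
                       path_m=[2.50, 2.61, 2.61],
                       position_channel=[10, 117, 117])
    print(spec.sum(), spec[117].nonzero()[0])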

  1. Trust, Personal Moral Codes, and the Resource-Advantage Theory of Competition: Explaining Productivity, Economic Growth, and Wealth Creation

    Directory of Open Access Journals (Sweden)

    Shelby D. Hunt

    2012-06-01

    Full Text Available Scholars agree that societal-level moral codes that promote social trust also promote wealth creation. However, what specific kinds of societal-level moral codes promote social trust? Also, by what specific kind of competitive process does social trust promote wealth creation? Because societal-level moral codes are composed of or formed from people's personal moral codes, this article explores a theory of ethics, known as the "Hunt-Vitell" theory of ethics, that illuminates the concept of personal moral codes and uses the theory to discuss which types of personal moral codes foster trust and distrust in society. This article then uses resource-advantage (R-A) theory, one of the most completely articulated dynamic theories of competition, to show the process by which trust-promoting, societal-level moral codes promote productivity and economic growth. That is, they promote wealth creation.

  2. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  3. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  4. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...
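
    The flavour of index coding is easy to show with a binary toy case (this is only an illustration of the problem setting, not the lattice construction studied in the paper): with three messages and receivers that each already know two of them, a single XOR broadcast lets every receiver recover what it is missing.

    # Toy binary index coding: each receiver wants all K = 3 messages and
    # already holds two of them as side information, so one XOR broadcast suffices.
    messages = {1: 0b1011, 2: 0b0110, 3: 0b1100}

    broadcast = messages[1] ^ messages[2] ^ messages[3]

    def recover(side_information, broadcast):
        """Cancel the known messages from the broadcast to get the missing one."""
        missing = broadcast
        for word in side_information.values():
            missing ^= word
        return missing

    receiver_a = {2: messages[2], 3: messages[3]}   # knows m2, m3; wants m1
    assert recover(receiver_a, broadcast) == messages[1]
    print(bin(recover(receiver_a, broadcast)))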

  5. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely ... transparency, neutrality, impartiality, effectiveness, accountability, and legality. The normative context of public administration, as expressed in codes, seems to ignore the New Public Management and Reinventing Government reform movements....

  6. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.
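
    As a generic illustration of "query by example" used for lightweight static analysis (an analogy only, using ordinary Python and its ast module, not the ERP-oriented tool described in the abstract), one can take an example snippet, extract the call pattern it contains, and flag every matching call in target code. This sketch requires Python 3.9+ for ast.unparse.

    # Query-by-example flavoured lightweight static analysis with Python's ast module.
    import ast

    EXAMPLE = "eval(user_input)"      # the example supplied by the analyst

    def call_name(node):
        return node.func.id if isinstance(node.func, ast.Name) else None

    def query_by_example(example_src, target_src):
        wanted = {call_name(n) for n in ast.walk(ast.parse(example_src))
                  if isinstance(n, ast.Call)}
        hits = []
        for node in ast.walk(ast.parse(target_src)):
            if isinstance(node, ast.Call) and call_name(node) in wanted:
                hits.append((node.lineno, ast.unparse(node)))
        return hits

    target = """
    def handler(req):
        value = eval(req.args["expr"])   # potential defect
        return str(value)
    """
    print(query_by_example(EXAMPLE, target))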

  7. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
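
    As a small taste of the noiseless-coding material the book covers, the sketch below computes the entropy of a memoryless source and the average length of a Huffman code for it; the example probabilities are made up, and for this dyadic source the two quantities coincide.

    # Entropy of a memoryless source versus the average length of a Huffman code.
    import heapq
    from math import log2

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    entropy = -sum(p * log2(p) for p in probs.values())

    def huffman(probs):
        """Return a prefix code (symbol -> bit string) built greedily."""
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next_id, merged))
            next_id += 1
        return heap[0][2]

    code = huffman(probs)
    avg_len = sum(probs[s] * len(w) for s, w in code.items())
    print(f"H = {entropy:.3f} bits, average Huffman code length = {avg_len:.3f} bits")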

  8. Universals and Specifics of Math Self-Concept, Math Self-Efficacy, and Math Anxiety across 41 PISA 2003 Participating Countries

    Science.gov (United States)

    Lee, Jihyun

    2009-01-01

    The overarching goal of the present study is to investigate the factorial structure of three closely related constructs: math self-concept, math self-efficacy, and math anxiety. The factorial structure consisting of three factors, each representing math self-concept, math self-efficacy, and math anxiety, is supported in all 41 countries employed…

  9. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant application. Education and training seminars and technology transfer were carried out for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  10. The general theory of convolutional codes

    Science.gov (United States)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.
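
    For readers meeting convolutional codes for the first time, the sketch below implements the textbook rate-1/2 feed-forward encoder with the (7, 5) octal generator pair and constraint length 3; it is a standard small example, not a construction specific to this article.

    # Rate-1/2 feed-forward convolutional encoder with generators (7, 5) in octal.
    G = (0b111, 0b101)          # g1(D) = 1 + D + D^2,  g2(D) = 1 + D^2

    def encode(bits, generators=G, memory=2):
        state = 0
        out = []
        for b in bits + [0] * memory:            # trailing zeros flush the register
            state = ((state << 1) | b) & 0b111   # shift the new bit into the register
            for g in generators:
                out.append(bin(state & g).count("1") % 2)   # parity of tapped bits
        return out

    print(encode([1, 0, 1, 1]))   # -> [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1]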

  11. The Coding Process and Its Challenges

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2010-02-01

    Full Text Available Coding is the core process in classic grounded theory methodology. It is through coding that the conceptual abstraction of data and its reintegration as theory takes place. There are two types of coding in a classic grounded theory study: substantive coding, which includes both open and selective coding procedures, and theoretical coding. In substantive coding, the researcher works with the data directly, fracturing and analysing it, initially through open coding for the emergence of a core category and related concepts and then subsequently through theoretical sampling and selective coding of data to theoretically saturate the core and related concepts. Theoretical saturation is achieved through constant comparison of incidents (indicators) in the data to elicit the properties and dimensions of each category (code). This constant comparing of incidents continues until the process yields the interchangeability of indicators, meaning that no new properties or dimensions are emerging from continued coding and comparison. At this point, the concepts have achieved theoretical saturation and the theorist shifts attention to exploring the emergent fit of potential theoretical codes that enable the conceptual integration of the core and related concepts to produce hypotheses that account for relationships between the concepts, thereby explaining the latent pattern of social behaviour that forms the basis of the emergent theory. The coding of data in grounded theory occurs in conjunction with analysis through a process of conceptual memoing, capturing the theorist’s ideation of the emerging theory. Memoing occurs initially at the substantive coding level and proceeds to higher levels of conceptual abstraction as coding proceeds to theoretical saturation and the theorist begins to explore conceptual reintegration through theoretical coding.

  12. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

    The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE Survey of Software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Agency/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications

  13. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term applicable to these techniques, often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the
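
    A classical waveform-coding building block in this area is companding before uniform quantization. The sketch below uses the continuous mu-law formula (the idea behind G.711 mu-law, not the piecewise-segment tables of the actual standard) on a made-up low-level signal.

    # Mu-law companding followed by uniform 8-bit quantization (illustrative only).
    import numpy as np

    MU = 255.0

    def compress(x):
        """Map samples in [-1, 1] through the mu-law characteristic."""
        return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

    def expand(y):
        """Inverse mu-law mapping."""
        return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

    t = np.linspace(0.0, 0.02, 160)                  # 20 ms at 8 kHz
    x = 0.05 * np.sin(2 * np.pi * 440 * t)           # a quiet test tone
    codes = np.round((compress(x) + 1.0) * 127.5)    # 8-bit codewords in 0..255
    x_hat = expand(codes / 127.5 - 1.0)
    print("max reconstruction error:", np.max(np.abs(x - x_hat)))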

  14. System Design Description for the TMAD Code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System

  15. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  16. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most widely recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally, the specific algorithm applied in fuel depletion calculations is outlined. (author)

  17. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is made up of mathematical models of neutron kinetics, power generation, heat transfer, core thermal-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)
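
    As a purely didactic illustration of the neutron-kinetics part of such a model suite, the sketch below integrates the point-kinetics equations with a single delayed-neutron group. It is emphatically not the Aztheca formulation, and the parameter values are illustrative assumptions.

    # Point kinetics with one delayed-neutron group, integrated with explicit Euler.
    BETA = 0.0065        # delayed-neutron fraction (illustrative)
    LAMBDA_GEN = 2.0e-5  # neutron generation time [s] (illustrative)
    DECAY = 0.08         # precursor decay constant [1/s] (illustrative)

    def power_after_step(rho, t_end=10.0, dt=1.0e-4):
        """Relative power after a constant reactivity step of size rho."""
        n = 1.0                                   # normalized neutron population
        c = BETA * n / (LAMBDA_GEN * DECAY)       # equilibrium precursor level
        for _ in range(int(t_end / dt)):
            dn = ((rho - BETA) / LAMBDA_GEN) * n + DECAY * c
            dc = (BETA / LAMBDA_GEN) * n - DECAY * c
            n += dn * dt
            c += dc * dt
        return n

    # A +10 cent step (rho = 0.1 * beta) gives a mild, stable power rise.
    print("relative power after 10 s:", power_after_step(0.1 * BETA))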

  18. Meaning and effect of a music concept designed specifically to promote general wellbeing and health for children with cancer receiving music therapy

    OpenAIRE

    Sanfi, Ilan

    2012-01-01

    This text consists of an extract of my master's thesis (Sanfi, 2007). It discusses a music concept designed to meet the bodily and psychosocial needs of children with cancer receiving chemotherapy. The theory part relates to the development of the specially designed music concept, while the empirical part relates to a supplementary pilot study in which a mixed-methods design (i.e. semi-structured interviews and questionnaires) is applied. The combination of methods is intended to explore...

  19. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense queer code can ... be understood as both an object and subject of study that intervenes in the world's 'becoming' and how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  20. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  1. Contribution to the study of {sup 233}U production with MOX-ThPu fuel in PWR reactor. Transition scenarios towards Th/{sup 233}U iso-generating concepts in thermal spectrum. Development of the MURE fuel evolution code; Contribution a l'etude de la production d'{sup 233}U en combustible MOX-ThPu en reacteur a eau sous pression. Scenarios de transition vers des concepts isogenerateurs Th/{sup 233}U en spectre thermique. Developpement du code MURE d'evolution du combustible

    Energy Technology Data Exchange (ETDEWEB)

    Michel-Sendis, F

    2006-12-15

    If nuclear power is to provide a significant fraction of the growing world energy demand, only through the breeding concept can the development of sustainable nuclear energy become a reality. The study of such a transition, from present-day nuclear technologies to future breeding concepts, is therefore pertinent. Among these future concepts, those using the thorium cycle Th/U-233 in a thermal neutron spectrum are of particular interest; molten-salt type thermal reactors would allow for breeding while requiring comparatively low initial inventories of U-233. The upstream production of U-233 can be obtained through the use of thorium-plutonium mixed oxide fuel in present-day light water reactors. This work presents, firstly, the development of the MURE evolution code system, a C++ object-oriented code that allows the study, through Monte Carlo (M.C.) simulation, of nuclear reactors and the evolution of their fuel under neutron irradiation. The M.C. methods are well suited for the study of any reactor, whether it be an existing reactor using a new kind of fuel or an entirely new concept; the simulation depends only on nuclear data. Exact and complex geometries can be simulated and continuous-energy particle transport is performed. MURE is an interface to MCNP, the well-known and validated transport code, and allows, among other functionalities, the simulation of constant-power and constant-reactivity evolutions. Secondly, the study of MOX ThPu fuel in a conventional pressurized water reactor (PWR) is presented; it explores different plutonium concentrations and isotopic qualities in order to evaluate their safety characteristics. Simulation of their evolution allows us to quantify the production of U-233 at the end of burnup. Finally, different French scenarios validating a possible transition towards a fleet of thermal Th/U-233 breeders are presented. In these scenarios, U-233 is produced in ThPu MOX-fuelled light water reactors. (author)

  3. Astrophysical Concepts

    CERN Document Server

    Harwit, Martin

    2006-01-01

    This classic text, aimed at senior undergraduates and beginning graduate students in physics and astronomy, presents a wide range of astrophysical concepts in sufficient depth to give the reader a quantitative understanding of the subject. Emphasizing physical concepts, the book outlines cosmic events but does not portray them in detail: it provides a series of astrophysical sketches. For this fourth edition, nearly every part of the text has been reconsidered and rewritten, new sections have been added to cover recent developments, and others have been extensively revised and brought up to date. The book begins with an outline of the scope of modern astrophysics and enumerates some of the outstanding problems faced in the field today. The basic physics needed to tackle these questions is developed in the next few chapters using specific astronomical processes as examples. The second half of the book enlarges on these topics and shows how we can obtain quantitative insight into the structure and evolution of...

  4. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  5. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the Municipality of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association ... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor of learning technology, interaction design ..., design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period from November 2016 to May 2017...

  6. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  7. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm. The functions of the algorithm's FORTRAN subroutines and variables are outlined

  8. Network Coding

    Indian Academy of Sciences (India)

    K V Rashmi, Nihar B Shah and P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp. 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  9. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each state of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids
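
    The flavour of analog Monte Carlo transport is easy to convey with a toy calculation (a sketch of the general sampling idea only, not MCNP, with an illustrative cross-section value): sample exponential free paths to estimate the probability that a photon crosses a purely absorbing slab uncollided, and compare with the analytic answer.

    # Toy analog Monte Carlo: uncollided transmission through an absorbing slab.
    import math
    import random

    SIGMA_T = 0.2       # total macroscopic cross section [1/cm] (illustrative)
    THICKNESS = 10.0    # slab thickness [cm]

    def transmission(histories=100_000, seed=1):
        rng = random.Random(seed)
        crossed = 0
        for _ in range(histories):
            # Distance to first collision sampled from an exponential distribution.
            path = -math.log(1.0 - rng.random()) / SIGMA_T
            if path > THICKNESS:
                crossed += 1
        return crossed / histories

    print(f"Monte Carlo: {transmission():.4f}   "
          f"analytic exp(-Sigma*t): {math.exp(-SIGMA_T * THICKNESS):.4f}")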

  10. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes – The Sipser–Spielman Construction. Priti Shankar. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  11. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis for the operating PWRs as well as the PWRs under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as for non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems, components and operator actions, and the transients caused by such malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analyses for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  12. Evaluating and integrating corporate social responsibility standards: Implications for CSR concepts

    Directory of Open Access Journals (Sweden)

    Markus Stiglbauer

    2012-03-01

    Full Text Available Standards play a major role when concepts of corporate social responsibility (CSR) ought to be implemented and corporate social performance (CSP) ought to be assessed. Ethical reasoning and stakeholders' expectations help to measure companies' intentions to implement CSR standards and to measure their efficiency. Considering the different CSR standards (company standards, industry standards, multi-stakeholder standards and independent standards) companies may implement, we categorize and evaluate those standards and give advice on which opportunities, but also threats, may arise for companies when implementing such codes within firm-specific CSR concepts. We suggest combining different standards and complementing them with firm-specific codes of conduct.

  13. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made of pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflicts with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report

  14. Adaptive Evolution Coupled with Retrotransposon Exaptation Allowed for the Generation of a Human-Protein-Specific Coding Gene That Promotes Cancer Cell Proliferation and Metastasis in Both Haematological Malignancies and Solid Tumours: The Extraordinary Case of MYEOV Gene

    Directory of Open Access Journals (Sweden)

    Spyros I. Papamichos

    2015-01-01

    Full Text Available The incidence of cancer in humans is high compared to that in chimpanzees. However, previous analyses have documented that numerous human cancer-related genes are highly conserved in the chimpanzee. To date, whether the human genome includes species-specific cancer-related genes that could potentially contribute to a higher cancer susceptibility remains obscure. This study focuses on MYEOV, an oncogene encoding two protein isoforms, reported as causally involved in promoting cancer cell proliferation and metastasis in both haematological malignancies and solid tumours. First, we document, via stringent in silico analysis, that MYEOV arose de novo in Catarrhini. We show that the MYEOV short-isoform start codon was evolutionarily acquired after Catarrhini/Platyrrhini divergence. Throughout the course of Catarrhini evolution MYEOV acquired a gradually elongated translatable open reading frame (ORF), a gradually shortened translation-regulatory upstream ORF, and alternatively spliced mRNA variants. A point mutation introduced in humans allowed for the acquisition of the MYEOV long-isoform start codon. Second, we demonstrate the precious impact of exonized transposable elements on the creation of the MYEOV gene structure. Third, we highlight that the initial part of the MYEOV long-isoform coding DNA sequence was under positive selection pressure during Catarrhini evolution. MYEOV represents a Primate Orphan Gene that acquired, via ORF expansion, a human-protein-specific coding potential.
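
    The notions the abstract relies on (start codons, translatable ORFs, upstream ORFs) can be made concrete with a toy open-reading-frame scanner; the sketch below is a didactic illustration only, not the analysis pipeline of the study, and the input sequence is invented.

    # Toy forward-strand ORF scanner: report ATG...stop stretches in each frame.
    START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

    def find_orfs(seq, min_codons=3):
        """Yield (start, end, frame) for ATG...stop ORFs on the forward strand."""
        seq = seq.upper()
        for frame in range(3):
            start = None
            for i in range(frame, len(seq) - 2, 3):
                codon = seq[i:i + 3]
                if start is None and codon == START:
                    start = i
                elif start is not None and codon in STOPS:
                    if (i + 3 - start) // 3 >= min_codons:
                        yield start, i + 3, frame
                    start = None

    dna = "CCATGGCTTGGAAATAGGGATGAAACCCGGGTTTTAAAC"   # made-up sequence
    for start, end, frame in find_orfs(dna):
        print(f"frame {frame}: {dna[start:end]}")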

  15. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  16. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely, the finite field Fourier transform and the Euclidean algorithm f...
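
    As a minimal, hedged illustration of the linear error-correcting block codes the book introduces (a standard textbook construction, not material from the book itself), the following Python sketch encodes four data bits with a systematic Hamming(7,4) code and corrects a single bit error by syndrome decoding.

```python
# Hamming(7,4) over GF(2): encode 4 data bits into 7, correct single-bit errors.
import numpy as np

# Systematic generator matrix G = [I4 | P] and parity-check matrix H = [P^T | I3].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data_bits):
    return (np.array(data_bits) @ G) % 2

def correct(word):
    syndrome = (H @ word) % 2
    if syndrome.any():                       # non-zero syndrome: flip the bit whose H column matches
        for col in range(H.shape[1]):
            if np.array_equal(H[:, col], syndrome):
                word[col] ^= 1
                break
    return word

codeword = encode([1, 0, 1, 1])
received = codeword.copy()
received[2] ^= 1                             # inject a single bit error
assert np.array_equal(correct(received), codeword)
print("corrected codeword:", received)
```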

  17. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Smith, A.B. [ed.]; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  18. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
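
    As an illustrative sketch only (a one-group, 1-D slab analogue of the kind of diffusion/criticality calculation PANDA performs; PANDA itself is two-group and models xenon, enthalpy, Doppler and depletion feedback, none of which appears here, and the cross-section values below are arbitrary placeholders), the following Python fragment solves the finite-difference eigenvalue problem by power iteration.

```python
# One-group, 1-D slab diffusion: solve A*phi = (1/k) * nu_sigma_f * phi by power iteration.
import numpy as np

n, width = 50, 100.0                        # interior mesh points, slab width [cm]
h = width / (n + 1)
D, sigma_a, nu_sigma_f = 1.0, 0.02, 0.025   # placeholder one-group constants

# Finite-difference loss operator (leakage + absorption), zero-flux boundaries.
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 2.0 * D / h**2 + sigma_a
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < n - 1:
        A[i, i + 1] = -D / h**2

flux, k = np.ones(n), 1.0
for _ in range(200):                        # outer (power) iterations
    fission_old = nu_sigma_f * flux
    flux = np.linalg.solve(A, fission_old / k)
    k *= (nu_sigma_f * flux).sum() / fission_old.sum()
    flux /= flux.max()                      # keep the flux shape normalised

# Analytic bare-slab estimate: k = nu_sigma_f / (sigma_a + D*(pi/width)**2) ~ 1.19
print(f"k-effective ~ {k:.4f}")
```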

  19. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr]

  20. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.

  1. Social Comparison and Big-Fish-Little-Pond Effects on Self-Concept and Other Self-Belief Constructs: Role of Generalized and Specific Others

    Science.gov (United States)

    Marsh, Herbert W.; Trautwein, Ulrich; Ludtke, Oliver; Koller, Olaf

    2008-01-01

    Two studies integrate the big-fish-little-pond effect (BFLPE; negative effects of class-average achievement on academic self-concept, ASC), which is based upon educational psychological research, with related social psychological research that is based on social comparison theory. Critical distinctions are the nature of the social comparison…

  2. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that the authors will be a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  3. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of an LT code, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of a chaotic sequence in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
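
    A hedged sketch of the mechanism described above (not the authors' implementation: a single Kent map and the ideal soliton distribution are used purely for illustration), driving an LT-style encoder's degree and neighbour choices from a chaotic stream instead of a linear congruential generator:

```python
# LT-style encoding symbol generation driven by a Kent (skewed tent) chaotic map.
import random

def kent_map_stream(x0=0.3, a=0.7):
    """Yield chaotic values in (0, 1) from the Kent map."""
    x = x0
    while True:
        x = x / a if x < a else (1.0 - x) / (1.0 - a)
        yield x

def ideal_soliton_cdf(k):
    """Cumulative ideal soliton distribution over degrees 1..k."""
    probs = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    cdf, total = [], 0.0
    for p in probs:
        total += p
        cdf.append(total)
    return cdf

def lt_encode_symbol(source_blocks, chaos):
    k = len(source_blocks)
    cdf = ideal_soliton_cdf(k)
    u = next(chaos)
    degree = next((d for d, c in enumerate(cdf, start=1) if u <= c), k)
    neighbours = set()
    while len(neighbours) < degree:         # neighbour choices also come from the chaotic stream
        neighbours.add(int(next(chaos) * k) % k)
    encoded = 0
    for i in neighbours:                    # encoding symbol = XOR of the chosen source blocks
        encoded ^= source_blocks[i]
    return sorted(neighbours), encoded

chaos = kent_map_stream()
blocks = [random.getrandbits(8) for _ in range(10)]   # ten one-byte source blocks
print(lt_encode_symbol(blocks, chaos))
```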

  4. The nuclear codes and guidelines

    International Nuclear Information System (INIS)

    Sonter, M.

    1984-01-01

    This paper considers problems faced by the mining industry when implementing the nuclear codes of practice. Errors of interpretation are likely. A major criticism is that the guidelines to the codes must be seen as recommendations only. They are not regulations. Specific clauses in the guidelines are criticised

  5. Applying Physical-Layer Network Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Liew SoungChang

    2010-01-01

    Full Text Available A main distinguishing feature of a wireless network compared with a wired network is its broadcast nature, in which the signal transmitted by a node may reach several other nodes, and a node may receive signals from several other nodes, simultaneously. Rather than a blessing, this feature is treated more as an interference-inducing nuisance in most wireless networks today (e.g., IEEE 802.11). This paper shows that the concept of network coding can be applied at the physical layer to turn the broadcast property into a capacity-boosting advantage in wireless ad hoc networks. Specifically, we propose a physical-layer network coding (PNC) scheme to coordinate transmissions among nodes. In contrast to "straightforward" network coding which performs coding arithmetic on digital bit streams after they have been received, PNC makes use of the additive nature of simultaneously arriving electromagnetic (EM) waves for equivalent coding operation. And in doing so, PNC can potentially achieve 100% and 50% throughput increases compared with traditional transmission and straightforward network coding, respectively, in 1D regular linear networks with multiple random flows. The throughput improvements are even larger in 2D regular networks: 200% and 100%, respectively.
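
    A minimal, hedged illustration of the relay mapping described above (noise-free BPSK with perfect symbol synchronisation, an idealisation of the actual scheme): the relay maps the superimposed signal directly to the XOR of the two source bits instead of decoding each bit separately.

```python
# Physical-layer network coding at a relay: superimposed BPSK -> XOR of the source bits.
def bpsk(bit):
    return 1 if bit == 1 else -1

def relay_pnc_map(superimposed):
    # |s_A + s_B| == 2 -> the source bits were equal  -> XOR = 0
    # |s_A + s_B| == 0 -> the source bits differed    -> XOR = 1
    return 0 if abs(superimposed) == 2 else 1

for bit_a in (0, 1):
    for bit_b in (0, 1):
        superimposed = bpsk(bit_a) + bpsk(bit_b)   # additive EM waves at the relay
        xor_bit = relay_pnc_map(superimposed)
        assert xor_bit == bit_a ^ bit_b
        # each end node recovers the partner's bit from the broadcast XOR bit:
        assert xor_bit ^ bit_a == bit_b and xor_bit ^ bit_b == bit_a
print("relay PNC mapping reproduces the XOR of the two source bits")
```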

  6. Radiological impact assessment in Malaysia using RESRAD computer code

    International Nuclear Information System (INIS)

    Syed Hakimi Sakuma Syed Ahmad; Khairuddin Mohamad Kontol; Razali Hamzah

    1999-01-01

    A Radiological Impact Assessment (RIA) can be conducted in Malaysia by using the RESRAD computer code developed by Argonne National Laboratory, U.S.A. The code can perform analyses to derive site-specific guidelines for allowable residual concentrations of radionuclides in soil. Concepts of the RIA in the context of waste management concerns in Malaysia, some regulatory information, and the status of data collection are presented. Appropriate use scenarios and site-specific parameters are used as much as possible so that the assessment is realistic and will reasonably ensure that individual dose limits and/or constraints are met. A case study has been conducted to fulfil Atomic Energy Licensing Board (AELB) requirements, under which, for disposal purposes, the operator is required to carry out a radiological impact assessment for all proposed disposals. This is to demonstrate that no member of the public will be exposed to more than 1 mSv/year from all activities. Results obtained from the analyses show that the RESRAD computer code is able to calculate doses, risks, and guideline values. Sensitivity analysis performed with the computer code shows that the parameters used as input are justified, which improves the confidence of the public and the AELB in the results of the analysis. The computer code can also be used to conduct an initial screening assessment in order to determine a proper disposal site. (Author)

  7. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-01-01

    and convert it into an optimized CUDA kernel with user directives in a configuration file for guiding the compiler. RTCUDA also allows transparent invocation of the most optimized external math libraries like cuSparse and cuBLAS enabling efficient design

  8. Concept of 'bad death'

    Directory of Open Access Journals (Sweden)

    Marija Vučković

    2016-02-01

    Full Text Available Following previous research on the linguistic concept of a 'bad death', whose lexical expression is the word family of the verb ginuti, in this paper I focus my attention on the relationship between the language conceptualization of a 'bad death' and the representation of a 'bad death' in traditional and contemporary culture. A diachronically based language corpus makes it possible to trace the changes in the referential frame and use of the verb ginuti and its derivatives. In traditional culture a 'bad death' is marked in the action code by an irregular way of burial and by beliefs in demons stemming from the 'impure dead'. In the paper I explore the degree of synonymy of the symbols of all three codes: the verbal code, the action code and the code of beliefs. In contemporary culture the lack of individual control and choice is considered to be the key element of the concept of a 'bad death'. This change of conceptual content manifests itself in the use of its lexical expressions.

  9. Linguistic coding deficits in foreign language learners.

    Science.gov (United States)

    Sparks, R; Ganschow, L; Pohlman, J

    1989-01-01

    As increasing numbers of colleges and universities require a foreign language for graduation in at least one of their degree programs, reports of students with difficulties in learning a second language are multiplying. Until recently, little research has been conducted to identify the nature of this problem. Recent attempts by the authors have focused upon subtle but ongoing language difficulties in these individuals as the source of their struggle to learn a foreign language. The present paper attempts to expand upon this concept by outlining a theoretical framework based upon a linguistic coding model that hypothesizes deficits in the processing of phonological, syntactic, and/or semantic information. Traditional psychoeducational assessment batteries of standardized intelligence and achievement tests generally are not sensitive to these linguistic coding deficits unless closely analyzed or, more often, used in conjunction with a more comprehensive language assessment battery. Students who have been waived from a foreign language requirement and their proposed type(s) of linguistic coding deficits are profiled. Tentative conclusions about the nature of these foreign language learning deficits are presented along with specific suggestions for tests to be used in psychoeducational evaluations.

  10. Nevada Administrative Code for Special Education Programs.

    Science.gov (United States)

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  11. Coded ultrasonic remote control without batteries

    International Nuclear Information System (INIS)

    Gerhardy, C; Burlage, K; Schomburg, W K

    2009-01-01

    A concept for battery-less remote controls has been developed based on mechanically actuated beams and micro whistles generating ultrasound signals. These signals need to be frequency or time coded to increase the number of signals which can be distinguished from each other and environmental ultrasound. Several designs for generating coded ultrasonic signals have been investigated

  12. Concepts of formal concept analysis

    Science.gov (United States)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of what a concept is. Formal concept analysis explicitly formalizes the extension and intension of a concept, and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of the conceptual processing of data and knowledge, namely, the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
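
    For illustration, a formal context is simply a binary object-attribute table, and a formal concept is a pair (extent, intent) that is closed under the two derivation operators. The following sketch (with an assumed toy context, not data from the article) enumerates all formal concepts of a small context by brute force:

```python
# Enumerate the formal concepts (extent, intent) of a small object x attribute context.
from itertools import chain, combinations

objects = {"dog": {"mammal", "pet"},
           "cat": {"mammal", "pet"},
           "eagle": {"bird", "predator"},
           "parrot": {"bird", "pet"}}
attributes = set().union(*objects.values())

def extent(attrs):            # objects having all the given attributes
    return {o for o, a in objects.items() if attrs <= a}

def intent(objs):             # attributes shared by all the given objects
    return set.intersection(*(objects[o] for o in objs)) if objs else set(attributes)

concepts = set()
for subset in chain.from_iterable(combinations(attributes, r)
                                  for r in range(len(attributes) + 1)):
    e = extent(set(subset))
    i = intent(e)             # closure: the intent of the extent
    concepts.add((frozenset(e), frozenset(i)))

for e, i in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(e), sorted(i))
```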

  13. DNA: Polymer and molecular code

    Science.gov (United States)

    Shivashankar, G. V.

    1999-10-01

    gene expression a prime example of a biological code. We developed a novel method of making DNA micro- arrays, the so-called DNA chip. Using the optical tweezer concept, we were able to pattern biomolecules on a solid substrate, developing a new type of sub-micron laser lithography. A laser beam is focused onto a thin gold film on a glass substrate. Laser ablation of gold results in local aggregation of nanometer scale beads conjugated with small DNA oligonucleotides, with sub-micron resolution. This leads to specific detection of cDNA and RNA molecules. We built a simple micro-array fabrication and detection in the laboratory, based on this method, to probe addressable pools (genes, proteins or antibodies). We have lately used molecular beacons (single stranded DNA with a stem-loop structure containing a fluorophore and quencher), for the direct detection of unlabelled mRNA. As a first step towards a study of the dynamics of the biological code, we have begun to examine the patterns of gene expression during virus (T7 phage) infection of E-coli bacteria.

  14. Analysis specifications for the CC3 biosphere model BIOTRAC

    International Nuclear Information System (INIS)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.

    1994-12-01

    AECL Research is assessing a concept for disposing of Canada's nuclear fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system to take into account parameter variation. For the postclosure assessment, the system model, CC3 (Canadian Concept, generation 3), was developed to describe a hypothetical disposal system that includes a disposal vault, the local geosphere and the biosphere in the vicinity of any discharge zones. BIOTRAC (BIOsphere TRansport And Consequences) is the biosphere model in the CC3 system model. The specifications for BIOTRAC, which were developed over a period of seven years, were subjected to numerous walkthrough examinations by the Biosphere Model Working Group to ensure that the intent of the model developers would be correctly specified for transformation into FORTRAN code. The FORTRAN version of BIOTRAC was written from interim versions of these specifications. Improvements to the code are based on revised versions of these specifications. The specifications consist of a data dictionary; sets of synopses, data flow diagrams and mini specs for the component models of BIOTRAC (surface water, soil, atmosphere, and food chain and dose); and supporting calculations (interface to the geosphere, consequences, and mass balance). (author). 20 refs., tabs., figs

  15. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for the automation of routine work in the department of radiology
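
    A hedged sketch of the two-stage lookup described above. The dictionary contents are hypothetical placeholders (only the final code '131.3661' is taken from the abstract); the program's actual FoxBASE dictionary files are not reproduced here.

```python
# Two-stage ACR-style lookup: organ code first, then a pathology table keyed by its first digit.
ORGAN_CODES = {
    "131": "cervical spine",        # hypothetical organ entries
    "51":  "chest",
}
PATHOLOGY_TABLES = {                # one pathology table per leading digit of the organ code
    "1": {"3661": "fracture, compression"},   # hypothetical pathology entries
    "5": {"211":  "pneumonia"},
}

def acr_code(organ_code, pathology_code):
    if organ_code not in ORGAN_CODES:
        raise KeyError(f"unknown organ code {organ_code}")
    table = PATHOLOGY_TABLES[organ_code[0]]   # chosen by the first digit of the organ code
    if pathology_code not in table:
        raise KeyError(f"unknown pathology code {pathology_code}")
    return f"{organ_code}.{pathology_code}"

print(acr_code("131", "3661"))      # -> "131.3661", the example given in the abstract
```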

  16. EOSCUBE: A Constraint Database System for High-Level Specification and Efficient Generation of EOSDIS Products. Phase 1; Proof-of-Concept

    Science.gov (United States)

    Brodsky, Alexander; Segal, Victor E.

    1999-01-01

    The EOSCUBE constraint database system is designed to be a software productivity tool for high-level specification and efficient generation of EOSDIS and other scientific products. These products are typically derived from large volumes of multidimensional data which are collected via a range of scientific instruments.

  17. CONCEPT-5 user's manual

    International Nuclear Information System (INIS)

    Hudson, C.R. II.

    1979-01-01

    The CONCEPT computer code package was developed to provide conceptual capital cost estimates for nuclear-fueled and fossil-fired power plants. Cost estimates can be made as a function of plant type, size, location, and date of initial operation. The output includes a detailed breakdown of the estimate into direct and indirect costs similar to the accounting system described in document NUS--531. Cost models are currently provided in CONCEPT 5 for single- and multi-unit pressurized-water reactors, boiling-water reactors, and coal-fired plants with and without flue gas desulfurization equipment

  18. KALIMER design concept report

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyu; Kim, Young Cheol; Kim, Young In; Kim, Young Gyun; Kim, Eui Kwang; Song, Hoon; Chung, Hyun Tai; Hwang, Woan; Nam, Cheol; Sub, Sim Yoon; Kim, Yeon Sik; Whan, Wim Myung; Min, Byung Tae; Yoo, Bong; Lee, Jae Han; Lee, Hyeong Yeon; Kim, Jong Bum; Koo, Gyeong Hoi; Ham, Chang Shik; Kwon, Kee Choon; Kim, Jung Taek; Park, Jae Chang; Lee, Jung Woon; Lee, Yong Hee; Kim, Chang Hwoi; Sim, Bong Shick; Hahn, Do Hee; Choi, Jong Hyeun; Kwon, Sang Woon

    1997-07-01

    KAERI is working on the development of KALIMER, and work is being done on methodology development, experimental facility set-up and design concept development. The development target of KALIMER has been set to make KALIMER safer, more economic, more resistant to nuclear proliferation, and of less impact on the environment. To achieve this target, a study has been made for setting up the design concept of KALIMER, including the assessment of various possible design alternatives. This report presents the results of the KALIMER concept study and describes the design concept of KALIMER. The developed design concept is to be used as the starting point of the next development phase of conceptual design, and the concept will be refined and modified in the conceptual design phase. The scope of the work has been set as the NSSS and essential BOP systems. For systems, the NSSS and functionally related major BOP are covered. Sizing and the specification of conceptual structure are covered for major equipment. Equipment and piping are arranged for the parts where the arrangement is critical in fulfilling the aforesaid intention of setting up the KALIMER design concept. This report consists of 10 chapters. Chapter 2 gives the top level design requirements of KALIMER and serves as the basis of the KALIMER design concept development. Chapter 3 summarizes the KALIMER concept and describes the general design features. The remaining chapters are for specific systems. (author). 29 tabs., 37 figs.

  19. KALIMER design concept report

    International Nuclear Information System (INIS)

    Park, Chang Kyu; Kim, Young Cheol; Kim, Young In; Kim, Young Gyun; Kim, Eui Kwang; Song, Hoon; Chung, Hyun Tai; Hwang, Woan; Nam, Cheol; Sim Yoon Sub; Kim, Yeon Sik; Wim Myung Whan; Min, Byung Tae; Yoo, Bong; Lee, Jae Han; Lee, Hyeong Yeon; Kim, Jong Bum; Koo, Gyeong Hoi; Ham, Chang Shik; Kwon, Kee Choon; Kim, Jung Taek; Park, Jae Chang; Lee, Jung Woon; Lee, Yong Hee; Kim, Chang Hwoi; Sim, Bong Shick; Hahn, Do Hee; Choi, Jong Hyeun; Kwon, Sang Woon.

    1997-07-01

    KAERI is working on the development of KALIMER, and work is being done on methodology development, experimental facility set-up and design concept development. The development target of KALIMER has been set to make KALIMER safer, more economic, more resistant to nuclear proliferation, and of less impact on the environment. To achieve this target, a study has been made for setting up the design concept of KALIMER, including the assessment of various possible design alternatives. This report presents the results of the KALIMER concept study and describes the design concept of KALIMER. The developed design concept is to be used as the starting point of the next development phase of conceptual design, and the concept will be refined and modified in the conceptual design phase. The scope of the work has been set as the NSSS and essential BOP systems. For systems, the NSSS and functionally related major BOP are covered. Sizing and the specification of conceptual structure are covered for major equipment. Equipment and piping are arranged for the parts where the arrangement is critical in fulfilling the aforesaid intention of setting up the KALIMER design concept. This report consists of 10 chapters. Chapter 2 gives the top level design requirements of KALIMER and serves as the basis of the KALIMER design concept development. Chapter 3 summarizes the KALIMER concept and describes the general design features. The remaining chapters are for specific systems. (author). 29 tabs., 37 figs

  20. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
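
    For context, the static Shannon code on which the dynamic algorithm builds assigns each symbol a codeword of length ceil(-log2 p), taken from the binary expansion of the cumulative probability of the preceding (more probable) symbols. A minimal sketch of that static construction only (not the paper's dynamic algorithm):

```python
# Static Shannon coding: prefix-free codewords from cumulative probabilities.
import math

def shannon_code(probabilities):
    """Assign each symbol a codeword of length ceil(-log2 p), with the bits taken
    from the binary expansion of the cumulative probability."""
    symbols = sorted(probabilities.items(), key=lambda kv: -kv[1])
    codewords, cumulative = {}, 0.0
    for symbol, p in symbols:
        length = math.ceil(-math.log2(p))
        bits, frac = [], cumulative
        for _ in range(length):
            frac *= 2
            bit, frac = divmod(frac, 1)
            bits.append(str(int(bit)))
        codewords[symbol] = "".join(bits)
        cumulative += p
    return codewords

code = shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)   # -> {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```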

  1. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
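
    As a minimal, hedged example of the codes the book treats (a standard textbook construction, not code from the book): a rate-1/2, constraint-length-3 convolutional encoder with generator polynomials (7, 5) in octal, terminated with a zero tail.

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal.
def conv_encode(bits, generators=(0b111, 0b101), memory=2):
    state, out = 0, []
    for b in bits + [0] * memory:            # zero tail flushes the encoder back to state 0
        register = (b << memory) | state     # [current bit, previous bit, bit before that]
        for g in generators:                 # one output bit per generator polynomial
            out.append(bin(register & g).count("1") % 2)
        state = register >> 1                # shift: drop the oldest bit
    return out

print(conv_encode([1, 0, 1, 1]))             # -> [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1]
```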

  2. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for coding theory than codes over classical finite fields.

  3. NEW DATA ON THE SAFETY OF NONSTEROIDAL ANTI-INFLAMMATORY DRUGS: THE CONCEPT OF THE HIGH CLASS-SPECIFIC CARDIOVASCULAR RISK OF SELECTIVE CYCLOOXYGENASE-2 INHIBITORS IS OUTDATED

    Directory of Open Access Journals (Sweden)

    A. E. Karateev

    2017-01-01

    Full Text Available The results of the PRECISION trial were published in late 2016. During this trial, a total of 24,081 patients at high cardiovascular risk took celecoxib 200-400 mg/day, naproxen 750–1000 mg/day or ibuprofen 1800–2400 mg/day for more than 1.5 years (20.3±16.0 months). The findings show that the frequency of vascular catastrophes (death, nonfatal myocardial infarction, and stroke) in patients receiving celecoxib was not higher than the frequency of similar complications in those taking the control drugs. At the same time, celecoxib demonstrated a statistically significant advantage in reducing the risk of serious gastrointestinal complications. The new evidence refutes the concept of a high cardiovascular risk common to all coxibs and confirms the provisions of the national guidelines for the rational use of nonsteroidal anti-inflammatory drugs (NSAIDs), which were published in 2015. This review presents recent data on the risk of NSAID-related complications, including a brief description of the design and results of the PRECISION trial.

  4. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    output includes a plot of the MAAP calculation and the plant data. For the large integral experiments, a major part, but not all, of the MAAP code is needed. These use an experiment-specific benchmark routine that includes all of the information and boundary conditions for performing the calculation, as well as the information on which parts of MAAP are unnecessary and can be 'bypassed'. Lastly, the separate effects tests only require a few MAAP routines. These are exercised through their own specific benchmark routine that includes the experiment-specific information and boundary conditions. This benchmark routine calls the appropriate MAAP routines from the source code, performs the calculations, including integration where necessary, and provides the comparison between the MAAP calculation and the experimental observations. (author)

  5. Structured Review of Code Clone Literature

    NARCIS (Netherlands)

    Hordijk, W.T.B.; Ponisio, Laura; Wieringa, Roelf J.

    2008-01-01

    This report presents the results of a structured review of code clone literature. The aim of the review is to assemble a conceptual model of clone-related concepts which helps us to reason about clones. This conceptual model unifies clone concepts from a wide range of literature, so that findings

  6. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  7. To Go or Not to Go: A Proof of Concept Study Testing Food-Specific Inhibition Training for Women with Eating and Weight Disorders.

    Science.gov (United States)

    Turton, Robert; Nazar, Bruno P; Burgess, Emilee E; Lawrence, Natalia S; Cardi, Valentina; Treasure, Janet; Hirsch, Colette R

    2018-01-01

    Inefficient food-specific inhibitory control is a potential mechanism that underlies binge eating in bulimia nervosa and binge eating disorder. Go/no-go training tools have been developed to increase inhibitory control over eating impulses. Using a within-subjects design, this study examined whether one session of food-specific go/no-go training, versus general inhibitory control training, modifies eating behaviour. The primary outcome measure was food consumption on a taste test following each training session. Women with bulimia nervosa and binge eating disorder had small non-significant reductions in high-calorie food consumption on the taste test following the food-specific compared with the general training. There were no effects on eating disorder symptomatic behaviour (i.e. binge eating/purging) in the 24 h post-training. The training task was found to be acceptable by the clinical groups. More research is needed with larger sample sizes to determine the effectiveness of this training approach for clinical populations. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  8. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage....... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  9. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  10. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  11. A proof-of-concept model for the identification of the key events in the infection process with specific reference to Pseudomonas aeruginosa in corneal infections

    Directory of Open Access Journals (Sweden)

    Ilias Soumpasis

    2015-11-01

    Full Text Available Background: It is a common medical practice to characterise an infection based on the causative agent and to adopt therapeutic and prevention strategies targeting the agent itself. However, from an epidemiological perspective, exposure to a microbe can be harmless to a host as a result of low-level exposure or due to host immune response, with opportunistic infection only occurring as a result of changes in the host, pathogen, or surrounding environment. Methods: We have attempted to review systematically the key host, pathogen, and environmental factors that may significantly impact clinical outcomes of exposure to a pathogen, using Pseudomonas aeruginosa eye infection as a case study. Results and discussion: Extended contact lens wearing and compromised hygiene may predispose users to microbial keratitis, which can be a severe and vision-threatening infection. P. aeruginosa has a wide array of virulence-associated genes and sensing systems to initiate and maintain cell populations at the corneal surface and beyond. We have adapted the well-known concept of the epidemiological triangle in combination with the classic risk assessment framework (hazard identification, characterisation, and exposure to develop a conceptual pathway-based model that demonstrates the overlapping relationships between the host, the pathogen, and the environment; and to illustrate the key events in P. aeruginosa eye infection. Conclusion: This strategy differs from traditional approaches that consider potential risk factors in isolation, and hopefully will aid the identification of data and models to inform preventive and therapeutic measures in addition to risk assessment. Furthermore, this may facilitate the identification of knowledge gaps to direct research in areas of greatest impact to avert or mitigate adverse outcomes of infection.

  12. Proof of Concept Study for the Design, Manufacturing, and Testing of a Patient-Specific Shape Memory Device for Treatment of Unicoronal Craniosynostosis.

    Science.gov (United States)

    Borghi, Alessandro; Rodgers, Will; Schievano, Silvia; Ponniah, Allan; Jeelani, Owase; Dunaway, David

    2018-01-01

    Treatment of unicoronal craniosynostosis is a surgically challenging problem, due to the involvement of the coronal suture and cranial base, with complex asymmetries of the calvarium and orbit. Several techniques for correction have been described, including surgical bony remodeling, early strip craniotomy with orthotic helmet remodeling, and distraction. Current distraction devices provide unidirectional forces and have had very limited success. Nitinol is a shape memory alloy that can be programmed to the shape of a patient-specific anatomy by means of thermal treatment. In this work, a methodology to produce a nitinol patient-specific distractor is presented: computed tomography images of a 16-month-old patient with unicoronal craniosynostosis were processed to create a 3-dimensional model of his skull and define the ideal shape postsurgery. A mesh was produced from a nitinol sheet, formed to the ideal skull shape and heat treated to be malleable at room temperature. The mesh was afterward deformed to be attached to a rapid-prototyped plastic skull, a replica of the patient's initial anatomy. The mesh/skull construct was placed in hot water to activate the mesh's shape memory property: the deformed plastic skull was computed tomography scanned for comparison of its shape with the initial anatomy and with the desired shape, showing that the nitinol mesh had been able to distract the plastic skull to a shape close to the desired one. The shape-memory properties of nitinol allow for the design and production of patient-specific devices able to deliver complex, preprogrammable shape changes.

  13. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(M N log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
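
    The efficiency claim rests on the convolution theorem: applying the dictionary operator in the frequency domain costs one FFT per filter plus elementwise products, rather than explicit convolutions. The sketch below merely verifies that identity on random data; it illustrates the enabling idea, not the full ADMM solver.

```python
# Verify: sum_m d_m (*) x_m by direct circular convolution equals the FFT-domain
# evaluation, the building block of frequency-domain ADMM solvers for sparse coding.
import numpy as np

rng = np.random.default_rng(1)
N, M, taps = 256, 8, 16                    # signal length, number of filters, filter length
filters = rng.normal(size=(M, taps))       # dictionary filters d_m
coeffs = rng.normal(size=(M, N))           # coefficient maps x_m

# Direct circular convolution, O(M * N * taps).
direct = np.zeros(N)
for d, x in zip(filters, coeffs):
    for k, tap in enumerate(d):
        direct += tap * np.roll(x, k)

# Frequency-domain evaluation, O(M * N log N): one FFT per filter/map.
D_hat = np.fft.fft(np.pad(filters, ((0, 0), (0, N - taps))), axis=1)
X_hat = np.fft.fft(coeffs, axis=1)
via_fft = np.real(np.fft.ifft((D_hat * X_hat).sum(axis=0)))

assert np.allclose(direct, via_fft)
print("FFT-domain dictionary application matches direct circular convolution")
```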

  14. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  15. Effects of a food-specific inhibition training in individuals with binge eating disorder-findings from a randomized controlled proof-of-concept study.

    Science.gov (United States)

    Giel, Katrin Elisabeth; Speer, Eva; Schag, Kathrin; Leehr, Elisabeth Johanna; Zipfel, Stephan

    2017-06-01

    Impulsivity might contribute to the development and maintenance of obesity and eating disorders. Patients suffering from binge eating disorder (BED) show an impulsive eating pattern characterized by regular binge eating episodes. Novel behavioral interventions increasing inhibitory control could improve eating behavior in BED. We piloted a novel food-specific inhibition training in individuals with BED. N = 22 BED patients according to SCID-I were randomly assigned to three sessions of a training or control condition. In both conditions, pictures of high-caloric food items were presented in peripheral vision on a computer screen while assessing gaze behavior. The training group had to suppress the urge to turn their gaze towards these pictures (i.e., to perform antisaccades). The control group was allowed to freely explore the pictures. We assessed self-reported food craving, food addiction, and wanting/liking of food pictures pre- and post-intervention. Twenty participants completed the study. The training proved to be feasible and acceptable. Patients of the training group significantly improved inhibitory control towards high-caloric food stimuli. Both groups reported a significantly lower number of binge eating episodes in the last four weeks after termination of the study. No changes were found in food craving, food addiction, liking, and wanting ratings. A food-specific inhibition training could be a useful element in the treatment of BED and other eating disorders; however, larger efficacy studies in patient samples are needed to investigate the efficacy of this and similar training approaches.

  16. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    Science.gov (United States)

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853

  17. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream.

    Science.gov (United States)

    Martin, Chris B; Douglas, Danielle; Newsome, Rachel N; Man, Louisa Ly; Barense, Morgan D

    2018-02-02

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. © 2018, Martin et al.
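
    The comparison of neural pattern similarities with behaviour-based models is typically implemented as representational similarity analysis. The following generic, hedged sketch (placeholder random data, not the authors' pipeline) builds a neural representational dissimilarity matrix from object-evoked patterns and rank-correlates it with a model-derived one:

```python
# Generic representational-similarity sketch on placeholder data.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_objects, n_voxels = 20, 100
neural_patterns = rng.normal(size=(n_objects, n_voxels))    # object x voxel responses (placeholder)
model_dissim = rng.normal(size=(n_objects, n_objects))      # e.g. behaviour-based dissimilarities
model_dissim = (model_dissim + model_dissim.T) / 2          # symmetrise the placeholder model

neural_rdm = pdist(neural_patterns, metric="correlation")   # condensed upper triangle (1 - r)
model_rdm = model_dissim[np.triu_indices(n_objects, k=1)]   # matching upper triangle, same order

rho, p = spearmanr(neural_rdm, model_rdm)
print(f"model-neural RDM rank correlation: rho={rho:.3f}, p={p:.3f}")
```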

  18. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  19. {sup 99m}Tc-N.T.P. 15-5 imaging for the early and specific diagnosis of chondrosarcoma: proof of concept in rats

    Energy Technology Data Exchange (ETDEWEB)

    Miot-Noirault, E.; Vidal, A.; Rapp, M.; Madelmont, J.C.; Maublant, J.; Moins, N. [Institut National de la Sante et de la Recherche Medicale (INSERM), UMR 484, 63 - Clermont Ferrand (France); Redini, F.; Gouin, F.; Heymann, D. [Institut National de la Sante et de la Recherche Medicale (INSERM), ERI 7, EA 3822, 44 - Nantes (France)

    2008-02-15

    Aim. - The U.M.R. 484 I.N.S.E.R.M. develops a 'cartilage imaging strategy' with the {sup 99m}Tc-N.T.P. 15-5 tracer that selectively binds to cartilage proteoglycans, allowing a highly specific cartilage imaging. Since chondrogenic tumours are characterized by the presence of cartilaginous matrix, we hypothesized that the {sup 99m}Tc-N.T.P. 15-5 tracer would allow chondrosarcoma imaging, which is currently lacking in clinics. In the rat model of grade II para-tibial chondrosarcoma, we evaluated the relevance of {sup 99m}Tc-N.T.P. 15-5 imaging for an early and specific diagnosis of chondrosarcoma. Methods. - {sup 99m}Tc-N.T.P. 15-5 longitudinal imaging of animals was performed during two months after para-tibial orthotopic tumour implantation in the right paw, the left being used as control. At regular intervals, animals were submitted to {sup 99m}Tc-M.D.P. bone imaging, the only examination used for SPECT diagnosis of chondrosarcoma in patients. Tumour volume was monitored for two months when the tumours became palpable, with the two perpendicular diameters measured. For both cartilage and bone imaging, the scans were considered positive when areas of tracer uptake were present at sites consistent with the sites of inoculation. For each animal, positive scans were analyzed at each stage using the semiquantitative method of the target to background ratio (T.B.R.), with the target R.O.I. being delineated over the tumour and background R.O.I. over vertebra and muscle. T.B.R. time-course was followed as a function of tumour growth. At study ending, each animal was sacrificed for histopathological control. Results. - {sup 99m}Tc-N.T.P. 15-5 scans were positive in 100% of the animals at very early stage (three days) after implantation, while no palpable nor measurable tumour could be assessed. Quantitative analysis of {sup 99m}Tc-N.T.P. 15-5 scans evidenced a significant uptake of the tracer at the implantation site at early stage. The time-course of T

  20. Use of electromyography to optimize Lokomat® settings for subject-specific gait rehabilitation in post-stroke hemiparetic patients: A proof-of-concept study.

    Science.gov (United States)

    Cherni, Yosra; Begon, Mickael; Chababe, Hicham; Moissenet, Florent

    2017-09-01

    While generic protocols exist for gait rehabilitation using robotic orthotics such as the Lokomat®, several settings - guidance, body-weight support (BWS) and velocity - may be adjusted to individualize patient training. However, no systematic approach has yet emerged. Our objective was to assess the feasibility and effects of a systematic approach based on electromyography to determine subject-specific settings, with application to the strengthening of the gluteus maximus muscle in post-stroke hemiparetic patients. Two male patients (61 and 65 years) with post-stroke hemiparesis performed up to 9 Lokomat® trials by changing guidance and BWS while electromyography of the gluteus maximus was measured. For each subject, the settings that maximized gluteus maximus activity were used in 20 sessions of Lokomat® training. Modified Functional Ambulation Classification (mFAC), 6-minute walking test (6-MWT), and extensor strength were measured before and after training. The greatest gluteus maximus activity was observed at (Guidance: 70% - BWS: 20%) for Patient 1 and (Guidance: 80% - BWS: 30%) for Patient 2. In both patients, mFAC score increased from 4 to 7. The additional distance in 6-MWT increased beyond the minimal clinically important difference (MCID=34.4m) reported for post-stroke patients. The isometric strength of hip extensors increased by 43 and 114%. Defining subject-specific settings for Lokomat® training was feasible and simple to implement. These two case reports suggest a benefit of this approach for muscle strengthening. It remains to demonstrate the superiority of such an approach for a wider population, compared to the use of a generic protocol. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  1. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ► We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ► We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ► We find and classify all 2D homological stabilizer codes. ► We find optimal codes among the homological stabilizer codes.
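
    For readers unfamiliar with the toric code referred to above, the following hedged sketch (a standard construction, not code from the paper) builds the stabilizer supports of Kitaev's toric code on a small torus and checks the defining commutation property between X-type and Z-type stabilizers:

```python
# Toric-code stabilizers on an L x L torus with qubits on edges (2*L*L qubits).
import itertools

L = 3

def edge(x, y, direction):
    # direction 0: edge from (x, y) to (x+1, y); direction 1: edge from (x, y) to (x, y+1)
    return 2 * ((x % L) * L + (y % L)) + direction

def vertex_star(x, y):                  # support of the X-type stabilizer A_v
    return {edge(x, y, 0), edge(x, y, 1), edge(x - 1, y, 0), edge(x, y - 1, 1)}

def plaquette_boundary(x, y):           # support of the Z-type stabilizer B_p
    return {edge(x, y, 0), edge(x, y, 1), edge(x, y + 1, 0), edge(x + 1, y, 1)}

coords = list(itertools.product(range(L), range(L)))
for (vx, vy), (px, py) in itertools.product(coords, coords):
    overlap = len(vertex_star(vx, vy) & plaquette_boundary(px, py))
    assert overlap % 2 == 0             # even overlap: X and Z stabilizers commute

# 2*L*L physical qubits, 2*L*L - 2 independent stabilizers -> 2 encoded logical qubits
print(f"{2 * L * L} physical qubits, 2 logical qubits encoded")
```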

  2. Mathematical concepts

    CERN Document Server

    Jost, Jürgen

    2015-01-01

    The main intention of this book is to describe and develop the conceptual, structural and abstract thinking of mathematics. Specific mathematical structures are used to illustrate the conceptual approach; providing a deeper insight into mutual relationships and abstract common features. These ideas are carefully motivated, explained and illustrated by examples so that many of the more technical proofs can be omitted. The book can therefore be used: · simply as an overview of the panorama of mathematical structures and the relations between them, to be supplemented by more detailed texts whenever you want to acquire a working knowledge of some structure; · by itself as a first introduction to abstract mathematics; · together with existing textbooks, to put their results into a more general perspective; · to gain a new and hopefully deeper perspective after having studied such textbooks. Mathematical Concepts has a broader scope and is less detaile...

  3. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  4. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  5. Non-Coding Transcript Heterogeneity in Mesothelioma: Insights from Asbestos-Exposed Mice.

    Science.gov (United States)

    Felley-Bosco, Emanuela; Rehrauer, Hubert

    2018-04-11

    Mesothelioma is an aggressive, rapidly fatal cancer and a better understanding of its molecular heterogeneity may help with making more efficient therapeutic strategies. Non-coding RNAs represent a larger part of the transcriptome but their contribution to diseases is not fully understood yet. We used recently obtained RNA-seq data from asbestos-exposed mice and performed data mining of publicly available datasets in order to evaluate how non-coding RNA contribute to mesothelioma heterogeneity. Nine non-coding RNAs are specifically elevated in mesothelioma tumors and contribute to human mesothelioma heterogeneity. Because some of them have known oncogenic properties, this study supports the concept of non-coding RNAs as cancer progenitor genes.

  6. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  7. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  9. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  10. Folic acid induces cell type-specific changes in the transcriptome of breast cancer cell lines: a proof-of-concept study.

    Science.gov (United States)

    Price, R Jordan; Lillycrop, Karen A; Burdge, Graham C

    2016-01-01

    The effect of folic acid (FA) on breast cancer (BC) risk is uncertain. We hypothesised that this uncertainty may be due, in part, to differential effects of FA between BC cells with different phenotypes. To test this we investigated the effect of treatment with FA concentrations within the range of unmetabolised FA reported in humans on the expression of the transcriptome of non-transformed (MCF10A) and cancerous (MCF7 and Hs578T) BC cells. The total number of transcripts altered was: MCF10A, seventy-five (seventy up-regulated); MCF7, twenty-four (fourteen up-regulated); and Hs578T, 328 (156 up-regulated). Only the cancer-associated gene TAGLN was altered by FA in all three cell lines. In MCF10A and Hs578T cells, FA treatment decreased pathways associated with apoptosis, cell death and senescence, but increased those associated with cell proliferation. The folate transporters SLC19A1, SLC46A1 and FOLR1 were differentially expressed between cell lines tested. However, the level of expression was not altered by FA treatment. These findings suggest that physiological concentrations of FA can induce cell type-specific changes in gene regulation in a manner that is consistent with proliferative phenotype. This has implications for understanding the role of FA in BC risk. In addition, these findings support the suggestion that differences in gene expression induced by FA may involve differential activities of folate transporters. Together these findings indicate the need for further studies of the effect of FA on BC.

  11. Concepts and theoretical specifications of a Coastal Vulnerability Dynamic Simulator (COVUDS): A multi-agent system for simulating coastal vulnerability towards management of coastal ecosystem services

    Science.gov (United States)

    Orencio, P. M.; Endo, A.; Taniguchi, M.

    2014-12-01

    Disaster-causing natural hazards such as floods, erosion, earthquakes or slope failures are concentrated in certain geographical regions. In the Asia-Pacific region, coastal ecosystems suffer from perennial threats driven by chronic fluctuations in climate variability (e.g., typhoons, ENSO) or by dynamically occurring events (e.g., earthquakes, tsunamis). Among the people most exposed to such risks are those inhabiting coastal areas. Besides being located at the forefront of these events, coastal communities also affect the resource base through the behavioral patterns they exhibit, such as overdependence and overexploitation in pursuit of their wellbeing. In this paper, we introduce an approach to assessing the coupled human-environment system using a multi-agent simulation (MAS) model known as the Coastal Vulnerability Dynamic Simulator (COVUDS). COVUDS comprises a human-environmental platform consisting of multiple agents with corresponding spatially based dynamic and static variables. These variables are used to present multiple hypothetical future situations that support a more rational management of the coastal ecosystem and its environmental equities. Initially, we present the theoretical and conceptual components that lead to the development of COVUDS. These consist of the human population engaged in behavioral patterns affecting the conditions of coastal ecosystem services; the system of the biophysical environment and changes in patches brought about by global environmental and local behavioral variations; the policy factors that are important for choosing area-specific interventions; and the decision-making mechanism that integrates the first three components. To guide a future scenario-based application that will be undertaken in a coastal area in the Philippines, the components of the

  12. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  13. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
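    As a reading aid, the generative model sketched in this abstract can be made concrete with a toy example. The Python snippet below is not the authors' truncated EM algorithm; it simply enumerates all configurations of a few discrete latents (with an arbitrary sparse prior) and computes the exact posterior for one simulated data point, which is only feasible because the model is tiny.

    ```python
    # Toy illustration of sparse coding with discrete latents: y = W s + noise,
    # where each latent takes values in a small finite set and 0 means "inactive".
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    D, H = 4, 3                            # observed dimension, number of latents
    values = np.array([0.0, 1.0, 2.0])     # finite set of latent values (0 = inactive)
    pi = np.array([0.8, 0.15, 0.05])       # sparse prior over those values (assumed)
    W = rng.normal(size=(D, H))            # dictionary / generative weights
    sigma = 0.1                            # observation noise standard deviation

    # Draw one data point from the generative model.
    s_true = rng.choice(values, size=H, p=pi)
    y = W @ s_true + rng.normal(scale=sigma, size=D)

    # Exact posterior over all |values|**H latent configurations (feasible only
    # because the model is tiny; the paper scales this up with truncated EM).
    configs = np.array(list(itertools.product(values, repeat=H)))
    log_prior = np.log(pi)[np.searchsorted(values, configs)].sum(axis=1)
    resid = y - configs @ W.T
    log_lik = -0.5 * (resid ** 2).sum(axis=1) / sigma ** 2
    log_post = log_prior + log_lik
    post = np.exp(log_post - log_post.max())
    post /= post.sum()

    print("true latents :", s_true)
    print("MAP estimate :", configs[np.argmax(post)])
    ```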

  14. COMET concept

    International Nuclear Information System (INIS)

    Alsmeyer, H.; Tromm, W.

    1995-01-01

    Studies of the COMET core catcher concept developed for a future PWR have been continued. The concept is based on the spreading of a core melt on a sacrificial layer and its erosion, until a subsequent addition of water from below causes a fragmentation of the melt. A porous solidification of the melt would then admit a complete flooding within a short period. (orig.)

  15. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  16. Management concepts.

    Science.gov (United States)

    Bittner, Rhonda

    2006-01-01

    Management concepts evolve through time. Health care managers can learn new concepts by evaluating classical management strategies, as well as modern-day strategies. Focusing on quality improvement and team building can help managers align the goals of their departments with the goals of the organization, consequently improving patient care.

  17. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
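    The mechanics of vector network coding described above can be sketched in a few lines. The snippet below uses a prime field GF(p) and randomly chosen coding matrices purely for illustration; the paper itself is concerned with jointly optimizing the field size and the coefficients, which this sketch does not attempt.

    ```python
    # Sketch: an intermediate node combines two length-L packet vectors by
    # multiplying each with an L x L coding matrix over GF(p) and adding them.
    import numpy as np

    p = 257          # prime field size (an assumption for this sketch)
    L = 4            # packet (vector) length
    rng = np.random.default_rng(1)

    def random_invertible(n, p, rng):
        """Random matrix over GF(p) that is invertible by construction
        (product of a unit lower- and a unit upper-triangular matrix)."""
        lower = np.tril(rng.integers(0, p, size=(n, n)), -1) + np.eye(n, dtype=int)
        upper = np.triu(rng.integers(0, p, size=(n, n)), 1) + np.eye(n, dtype=int)
        return (lower @ upper) % p

    def inv_mod(M, p):
        """Matrix inverse over GF(p) via Gauss-Jordan elimination."""
        n = M.shape[0]
        aug = np.concatenate([M % p, np.eye(n, dtype=int)], axis=1)
        for col in range(n):
            pivot = next(r for r in range(col, n) if aug[r, col] % p != 0)
            aug[[col, pivot]] = aug[[pivot, col]]
            aug[col] = (aug[col] * pow(int(aug[col, col]), -1, p)) % p
            for r in range(n):
                if r != col:
                    aug[r] = (aug[r] - aug[r, col] * aug[col]) % p
        return aug[:, n:]

    x1 = rng.integers(0, p, size=L)        # incoming packet from upstream node 1
    x2 = rng.integers(0, p, size=L)        # incoming packet from upstream node 2
    A = random_invertible(L, p, rng)       # L x L coding matrix applied to x1
    B = rng.integers(0, p, size=(L, L))    # L x L coding matrix applied to x2

    y = (A @ x1 + B @ x2) % p              # coded packet forwarded downstream

    # A sink that also knows x2 (or another independent combination) recovers x1.
    x1_recovered = (inv_mod(A, p) @ ((y - B @ x2) % p)) % p
    print(bool(np.all(x1_recovered == x1)))   # -> True
    ```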

  18. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...
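    A hedged illustration of the context-adaptive idea behind CABAC follows. It is emphatically not the H.264/HEVC probability state machine or bin coder; it keeps one adaptive probability estimate per context and accumulates the ideal code length -log2(p) that a binary arithmetic coder would approach, which is enough to show why separate contexts shrink the coded size of a skewed bin stream.

    ```python
    # Simplified context-adaptive probability modelling (NOT the HEVC state machine).
    import math
    import random
    from collections import defaultdict

    ALPHA = 0.05   # adaptation rate (assumption; real CABAC uses table-driven updates)

    def adaptive_code_length(bins_with_context):
        """Sum the ideal code length -log2(p) of (context_id, bin) pairs while the
        per-context probability estimates adapt to the observed bins."""
        prob_one = defaultdict(lambda: 0.5)      # per-context estimate of P(bin == 1)
        total_bits = 0.0
        for ctx, b in bins_with_context:
            p = prob_one[ctx] if b == 1 else 1.0 - prob_one[ctx]
            total_bits += -math.log2(max(p, 1e-12))
            prob_one[ctx] += ALPHA * (b - prob_one[ctx])   # move estimate toward bin
        return total_bits

    # A skewed bin stream: one syntax element is mostly 1, another is mostly 0.
    random.seed(0)
    stream = [("sig", int(random.random() < 0.9)) for _ in range(500)] + \
             [("zero", int(random.random() < 0.1)) for _ in range(500)]
    mixed = [("all", b) for _, b in stream]      # same bins, single shared context

    print("one shared context :", round(adaptive_code_length(mixed), 1), "bits")
    print("separate contexts  :", round(adaptive_code_length(stream), 1), "bits")
    ```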

  19. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  20. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    The probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point
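    To make the quantities mentioned in this abstract concrete, the sketch below estimates the failure probability of a simple limit state g = R - S by Monte Carlo and converts it to a generalized reliability index. The distributions and parameters are arbitrary assumptions for illustration and are not taken from the paper.

    ```python
    # Illustrative sketch of the basic quantities a probabilistic code format works
    # with: failure probability Pf and the reliability index beta = -Phi^{-1}(Pf).
    import numpy as np
    from statistics import NormalDist

    rng = np.random.default_rng(42)
    n = 1_000_000

    R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # resistance (assumed)
    S = rng.normal(loc=180.0, scale=30.0, size=n)               # load effect (assumed)

    pf = float(np.mean(R - S < 0.0))              # estimated failure probability
    beta = -NormalDist().inv_cdf(max(pf, 1e-12))  # generalized reliability index

    print(f"Pf   ~ {pf:.2e}")
    print(f"beta ~ {beta:.2f}")
    ```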

  1. Concept theory

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2009-01-01

    Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism...

  2. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  3. Coded communications with nonideal interleaving

    Science.gov (United States)

    Laufer, Shaul

    1991-02-01

    Burst error channels - a type of block interference channels - feature increasing capacity but decreasing cutoff rate as the memory rate increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of an asymmetrical background noise and a reduced collision error rate were studied, as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft decision diversity combining technique is introduced.

  4. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  5. Error-Rate Bounds for Coded PPM on a Poisson Channel

    Science.gov (United States)

    Moision, Bruce; Hamkins, Jon

    2009-01-01

    Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
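    The channel model referred to in this abstract can be illustrated with a short simulation. The sketch below estimates the uncoded M-ary PPM symbol error rate on a Poisson channel with maximum-count detection; it does not reproduce the paper's bounds for the coded (accumulate-PPM) case, and the photon counts ns, nb and the PPM order M are arbitrary assumptions.

    ```python
    # Toy Monte Carlo for M-ary PPM on a Poisson channel: one slot carries the pulse
    # (Poisson(ns + nb) counts), the other M-1 slots see background only (Poisson(nb)).
    import numpy as np

    rng = np.random.default_rng(7)

    def ppm_symbol_error_rate(M=16, ns=5.0, nb=0.2, trials=200_000):
        """Estimate the uncoded symbol error rate with maximum-count (ML) detection."""
        counts = rng.poisson(nb, size=(trials, M))           # background-only slots
        counts[:, 0] = rng.poisson(ns + nb, size=trials)     # pulse placed in slot 0
        jitter = rng.random((trials, M)) * 1e-6              # break count ties randomly
        decided = np.argmax(counts + jitter, axis=1)
        return float(np.mean(decided != 0))

    for ns in (1.0, 2.0, 5.0):
        print(f"ns = {ns:3.1f} signal photons -> SER ~ {ppm_symbol_error_rate(ns=ns):.3e}")
    ```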

  6. Data governance implementation concept

    OpenAIRE

    Ullrichová, Jana

    2016-01-01

    This master's thesis discusses a concept of implementation for data governance. The theoretical part of this thesis is about data governance. It explains why data are important for a company, describes definitions of data governance, its history, its components, its principles and processes, and its fit within a company. The theoretical part is amended with examples of data governance failures and banking specifics. The main goal of this thesis is to create a concept for implementing data governance and its...

  7. QUIL: a chemical equilibrium code

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-02-01

    A chemical equilibrium code QUIL is described, along with two support codes FENG and SURF. QUIL is designed to allow calculations on a wide range of chemical environments, which may include surface phases. QUIL was written specifically to calculate distributions associated with complex equilibria involving fission products in the primary coolant loop of the high-temperature gas-cooled reactor. QUIL depends upon an energy-data library called ELIB. This library is maintained by FENG and SURF. FENG enters into the library all reactions having standard free energies of reaction that are independent of concentration. SURF enters all surface reactions into ELIB. All three codes are interactive codes written to be used from a remote terminal, with paging control provided. Plotted output is also available

  8. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds

  9. Cost reducing code implementation strategies

    International Nuclear Information System (INIS)

    Kurtz, Randall L.; Griswold, Michael E.; Jones, Gary C.; Daley, Thomas J.

    1995-01-01

    Sargent and Lundy's Code consulting experience reveals a wide variety of approaches toward implementing the requirements of various nuclear Codes and Standards. This paper will describe various Code implementation strategies which assure that Code requirements are fully met in a practical and cost-effective manner. Applications to be discussed include the following: new construction; repair, replacement and modifications; assessments and life extensions. Lessons learned and illustrative examples will be included. Preferred strategies and specific recommendations will also be addressed. Sargent and Lundy appreciates the opportunity provided by the Korea Atomic Industrial Forum and Korean Nuclear Society to share our ideas and enhance global cooperation through the exchange of information and views on relevant topics.

  10. Circular codes revisited: a statistical approach.

    Science.gov (United States)

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and underwent a rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes with reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
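    The "covering capability" statistic discussed above can be sketched as follows: for a given trinucleotide set and a DNA sequence, count the fraction of codon positions in each of the three reading frames that belong to the set. The codon set and sequence below are small invented placeholders, not the 20-trinucleotide code of Arquès and Michel.

    ```python
    # Sketch: coverage of a trinucleotide set in each of the three reading frames.
    def frame_coverage(seq, code_set):
        """Fraction of codons belonging to code_set in each of the three frames."""
        coverages = []
        for frame in range(3):
            codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
            hits = sum(c in code_set for c in codons)
            coverages.append(hits / len(codons) if codons else 0.0)
        return coverages

    CODE_SET = {"GAC", "GAG", "GCC", "CTG", "AAC"}   # placeholder trinucleotides only
    SEQ = "GACGAGGCCCTGAACGACGAGGCCCTGAACATGTTT"     # made-up coding-like sequence

    for frame, cov in enumerate(frame_coverage(SEQ, CODE_SET)):
        print(f"reading frame {frame}: coverage = {cov:.2f}")
    ```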

  11. Clinical reasoning: concept analysis.

    Science.gov (United States)

    Simmons, Barbara

    2010-05-01

    This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.

  12. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  13. ITER concept definition. V.2

    International Nuclear Information System (INIS)

    1989-01-01

    Volume II of the two volumes describing the concept definition of the International Thermonuclear Experimental Reactor deals with the ITER concept in technical depth, and covers all areas of design of the ITER tokamak. Included are an assessment of the current database for design, scoping studies, rationale for concepts selection, performance flexibility, the ITER concept, the operations and experimental/testing program, ITER parameters and design phase schedule, and research and development specific to ITER. This latter includes a definition of specific research and development tasks, a division of tasks among members, specific milestones, required results, and schedules. Figs and tabs

  14. The code of ethics for nurses.

    Science.gov (United States)

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Nurses are ever-increasingly confronted with complex concerns in their practice. Codes of ethics are fundamental guidance for nursing as many other professions. Although there are authentic international codes of ethics for nurses, the national code would be the additional assistance provided for clinical nurses in their complex roles in care of patients, education, research and management of some parts of health care system in the country. A national code can provide nurses with culturally-adapted guidance and help them to make ethical decisions more closely to the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended and approved. The main concepts of the code will be also presented here. No doubt, development of the codes should be considered as an ongoing process. This is an overall responsibility to keep the codes current, updated with the new progresses of science and emerging challenges, and pertinent to the nursing practice.

  15. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  16. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64 bit on Mac, Linux and Windows.

  17. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.

  18. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  19. Travelling Concepts

    DEFF Research Database (Denmark)

    Simonsen, Karen-Margrethe

    2013-01-01

    Review of "Travelling Concepts, Metaphors, and Narratives: Literary and Cultural Studies in an Age of Interdisciplinary Research" ed. by Sibylle Baumgarten, Beatrice Michaelis and Ansagar Nünning, Trier; Wissenschaftlicher Verlag Trier, 2012......Review of "Travelling Concepts, Metaphors, and Narratives: Literary and Cultural Studies in an Age of Interdisciplinary Research" ed. by Sibylle Baumgarten, Beatrice Michaelis and Ansagar Nünning, Trier; Wissenschaftlicher Verlag Trier, 2012...

  20. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation), to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems including drinking contaminated groundwater, and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities, and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to easily interface with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.

  1. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  2. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    Science.gov (United States)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing. The relations between the aeroelastic properties of these new large turbines change, and modifications of turbine designs and control concepts are also influenced by the growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models, and this is done relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and interest in design optimization is growing.
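    An analogous, heavily simplified version of this workflow can be sketched with SymPy instead of the Mathematica-based tool chain used for VIDYN: derive an equation of motion from a Lagrangian symbolically and emit Fortran source for the resulting expression. The single-degree-of-freedom pendulum below is only a stand-in for a wind turbine model, and the snippet assumes a reasonably recent SymPy.

    ```python
    # Sketch: symbolic Euler-Lagrange derivation plus Fortran code emission with SymPy.
    import sympy as sp

    t = sp.symbols("t")
    m, l, g = sp.symbols("m l g", positive=True)
    th, thd = sp.symbols("theta thetad")       # generalized coordinate and velocity

    # Lagrangian of a simple pendulum: L = T - V = (1/2) m (l*thd)^2 + m g l cos(theta)
    L = sp.Rational(1, 2) * m * (l * thd) ** 2 + m * g * l * sp.cos(th)

    # Euler-Lagrange equation d/dt(dL/d(thd)) - dL/d(th) = 0, written with theta(t).
    theta = sp.Function("theta")(t)
    subs_t = {th: theta, thd: sp.diff(theta, t)}
    eom = sp.diff(sp.diff(L, thd).subs(subs_t), t) - sp.diff(L, th).subs(subs_t)

    # Solve for the angular acceleration and emit Fortran for the right-hand side.
    theta_ddot = sp.solve(sp.Eq(eom, 0), sp.diff(theta, t, 2))[0]
    rhs = theta_ddot.subs(theta, th)           # back to a plain symbol for clean output
    print(sp.fcode(rhs, assign_to="thetadd", standard=95, source_format="free"))
    ```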

  3. Promoting Transfer of Ecosystems Concepts

    Science.gov (United States)

    Yu, Yawen; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Eberbach, Catherine; Sinha, Suparna

    2016-01-01

    This study examines to what extent students transferred their knowledge from a familiar aquatic ecosystem to an unfamiliar rainforest ecosystem after participating in a technology-rich inquiry curriculum. We coded students' drawings for components of important ecosystems concepts at pre- and posttest. Our analysis examined the extent to which each…

  4. Code-Mixing and Code Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the specific forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors influencing the occurrence of those forms of code switching and code mixing. The research takes the form of a descriptive qualitative case study which took place in Al Mawaddah Boarding School Ponorogo. Based on the analysis and discussion stated in the previous chapter, the forms of code mixing and code switching in learning activities at Al Mawaddah Boarding School involve the use of Javanese, Arabic, English and Indonesian, including the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah boarding school, with respect to the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  5. OpenDMAP: An open source, ontology-driven concept analysis engine, with applications to capturing knowledge regarding protein transport, protein interactions and cell-type-specific gene expression

    Directory of Open Access Journals (Sweden)

    Johnson Helen L

    2008-01-01

    Full Text Available Abstract Background Information extraction (IE) efforts are widely acknowledged to be important in harnessing the rapid advance of biomedical knowledge, particularly in areas where important factual information is published in a diverse literature. Here we report on the design, implementation and several evaluations of OpenDMAP, an ontology-driven, integrated concept analysis system. It significantly advances the state of the art in information extraction by leveraging knowledge in ontological resources, integrating diverse text processing applications, and using an expanded pattern language that allows the mixing of syntactic and semantic elements and variable ordering. Results OpenDMAP information extraction systems were produced for extracting protein transport assertions (transport), protein-protein interaction assertions (interaction) and assertions that a gene is expressed in a cell type (expression). Evaluations were performed on each system, resulting in F-scores ranging from .26 – .72 (precision .39 – .85, recall .16 – .85). Additionally, each of these systems was run over all abstracts in MEDLINE, producing a total of 72,460 transport instances, 265,795 interaction instances and 176,153 expression instances. Conclusion OpenDMAP advances the performance standards for extracting protein-protein interaction predications from the full texts of biomedical research articles. Furthermore, this level of performance appears to generalize to other information extraction tasks, including extracting information about predicates of more than two arguments. The output of the information extraction system is always constructed from elements of an ontology, ensuring that the knowledge representation is grounded with respect to a carefully constructed model of reality. The results of these efforts can be used to increase the efficiency of manual curation efforts and to provide additional features in systems that integrate multiple sources for

  6. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  7. Space Mission Concept Development Using Concept Maturity Levels

    Science.gov (United States)

    Wessen, Randii R.; Borden, Chester; Ziemer, John; Kwok, Johnny

    2013-01-01

    Over the past five years, pre-project formulation experts at the Jet Propulsion Laboratory (JPL) have developed and implemented a method for measuring and communicating the maturity of space mission concepts. Mission concept development teams use this method, and associated tools, prior to concepts entering their Formulation Phases (Phase A/B). The organizing structure is the Concept Maturity Level (CML), which is a classification system for characterizing the various levels of a concept's maturity. The key strength of CMLs is the ability to evolve mission concepts guided by an incremental set of assessment needs. The CML definitions have been expanded into a matrix form to identify the breadth and depth of analysis needed for a concept to reach a specific level of maturity. This matrix enables improved assessment and communication by addressing the fundamental dimensions (e.g., science objectives, mission design, technical risk, project organization, cost, export compliance, etc.) associated with mission concept evolution. JPL's collaborative engineering, dedicated concept development, and proposal teams all use these and other CML-appropriate design tools to advance their mission concept designs. This paper focuses on a mission concept's early Pre-Phase A, represented by CMLs 1-4. The scope was limited due to the fact that CMLs 5 and 6 are already well defined based on the requirements documented in specific Announcement of Opportunity (AO) and Concept Study Report (CSR) guidelines, respectively, for competitive missions, and by NASA's Procedural Requirements NPR 7120.5E document for Projects in their Formulation Phase.

  8. Deriving consumer-facing disease concepts for family health histories using multi-source sampling.

    Science.gov (United States)

    Hulse, Nathan C; Wood, Grant M; Haug, Peter J; Williams, Marc S

    2010-10-01

    The family health history has long been recognized as an effective way of understanding individuals' susceptibility to familial disease; yet electronic tools to support the capture and use of these data have been characterized as inadequate. As part of an ongoing effort to build patient-facing tools for entering detailed family health histories, we have compiled a set of concepts specific to familial disease using multi-source sampling. These concepts were abstracted by analyzing family health history data patterns in our enterprise data warehouse, collection patterns of consumer personal health records, analyses from the local state health department, a healthcare data dictionary, and concepts derived from genetic-oriented consumer education materials. Collectively, these sources yielded a set of more than 500 unique disease concepts, represented by more than 2500 synonyms for supporting patients in entering coded family health histories. We expect that these concepts will be useful in providing meaningful data and education resources for patients and providers alike.

  9. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  10. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs advanced error correcting techniques.

  11. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings.

    Science.gov (United States)

    Zhang, Yinsheng; Zhang, Guoming

    2018-01-01

    A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned with a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are lots of well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system-ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.

  12. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings

    Directory of Open Access Journals (Sweden)

    Yinsheng Zhang

    2018-01-01

    Full Text Available A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are lots of well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.
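    A minimal sketch of how such a hierarchical, domain-specific terminology can be used in code is given below. The category names, concept labels and codes are invented for illustration only and are not the actual content of the ROP Minimal Standard Terminology.

    ```python
    # Toy hierarchical terminology: categories -> {code: concept label}.
    ROP_TERMINOLOGY = {
        "ROP-EXAM": {                       # category: examination findings (invented)
            "ROP-EXAM-001": "stage 1 demarcation line",
            "ROP-EXAM-002": "stage 2 ridge",
            "ROP-EXAM-003": "plus disease",
        },
        "ROP-TREAT": {                      # category: treatments (invented)
            "ROP-TREAT-001": "laser photocoagulation",
            "ROP-TREAT-002": "anti-VEGF injection",
        },
    }

    def find_code(term):
        """Map a free-text term to its unique code, if present in the terminology."""
        needle = term.strip().lower()
        for category in ROP_TERMINOLOGY.values():
            for code, concept in category.items():
                if concept == needle:
                    return code
        return None

    print(find_code("Plus disease"))        # -> ROP-EXAM-003
    print(find_code("retinal detachment"))  # -> None (not in this toy terminology)
    ```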

  13. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas

    2011-08-01

    Fixed-rate and rateless channel codes are generally treated separately in the related research literature and so, a novice in the field inevitably gets the impression that these channel codes are unrelated. By contrast, in this treatise, we endeavor to further develop a link between the traditional fixed-rate codes and the recently developed rateless codes by delving into their underlying attributes. This joint treatment is beneficial for two principal reasons. First, it facilitates the task of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well-understood code-design tools, originally contrived for fixed-rate codes, to the realm of rateless codes. Indeed, these versatile tools proved to be vital in the design of diverse fixed-rate-coded communications systems, and thus our hope is that they will further elucidate the associated performance ramifications of the rateless coded schemes. © 2011 IEEE.

  14. Mobile code security

    Science.gov (United States)

    Ramalingam, Srikumar

    2001-11-01

    A highly secure mobile agent system is very important for a mobile computing environment. The security issues in a mobile agent system comprise protecting mobile hosts from malicious agents, protecting agents from other malicious agents, protecting hosts from other malicious hosts and protecting agents from malicious hosts. Using traditional security mechanisms, the first three security problems can be solved. Apart from using trusted hardware, very few approaches exist to protect mobile code from malicious hosts. Some of the approaches to solve this problem are the use of trusted computing, computing with encrypted functions, steganography, cryptographic traces, the Seal Calculus, etc. This paper focuses on the simulation of some of these existing techniques in the designed mobile language. Some new approaches to solve the malicious network problem and the agent tampering problem are developed using public key encryption and steganographic concepts. The approaches are based on encrypting and hiding the partial solutions of the mobile agents. The partial results are stored and the address of the storage is destroyed as the agent moves from one host to another host. This allows only the originator to make use of the partial results. Through these approaches some of the existing problems are solved.

  15. Physical Layer Network Coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Yomo, Hironori; Popovski, Petar

    2013-01-01

    Physical layer network coding (PLNC) has the potential to improve throughput of multi-hop networks. However, most of the works are focused on the simple, three-node model with two-way relaying, not taking into account the fact that there can be other neighboring nodes that can cause/receive interference. The way to deal with this problem in distributed wireless networks is usage of MAC-layer mechanisms that make a spatial reservation of the shared wireless medium, similar to the well-known RTS/CTS in IEEE 802.11 wireless networks. In this paper, we investigate two-way relaying in presence of interfering nodes and usage of spatial reservation mechanisms. Specifically, we introduce a reserved area in order to protect the nodes involved in two-way relaying from the interference caused by neighboring nodes. We analytically derive the end-to-end rate achieved by PLNC considering the impact...

  16. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  17. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
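
    The quantity in the first sentence is commonly written as delta = received reward - predicted value, and the textbook Rescorla-Wagner/temporal-difference rule moves the prediction by a fraction of that error. The sketch below is that standard model, not one taken from the paper; the learning rate and reward values are arbitrary.

      def update(value, reward, alpha=0.2):
          """Rescorla-Wagner style update driven by the prediction error delta."""
          delta = reward - value          # positive: more reward than predicted
          return value + alpha * delta, delta

      value = 0.0
      for trial in range(5):
          value, delta = update(value, reward=1.0)
          print(f"trial {trial}: prediction error {delta:+.3f}, new value {value:.3f}")
      # The error shrinks toward zero as the reward becomes fully predicted, mirroring
      # the return to baseline firing described above.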

  18. Predictive coding in Agency Detection

    DEFF Research Database (Denmark)

    Andersen, Marc Malmdorf

    2017-01-01

    Agency detection is a central concept in the cognitive science of religion (CSR). Experimental studies, however, have so far failed to lend support to some of the most common predictions that follow from current theories on agency detection. In this article, I argue that predictive coding, a highly promising new framework for understanding perception and action, may solve pending theoretical inconsistencies in agency detection research, account for the puzzling experimental findings mentioned above, and provide hypotheses for future experimental testing. Predictive coding explains how the brain, unbeknownst to consciousness, engages in sophisticated Bayesian statistics in an effort to constantly predict the hidden causes of sensory input. My fundamental argument is that most false positives in agency detection can be seen as the result of top-down interference in a Bayesian system generating high...
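
    The Bayesian reading of a false positive can be illustrated with a toy observer. The numbers below are invented and carry no empirical weight; the point is only that, for the same ambiguous evidence, a stronger prior expectation of an agent pushes the posterior toward detecting one.

      def posterior_agent(prior_agent, likelihood_agent, likelihood_wind):
          """Posterior probability that an agent caused the sound, by Bayes' rule."""
          num = likelihood_agent * prior_agent
          return num / (num + likelihood_wind * (1.0 - prior_agent))

      likelihood_agent, likelihood_wind = 0.4, 0.5      # the same ambiguous evidence...
      for prior in (0.1, 0.5, 0.9):                     # ...under different expectations
          p = posterior_agent(prior, likelihood_agent, likelihood_wind)
          print(f"prior {prior:.1f} -> P(agent | sound) = {p:.2f}")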

  19. Effective coding with VHDL principles and best practice

    CERN Document Server

    Jasinski, Ricardo

    2016-01-01

    A guide to applying software design principles and coding practices to VHDL to improve the readability, maintainability, and quality of VHDL code. This book addresses an often-neglected aspect of the creation of VHDL designs. A VHDL description is also source code, and VHDL designers can use the best practices of software development to write high-quality code and to organize it in a design. This book presents this unique set of skills, teaching VHDL designers of all experience levels how to apply the best design principles and coding practices from the software world to the world of hardware. The concepts introduced here will help readers write code that is easier to understand and more likely to be correct, with improved readability, maintainability, and overall quality. After a brief review of VHDL, the book presents fundamental design principles for writing code, discussing such topics as design, quality, architecture, modularity, abstraction, and hierarchy. Building on these concepts, the book then int...

  20. User's manual for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    International Nuclear Information System (INIS)

    Dougan, K.D.; Wojciechowski, L.C.

    1995-06-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is represented by the CC3 (Canadian Concept, generation 3) models comprising a design-specific vault, a site-specific geosphere and a biosphere typical of the Canadian Shield. (author). 23 refs., 7 tabs., 21 figs

  1. Two Conceptions of Virtue

    Science.gov (United States)

    Hill, Thomas E., Jr.

    2013-01-01

    The general questions are: what is virtue and how can it be cultivated? The specific focus is on the conceptions of virtue in the works of Immanuel Kant and John Rawls. Kant regarded virtue as a good will that is also strong enough to resist contrary passions, impulses, and inclinations. Childhood training can prepare children for virtue, but…

  2. Basic concepts

    International Nuclear Information System (INIS)

    Dorner, B.

    1999-01-01

    The basic concepts of neutron scattering as a tool for studying the structure and dynamics of condensed matter are introduced. Theoretical aspects are outlined, and the two different cases of coherent and incoherent scattering are presented. The issues of resolution and coherence volume and the role of monochromators are also discussed. (K.A.)

  3. Simple Concepts

    Czech Academy of Sciences Publication Activity Database

    Materna, Pavel

    2013-01-01

    Vol. 28, No. 3 (2013), pp. 295-319. ISSN 0353-5150. R&D Projects: GA ČR(CZ) GAP401/10/0792. Institutional support: RVO:67985955. Keywords: concept; constructions; set-theoretical paradigm. Subject RIV: AA - Philosophy; Religion

  4. Monte Carlo code for neutron radiography

    International Nuclear Information System (INIS)

    Milczarek, Jacek J.; Trzcinski, Andrzej; El-Ghany El Abd, Abd; Czachor, Andrzej

    2005-01-01

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms

  5. Monte Carlo code for neutron radiography

    Energy Technology Data Exchange (ETDEWEB)

    Milczarek, Jacek J. [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)]. E-mail: jjmilcz@cyf.gov.pl; Trzcinski, Andrzej [Institute for Nuclear Studies, Swierk, 05-400 Otwock (Poland); El-Ghany El Abd, Abd [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland); Nuclear Research Center, PC 13759, Cairo (Egypt); Czachor, Andrzej [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)

    2005-04-21

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms.
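
    A concise Monte Carlo transport kernel of the kind described in the two records above can be sketched in a few lines. The code below is not the MSX code: the slab geometry, cross section and scattering probability are assumed values and the model is a crude one-dimensional pencil beam, but it shows how sampled free paths separate the uncollided image signal from the scattered component that such a code is used to assess.

      import math, random

      def simulate(n, L=2.0, sigma_t=1.0, scatter_prob=0.6, seed=42):
          """1D pencil beam through a slab; returns uncollided and scattered fractions."""
          rng = random.Random(seed)
          uncollided = scattered = 0
          for _ in range(n):
              x, mu, collided = 0.0, 1.0, False                     # position, direction cosine
              while True:
                  x += mu * -math.log(1.0 - rng.random()) / sigma_t # distance to next collision
                  if x >= L:                                        # escaped on the detector side
                      if collided:
                          scattered += 1
                      else:
                          uncollided += 1
                      break
                  if x < 0.0:                                       # scattered back out of the slab
                      break
                  if rng.random() < scatter_prob:                   # scatter: new isotropic direction
                      mu = rng.uniform(-1.0, 1.0) or 1e-9           # avoid a perfectly sideways path
                      collided = True
                  else:                                             # absorbed
                      break
          return uncollided / n, scattered / n

      direct, background = simulate(100_000)
      print(f"uncollided fraction ~ {direct:.3f} (analytic exp(-2) = {math.exp(-2):.3f})")
      print(f"scattered component reaching the detector ~ {background:.3f}")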

  6. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  7. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  8. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies

  9. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  10. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
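
    At its core, combining related measurements and calculations with full covariance information is a generalized least-squares adjustment. The sketch below is the generic textbook formulation, not FERRET itself; the prior values, covariances and the single integral measurement are made-up numbers used only to show how correlated uncertainties propagate into the adjusted result.

      import numpy as np

      def gls_update(x0, P, H, y, R):
          """Adjust prior x0 (covariance P) with measurements y = H x + e (covariance R)."""
          S = H @ P @ H.T + R                   # covariance of the predicted measurements
          K = P @ H.T @ np.linalg.inv(S)        # gain weighting prior against measurement
          x = x0 + K @ (y - H @ x0)             # adjusted estimate
          P_new = P - K @ H @ P                 # reduced, still correlated, uncertainty
          return x, P_new

      # Two correlated calculated quantities and one integral measurement of their sum.
      x0 = np.array([1.0, 2.0])
      P = np.array([[0.04, 0.01],
                    [0.01, 0.09]])
      H = np.array([[1.0, 1.0]])                # the experiment measures x[0] + x[1]
      y = np.array([3.3])
      R = np.array([[0.01]])

      x, P_new = gls_update(x0, P, H, y, R)
      print("adjusted values:", x)
      print("adjusted standard deviations:", np.sqrt(np.diag(P_new)))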

  11. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, the Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visually pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  12. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    The Quick Response code opens the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to the QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, which is followed by comprehensive research on both the QR code itself and related issues. From the research, a solution taking advantage of cloud and cryptography together with an implementation come af...

  13. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A hazard and operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  14. On tentative decommissioning cost analysis with specific authentic cost calculations with the application of the Omega code on a case linked to the Intermediate storage facility for spent fuel in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Vasko, Marek; Daniska, Vladimir; Ondra, Frantisek; Bezak, Peter; Kristofova, Kristina; Tatransky, Peter; Zachar, Matej [DECOM Slovakia, spol. s.r.o., J. Bottu 2, SK-917 01 Trnava (Slovakia); Lindskog, Staffan [Swedish Nuclear Power Inspectorate, Stockholm (Sweden)

    2007-03-15

    The presented report focuses on tentative calculations of basic decommissioning parameters, such as costs, manpower and personnel exposure, for the decommissioning of an older nuclear facility in Sweden, the Intermediate Storage Facility for Spent Fuel in Studsvik, by means of the calculation code OMEGA. This report follows up two previous projects, which described the methodology of decommissioning cost estimation, with an emphasis on deriving cost functions for alpha-contaminated material, and the implementation of the advanced decommissioning costing methodology for the Intermediate Storage Facility for Spent Fuel in Studsvik. The main purpose of the presented study is to demonstrate the trial application of the advanced costing methodology using the OMEGA code for the Intermediate Storage Facility for Spent Fuel in Studsvik. The basic work packages presented in the report are as follows: 1. analysis and validation of input data on the Intermediate Storage Facility for Spent Fuel and assembly of a database suitable for standardised decommissioning cost calculations, including radiological parameters; 2. proposal of the range of decommissioning calculations and definition of the extent of decommissioning activities; 3. definition of waste management scenarios for particular material waste streams from the Intermediate Storage Facility for Spent Fuel; 4. development of a standardised cost calculation structure applied to the Intermediate Storage Facility for Spent Fuel decommissioning calculation; and 5. performance of tentative decommissioning calculations for the Intermediate Storage Facility for Spent Fuel with the OMEGA code. The calculated decommissioning parameters are presented in a structure according to the Proposed Standardized List of Items for Costing Purposes. All parameters are documented and summed up in both table and graphic form in the text and Annexes. The presented report documents the availability and applicability of the methodology for the evaluation of costs and other parameters of decommissioning in a form implemented

  15. On tentative decommissioning cost analysis with specific authentic cost calculations with the application of the Omega code on a case linked to the Intermediate storage facility for spent fuel in Sweden

    International Nuclear Information System (INIS)

    Vasko, Marek; Daniska, Vladimir; Ondra, Frantisek; Bezak, Peter; Kristofova, Kristina; Tatransky, Peter; Zachar, Matej; Lindskog, Staffan

    2007-03-01

    The presented report focuses on tentative calculations of basic decommissioning parameters, such as costs, manpower and personnel exposure, for the decommissioning of an older nuclear facility in Sweden, the Intermediate Storage Facility for Spent Fuel in Studsvik, by means of the calculation code OMEGA. This report follows up two previous projects, which described the methodology of decommissioning cost estimation, with an emphasis on deriving cost functions for alpha-contaminated material, and the implementation of the advanced decommissioning costing methodology for the Intermediate Storage Facility for Spent Fuel in Studsvik. The main purpose of the presented study is to demonstrate the trial application of the advanced costing methodology using the OMEGA code for the Intermediate Storage Facility for Spent Fuel in Studsvik. The basic work packages presented in the report are as follows: 1. analysis and validation of input data on the Intermediate Storage Facility for Spent Fuel and assembly of a database suitable for standardised decommissioning cost calculations, including radiological parameters; 2. proposal of the range of decommissioning calculations and definition of the extent of decommissioning activities; 3. definition of waste management scenarios for particular material waste streams from the Intermediate Storage Facility for Spent Fuel; 4. development of a standardised cost calculation structure applied to the Intermediate Storage Facility for Spent Fuel decommissioning calculation; and 5. performance of tentative decommissioning calculations for the Intermediate Storage Facility for Spent Fuel with the OMEGA code. The calculated decommissioning parameters are presented in a structure according to the Proposed Standardized List of Items for Costing Purposes. All parameters are documented and summed up in both table and graphic form in the text and Annexes. The presented report documents the availability and applicability of the methodology for the evaluation of costs and other parameters of decommissioning in a form implemented

  16. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  17. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  18. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  19. Review of codes, standards, and regulations for natural gas locomotives.

    Science.gov (United States)

    2014-06-01

    This report identified, collected, and summarized relevant international codes, standards, and regulations with potential applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  20. Coding theory and cryptography the essentials

    CERN Document Server

    Hankerson, DC; Leonard, DA; Phelps, KT; Rodger, CA; Wall, JR

    2000-01-01

    Containing data on number theory, encryption schemes, and cyclic codes, this highly successful textbook, proven by the authors in a popular two-quarter course, presents coding theory, construction, encoding, and decoding of specific code families in an "easy-to-use" manner appropriate for students with only a basic background in mathematics, offering revised and updated material on the Berlekamp-Massey decoding algorithm and convolutional codes. Introducing the mathematics as it is needed and providing exercises with solutions, this edition includes an extensive section on cryptography, desig
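
    As a worked example of the kind of construction, encoding and decoding covered by such a course, the sketch below implements the binary Hamming(7,4) code, which corrects any single bit flip. It is a standard textbook construction rather than material reproduced from the book, and the message and error position are arbitrary.

      import numpy as np

      G = np.array([[1, 0, 0, 0, 1, 1, 0],      # generator: codeword = message @ G (mod 2)
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])
      H = np.array([[1, 1, 0, 1, 1, 0, 0],      # parity check: H @ codeword = 0 (mod 2)
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]])

      def encode(msg):
          return (np.array(msg) @ G) % 2

      def decode(received):
          syndrome = (H @ received) % 2
          if syndrome.any():                    # non-zero syndrome: locate the single error
              for pos in range(7):
                  if np.array_equal(H[:, pos], syndrome):
                      received = received.copy()
                      received[pos] ^= 1
                      break
          return received[:4]                   # the first four bits carry the message

      codeword = encode([1, 0, 1, 1])
      corrupted = codeword.copy()
      corrupted[5] ^= 1                         # one bit flipped on the channel
      print("decoded:", decode(corrupted))      # -> [1 0 1 1]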

  1. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas; Yang, Yuli; Aïssa, Sonia; Hanzo, Lajos

    2011-01-01

    of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well

  2. Radio frequency channel coding made easy

    CERN Document Server

    Faruque, Saleh

    2016-01-01

    This book introduces Radio Frequency Channel Coding to a broad audience. The author blends theory and practice to bring readers up-to-date in key concepts, underlying principles and practical applications of wireless communications. The presentation is designed to be easily accessible, minimizing mathematics and maximizing visuals.

  3. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of those 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is thus posed: has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes', and if so, how should this be accomplished?

  4. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  5. Introductory concepts

    International Nuclear Information System (INIS)

    Barnes, W.E.

    1983-01-01

    Physical theories are commonly classified as being either ''classical'' or ''modern''. The reasons for this distinction are both historical and substantive. Limited in the sophistication of their measuring instruments, early scientists proposed theories appropriate for the description of the simplest and most accessible physical phenomena, e.g., the trajectories of the planets. Because of the class of phenomena observed, certain beliefs came to underlie all classical theories with regard to the nature of time, space, matter, etc. For example, the idea was undisputed that an object has at all times both a definite position and velocity. Not until the interior of the atom and the nature of electromagnetic radiation were explored was it discovered that the concepts of classical physics are inadequate to deal with many phenomena. A reassessment of fundamental postulates led to the formulation of modern physics which, it is believed, successfully treats the behavior of all physical systems. To gain an understanding of the rudiments of modern physics, one proceeds as the early scientists did by first mastering the classical concepts that emerge from their intuitive picture of the world. Modifications of these concepts are subsequently introduced which allow a more accurate treatment of physical phenomena, particularly atomic and nuclear systems

  6. Analysis of ASTEC code adaptability to severe accident simulation for CANDU type reactors

    International Nuclear Information System (INIS)

    Constantin, Marin; Rizoiu, Andrei

    2008-01-01

    In order to prepare the adaptation of the ASTEC code to CANDU NPP severe accident analysis, two kinds of activities were performed: analyses of the ASTEC modules from the point of view of models and options, followed by exploratory CANDU calculations for the appropriate modules/models; and preparation of the specifications for ASTEC adaptation to CANDU NPPs. The paper is structured in three parts: a comparison of PWR and CANDU concepts (from the point of view of severe accident phenomena); exploratory calculations with some ASTEC modules (SOPHAEROS, CPA, IODE, CESAR, DIVA) for problems specific to CANDU-type reactors; and an analysis of development needs (algorithms, methods, modules). (authors)

  7. Behavior Analysis Usage with Behavior Tures Adoption for Malicious Code Detection on JAVASCRIPT Scenarios Example

    Directory of Open Access Journals (Sweden)

    Y. M. Tumanov

    2010-03-01

    The article offers a method of malicious JavaScript code detection based on behavior analysis. The concepts of program behavior and program state, and an algorithm for malicious code detection, are described.

  8. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  9. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.
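
    The summary does not spell out the "adaptive noiseless coding" ingredient, but work in this line (the author R. F. Rice originated the Rice codes) builds on simple variable-length codes such as the Golomb-Rice code sketched below. The sketch illustrates the general idea only and is not USEEM itself; it shows how small, frequent values map to short bit strings and how the mapping is inverted losslessly.

      def rice_encode(n, k):
          """Encode a non-negative integer: unary quotient, then k remainder bits."""
          q, r = n >> k, n & ((1 << k) - 1)
          return "1" * q + "0" + format(r, f"0{k}b")

      def rice_decode(bits, k):
          """Invert rice_encode; returns the value and any unread bits."""
          q = 0
          while bits[q] == "1":
              q += 1
          r = int(bits[q + 1:q + 1 + k], 2)
          return (q << k) | r, bits[q + 1 + k:]

      # Small residuals (frequent after prediction) get the shortest codewords.
      for value in (0, 1, 5, 17):
          cw = rice_encode(value, k=2)
          print(f"{value:2d} -> {cw}  (round trip: {rice_decode(cw, 2)[0]})")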

  10. Critical lengths of error events in convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1994-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  11. Critical Lengths of Error Events in Convolutional Codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Andersen, Jakob Dahl

    1998-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  12. Working research codes into fluid dynamics education: a science gateway approach

    Science.gov (United States)

    Mason, Lachlan; Hetherington, James; O'Reilly, Martin; Yong, May; Jersakova, Radka; Grieve, Stuart; Perez-Suarez, David; Klapaukh, Roman; Craster, Richard V.; Matar, Omar K.

    2017-11-01

    Research codes are effective for illustrating complex concepts in educational fluid dynamics courses: compared to textbook examples, an interactive three-dimensional visualisation can bring a problem to life! Various barriers, however, prevent the adoption of research codes in teaching: codes are typically created for highly-specific `once-off' calculations and, as such, have no user interface and a steep learning curve. Moreover, a code may require access to high-performance computing resources that are not readily available in the classroom. This project allows academics to rapidly work research codes into their teaching via a minimalist `science gateway' framework. The gateway is a simple, yet flexible, web interface allowing students to construct and run simulations, as well as view and share their output. Behind the scenes, the common operations of job configuration, submission, monitoring and post-processing are customisable at the level of shell scripting. In this talk, we demonstrate the creation of an example teaching gateway connected to the Code BLUE fluid dynamics software. Student simulations can be run via a third-party cloud computing provider or a local high-performance cluster. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).

  13. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data are retrieved, the translation into plain language of stored coded information is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage memory requirements; and the standardisation of terminology. The nature of this thesaurus-type 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring book system which can be updated by an organised (updating) service. (author)

  14. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  15. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated with the example of the WIMSD code, which belongs to the most popular tools for reactor calculations. Most of the approaches discussed here can be easily modified for any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  16. On the Need of Network coding for Mobile Clouds

    DEFF Research Database (Denmark)

    Fitzek, Frank; Heide, Janus; Pedersen, Morten Videbæk

    This paper advocates the need of network coding for mobile clouds. Mobile clouds as well as network coding are describing two novel concepts. The concept of mobile clouds describes the potential of mobile devices to communicate with each other and form a cooperative cluster in which new services and potentials are created. Network coding on the other side enables the mobile cloud to communicate in a very efficient and secure way in terms of energy and bandwidth usage. Even though network coding can be applied in a variety of communication networks, it has some inherent features that makes it suitable for mobile clouds. The paper will list the benefits of network coding for mobile clouds as well as introduce both concepts in a tutorial way. The results used throughout this paper are collaborative work of different research institutes, but mainly taken from the mobile device group at Aalborg University.
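
    A minimal example of the efficiency gain is the classic two-way relay exchange sketched below. It is a generic illustration, not a result from the paper; the packet contents are invented, and the point is only that a single XOR-coded broadcast lets both devices recover what they are missing using side information they already hold.

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      packet_a = b"photo-chunk-17 "      # held by device A, wanted by device B
      packet_b = b"song-chunk-042 "      # held by device B, wanted by device A

      relay_broadcast = xor_bytes(packet_a, packet_b)   # one transmission instead of two

      # Each device decodes using the packet it already holds as side information.
      print("B recovers:", xor_bytes(relay_broadcast, packet_b))
      print("A recovers:", xor_bytes(relay_broadcast, packet_a))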

  17. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  18. GOTHIC code evaluation of alternative passive containment cooling features

    International Nuclear Information System (INIS)

    Gavrilas, M.; Todreas, E.N.; Driscoll, M.J.

    1996-01-01

    Reliance on passive cooling has become an important objective in containment design. Several reactor concepts have been set forth, which are equipped with entirely passively cooled containments. However, the problems that have to be overcome in rejecting the entire heat generated by a severe accident in a high-rating reactor (i.e. one with a rating greater than 1200 MW(e)) have been found to be substantial and without obvious solutions. The GOTHIC code was verified and modified for containment cooling applications; optimal mesh sizes, computational time steps and applicable heat transfer correlations were examined. The effect of the break location on circulation patterns that develop inside the containment was also evaluated. The GOTHIC code was then employed to assess the effectiveness of several original heat rejection features that make it possible to cool high-rating containments. Two containment concepts were evaluated: one for a 1200 MW(e) new pressure tube light-water reactor, and one for a 1300 MW(e) pressurized-water reactor. The effectiveness of various containment configurations that include specific pressure-limiting features has been predicted. For the worst-case accident scenarios that were examined, the best-performing configurations yielded peak pressures of less than 0.30 MPa for the 1200 MW(e) pressure tube light-water reactor, and less than 0.45 MPa for the 1300 MW(e) pressurized-water reactor. (orig.)

  19. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
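
    One of the properties mentioned above, comma-freeness, is simple enough to test directly: a set of codons is comma-free if no codon can be read in a shifted frame across the concatenation of any two codons from the set. The checker below follows that textbook definition and is not GCAT's implementation; the example codon sets are arbitrary.

      def is_comma_free(codons):
          """True if no codon appears in a shifted frame across any two concatenated codons."""
          codons = set(codons)
          n = len(next(iter(codons)))
          for u in codons:
              for v in codons:
                  pair = u + v
                  for shift in range(1, n):               # frames straddling the boundary
                      if pair[shift:shift + n] in codons:
                          return False
          return True

      print(is_comma_free({"ACG", "TCG"}))   # True: no shifted reading lands back in the set
      print(is_comma_free({"AAA"}))          # False: "AAA" + "AAA" reads "AAA" in every frame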

  20. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants have been built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME Code which reflect Subsection NF are identified and compared with the later Code editions and addenda, especially up to and including the 1974 ASME Code used as the basis for the plant qualification. The concern of the regulatory agencies is that if later code allowables and provisions are adopted, it is possible to reduce the safety margins of the construction code. Areas of concern are highlighted, and the specific changes of later codes are discerned whose adoption would not sacrifice the intended safety margins of the codes to which plants are licensed

  1. The hue of concepts.

    Science.gov (United States)

    Albertazzi, Liliana; Canal, Luisa; Malfatti, Michela; Micciolo, Rocco

    2013-01-01

    The study shows a systematic, naturally biased association between percepts and concepts. Specifically, it shows that a series of terms pertaining to an abstract semantic field (related to the frame of ethics in social behaviour) has a nonrandom, highly significant association with colours (hues). This is the first time that consistent associations between abstract terms and colours have been reported in the general population. The main hypothesis, i.e. that there appear to be 'hues of concepts', was borne out by the results: the abstract terms considered, as well as their synonyms, were coloured with blue/green (i.e. cool) colours, while their antonyms were coloured with red/yellow (i.e. warm) colours. The association provides information about the nature of abstract concepts and their relationship with perception. It also sheds light on the interrelations among words in semantic domains that, to date, have been studied from only a computational viewpoint.

  2. Special issue on network coding

    Science.gov (United States)

    Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly

    2017-12-01

    Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as data multi-view streaming of video or underwater sensor networks. One can find papers that show how NC turns data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.

  3. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  4. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  5. A Semantic Analysis Method for Scientific and Engineering Code

    Science.gov (United States)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
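
    A stand-alone illustration of the kind of semantic fact such an analysis can establish is given below: physical dimensions declared for primitive variables are propagated through expressions, and a dimensionally inconsistent statement is reported as a likely semantic error. This is not the paper's parser, only a sketch of the underlying idea, with invented variable names and units.

      def mul(a, b):
          """Dimensions of a product: add the unit exponents."""
          out = dict(a)
          for unit, exp in b.items():
              out[unit] = out.get(unit, 0) + exp
              if out[unit] == 0:
                  del out[unit]
          return out

      def add(a, b):
          """Addition is only meaningful between identical dimensions."""
          if dict(a) != dict(b):
              raise TypeError(f"semantic error: adding {a} to {b}")
          return dict(a)

      # Semantic declarations for primitive variables (exponents of base units).
      mass = {"kg": 1}
      acceleration = {"m": 1, "s": -2}

      force = mul(mass, acceleration)           # consistent: kg * m * s^-2
      print("force has dimensions", force)

      try:
          add(mass, acceleration)               # flagged as a likely semantic error
      except TypeError as err:
          print(err)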

  6. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  7. Understanding pressure: didactical transpositions and pupils' conceptions

    Science.gov (United States)

    Kariotogloy, P.; Psillos, D.; Vallassiades, O.

    1990-03-01

    Using the concept of pressure, two research trends, content analysis and pupils' conceptions of subject matter, are drawn together in an attempt to understand the issues in teaching and learning specific domains of physics.

  8. Concepts of Integration for UAS Operations in the NAS

    Science.gov (United States)

    Consiglio, Maria C.; Chamberlain, James P.; Munoz, Cesar A.; Hoffler, Keith D.

    2012-01-01

    One of the major challenges facing the integration of Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) is the lack of an onboard pilot that can comply with the legal requirement identified in the US Code of Federal Regulations (CFR) that pilots see and avoid other aircraft. UAS will be expected to demonstrate the means to perform the function of see and avoid while preserving the safety level of the airspace and the efficiency of the air traffic system. This paper introduces a Sense and Avoid (SAA) concept for integration of UAS into the NAS that is currently being developed by the National Aeronautics and Space Administration (NASA) and identifies areas that require additional experimental evaluation to further inform various elements of the concept. The concept design rests on interoperability principles that take into account both the Air Traffic Control (ATC) environment as well as existing systems such as the Traffic Alert and Collision Avoidance System (TCAS). Specifically, the concept addresses the determination of well clear values that are large enough to avoid issuance of TCAS corrective Resolution Advisories, undue concern by pilots of proximate aircraft and issuance of controller traffic alerts. The concept also addresses appropriate declaration times for projected losses of well clear conditions and maneuvers to regain well clear separation.
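
    The geometric core of a well clear determination can be illustrated with a closest-point-of-approach calculation, sketched below. The thresholds, units and encounter geometry are invented for illustration; this is not the SAA concept's actual algorithm or its threshold values.

      import math

      def cpa(rel_pos, rel_vel):
          """Time to horizontal closest point of approach and the miss distance there (2D)."""
          rx, ry = rel_pos
          vx, vy = rel_vel
          v2 = vx * vx + vy * vy
          t = 0.0 if v2 == 0.0 else -(rx * vx + ry * vy) / v2
          t = max(t, 0.0)                               # diverging traffic: closest point is now
          return t, math.hypot(rx + vx * t, ry + vy * t)

      # Intruder 2.5 NM ahead with a 0.4 NM lateral offset, closing at 0.05 NM/s.
      t_cpa, miss = cpa(rel_pos=(2.5, 0.4), rel_vel=(-0.05, 0.0))
      WELL_CLEAR_NM, DECLARATION_S = 1.0, 60.0          # made-up thresholds
      if miss < WELL_CLEAR_NM and t_cpa < DECLARATION_S:
          print(f"declare projected loss of well clear: {miss:.2f} NM in {t_cpa:.0f} s")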

  9. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach that focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...
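
    The setting can be illustrated with the smallest classic instance, sketched below with invented message values: each receiver wants one message and already holds the others as side information, so a single linear (XOR) combination of the messages satisfies all demands at once. This is a generic example, not the coding scheme proposed in the paper.

      x = {1: 0b1010, 2: 0b0111, 3: 0b1100}           # messages held by the sender

      broadcast = x[1] ^ x[2] ^ x[3]                   # a single coded transmission

      side_info = {1: (2, 3), 2: (1, 3), 3: (1, 2)}    # receiver i already knows these
      for receiver, known in side_info.items():
          decoded = broadcast
          for j in known:
              decoded ^= x[j]                          # strip off what is already known
          assert decoded == x[receiver]
          print(f"receiver {receiver} decodes {decoded:04b}")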

  10. Faith: a concept analysis.

    Science.gov (United States)

    Dyess, Susan Macleod

    2011-12-01

    This paper reports a concept analysis of faith. There are numerous scholars who consider spirituality and religiosity as they relate to health and nursing. Faith is often implied as linked to these concepts but deserves distinct exploration. In addition, as nursing practice conducted within communities of faith continues to emerge, concept clarification of faith is warranted. Qualitative analysis deliberately considered the concept of faith within the lens of Margaret Newman's health as expanding consciousness. Data sources used included a secondary analysis of stories collected within a study conducted in 2008, two specific reconstructed stories, the identification of attributes noted within these various stories and selected philosophical literature from 1950 to 2009. A definition was identified from the analysis: faith is an evolving pattern of believing that grounds and guides authentic living and gives meaning in the present moment of inter-relating. Four key attributes of faith were also identified as focusing on beliefs, foundational meaning for life, living authentically in accordance with beliefs, and interrelating with self, others and/or Divine. Although a seemingly universal concept, faith was defined individually. Faith appeared to be broader than spiritual practices and religious ritual and became the very foundation that enabled human beings to make sense of their world and circumstances. More work is needed to understand how faith community nursing can expand the traditional understanding of denominationally defined faith community practices and how nurses can support faith for the individuals whom they encounter within all nursing practice. © 2011 Blackwell Publishing Ltd.

  11. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  12. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.
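
    As background for readers, the standard Majorana operator relations assumed in such constructions (general definitions, not results of this paper) are

      \[
        c_j = c_j^{\dagger}, \qquad c_i c_j + c_j c_i = 2\,\delta_{ij}\,I, \qquad
        a_k = \tfrac{1}{2}\left(c_{2k-1} + i\,c_{2k}\right),
      \]

    so that 2n Majorana operators describe n ordinary fermionic modes; roughly speaking, the stabilizers of an MFC are commuting products of an even number of such operators.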

  13. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  14. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  15. From Concepts to Predicates within Constructivist Epistemology

    DEFF Research Database (Denmark)

    Badie, Farshad

    2017-01-01

    Constructivism is a philosophical approach that appears in a variety of guises, some of them pedagogical, some epistemological and some in complex combinations. This article is based on constructivist epistemology. More specifically, constructivist epistemology provides a ground for conceptual analysis of humans’ concept constructions, conceptions and concept learning processes. It will focus on conceptual specification and logical description of a flow from concepts to predicates.

  16. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  18. Analysis specifications for the CC3 geosphere model GEONET

    International Nuclear Information System (INIS)

    Melnyk, T.W.

    1995-04-01

    AECL is assessing a concept for disposing of Canada's nuclear fuel waste in a sealed vault deep in plutonic rock of the Canadian Shield. A computer program has been developed as an analytical tool for the postclosure assessment. For the case study, a system model, CC3 (Canadian Concept, generation 3), has been developed to describe a hypothetical disposal system. This system model includes separate models for the engineered barriers within the disposal vault, the geosphere in which the vault is emplaced, and the biosphere in the vicinity of any discharge zones. The system model is embedded within a computer code, SYVAC3 (SYstems Variability Analysis Code, generation 3), which takes parameter uncertainty into account by repeated simulation of the system. GEONET (GEOsphere NETwork) is the geosphere model component of this system model. It simulates contaminant transport from the vault to the biosphere along a transport network composed of one-dimensional transport segments that are connected together in three-dimensional space. This document is a set of specifications for GEONET that were developed over a number of years. Improvements to the code will be based on revisions to these specifications. The specifications consist of a model synopsis, describing all the relevant equations and assumptions used in the model, a set of formal data flow diagrams and minispecifications, and a data dictionary. (author). 26 refs., 20 figs

  19. Coherent concepts are computed in the anterior temporal lobes.

    Science.gov (United States)

    Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J

    2010-02-09

    In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.

  20. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes. QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.
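
    As a quick, concrete illustration of creating a code programmatically (assuming the widely used third-party Python packages qrcode and pillow, which are not mentioned in the blurb above):

      # Minimal sketch: generate a QR code image pointing at a mobile-friendly page.
      # Requires: pip install qrcode pillow  (illustrative package choice).
      import qrcode

      img = qrcode.make("https://example.com/mobile-landing-page")
      img.save("campaign_qr.png")   # print this image on flyers, packaging, etc.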

  1. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  2. Efficient Coding of Information: Huffman Coding

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x, written as ...
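
    Because the record above survives only as snippet fragments, a compact sketch of the textbook Huffman construction may help orient the reader; this is the generic greedy algorithm, not the article's own presentation.

      # Textbook Huffman coding: build an optimal prefix code from symbol counts.
      import heapq
      from collections import Counter

      def huffman_code(text):
          """Return a prefix-free {symbol: bitstring} table built by greedy merging."""
          counts = Counter(text)
          heap = [(weight, i, {sym: ""}) for i, (sym, weight) in enumerate(counts.items())]
          heapq.heapify(heap)
          if len(heap) == 1:                               # degenerate single-symbol input
              return {sym: "0" for sym in heap[0][2]}
          while len(heap) > 1:
              w1, _, left = heapq.heappop(heap)            # two least-frequent subtrees
              w2, tag, right = heapq.heappop(heap)
              merged = {s: "0" + c for s, c in left.items()}
              merged.update({s: "1" + c for s, c in right.items()})
              heapq.heappush(heap, (w1 + w2, tag, merged)) # tag keeps tuple comparisons valid
          return heap[0][2]

      text = "this is an example of a huffman tree"
      table = huffman_code(text)
      encoded = "".join(table[ch] for ch in text)
      print(table)
      print(f"{len(encoded)} bits with Huffman vs {8 * len(text)} bits with 8-bit ASCII")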

  3. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  4. SEVERAL OBSERVATIONS REGARDING THE REGULATION OF THE CONTRACT OF PARTNERSHIP IN THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    IOLANDA-ELENA CADARIU-LUNGU

    2012-05-01

    Following the model of the Italian Civil Code, of the Civil Code from Quebec, the Swiss and the Dutch ones, the new Romanian Civil Code has adopted the monist conception of regulating the private law relationships, gathering in the same normative act traditional civil law dispositions as well as dispositions that are specific to the commercial relationships among professionals. In this regulating context, one of the fundamental changes the new Civil Code brings is the unification of the legal regime applicable to civil and commercial contracts, with all the consequences that derive from this new legislative approach. This fundamental modification is first determined by the profound change of the character of social, economic and juridical relationships, by the change of the cultural level of the Romanian society, by the closeness of the two branches of civil and commercial law and, last but not least, by the evolution of the business environment. In this line of thought, we can identify important changes in the matter of the contract of partnership which, as regulated by the new Civil Code, constitutes the common law both for the simple partnerships (former civil societies) as well as for the commercial companies, to which the special legislation still in force in the matter still applies. In this study we aimed at analyzing the general common features of all associative forms listed by art. 1.888 Civil Code and the new elements in the matter, with critical observations where needed, which take the form of a comparison with the specific legislation in the field from the Civil Codes that served as a source of inspiration for the Romanian legislator.

  5. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup; Swanson, Robin; Heide, Felix; Wetzstein, Gordon; Heidrich, Wolfgang

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.
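
    For reference, the unsupervised CSC objective that such solvers address is usually written in the following standard form (not quoted from the paper); the consensus approach splits it across subsets of the training images under a shared-dictionary constraint:

      \[
        \min_{\{d_k\},\{z_k\}} \; \tfrac{1}{2}\Big\| x - \sum_{k=1}^{K} d_k * z_k \Big\|_2^2
        + \lambda \sum_{k=1}^{K} \| z_k \|_1
        \quad \text{s.t.} \quad \| d_k \|_2 \leq 1 ,
      \]

    where * denotes convolution, the d_k are the convolutional dictionary filters and the z_k are the sparse feature maps.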

  8. Developing Trustworthy Commissioned Officers: Transcending the Honor Codes and Concept

    Science.gov (United States)

    2012-10-01

    prima facie evidence that one has been honorable. This assumption is ... the staff and faculty at each commissioning source have an obligation to show Cadets, Midshipmen ... and belief in the competency, character, and commitment of an institution, organization, group, or individual to fulfill obligations and

  9. Novel Concepts for Device to Device Communication using Network Coding

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Hundebøll, Martin; Pedersen, Morten Videbæk

    2014-01-01

    Device-to-device communication is currently a hot research topic within 3GPP. Even though D2D communication has been part of previous ad hoc, meshed and sensor networks proposals, the main contribution by 3GPP is that the direct communication among two devices is carried out over a dynamically as...

  10. Cognitive Architecture of Common and Scientific Concepts

    Science.gov (United States)

    Tarábek, Paul

    2010-07-01

    The cognitive architecture of a concept is a specific structure consisting of the concept core, the concept periphery, the semantic frame as the meaning and sense of the concept, and the relations among all components of this structure. The model of the cognitive architecture of scientific and common concepts is a conceptual meta-model built upon Vygotsky's concept theory, Fillmore's semantic frame, the semantic triangle, widespread ideas about the structuring of conceptual systems, and Hestenes' Modeling Theory. A method of semantic mapping of concepts flowing from the model is designed.

  11. Electromagnetic reprogrammable coding-metasurface holograms.

    Science.gov (United States)

    Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang

    2017-08-04

    Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light poses a great challenge. Here, we try to tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities that may lead to advances in a variety of applications such as microscopy, display, security, data storage, and information processing. Realizing metasurfaces with reconfigurability, high efficiency, and control over phase and amplitude is a challenge. Here, Li et al. introduce a reprogrammable hologram based on a 1-bit coding metasurface, where the state of each unit cell of the coding metasurface can be switched electrically.
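
    The following is a toy numerical illustration of 1-bit (binary phase) coding using scalar Fourier optics; it is not the authors' metasurface design procedure, and the array size, target pattern, and thresholding rule are arbitrary choices.

      # Toy 1-bit hologram: back-propagate a target image, quantise the phase to
      # two states (0 and pi), and check how much energy the binary-coded
      # aperture still steers into the target region.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 64
      target = np.zeros((N, N))
      target[20:44, 30:34] = 1.0                              # simple bar-shaped target

      field = target * np.exp(1j * 2 * np.pi * rng.random((N, N)))
      holo_phase = np.angle(np.fft.ifft2(field))              # continuous phase profile
      coded = np.where(np.cos(holo_phase) >= 0, 0.0, np.pi)   # '0' -> 0, '1' -> pi

      recon = np.abs(np.fft.fft2(np.exp(1j * coded))) ** 2    # far-field intensity
      print("energy fraction landing in the target region:",
            round(float(recon[target > 0].sum() / recon.sum()), 2))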

  12. Development and application of the BOA code in Spain

    International Nuclear Information System (INIS)

    Tortuero Lopez, C.; Doncel Gutierrez, N.; Culebras, F.

    2012-01-01

    The BOA code makes it possible to quantitatively establish the level of risk of Axial Offset Anomaly and of increased crud deposition on the basis of the specific conditions of each case. For this reason, the code is parameterized according to the individual characteristics of each plant. This paper summarizes the results obtained in the implementation of the code, as well as its future prospects.

  13. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  14. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.
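
    As a reminder of why product constructions bring high minimum distance (a standard fact, not derived in the abstract above): the product of two component codes multiplies lengths, dimensions and minimum distances,

      \[
        (n_1, k_1, d_1) \times (n_2, k_2, d_2) \;\longrightarrow\; (n_1 n_2,\; k_1 k_2,\; d_1 d_2) .
      \]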

  15. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  16. Error Correcting Codes

    Indian Academy of Sciences (India)

    ... the Reed-Solomon code contained 223 bytes of data ... then you have a data storage system with error correction ... practical codes, storing such a table is infeasible, as it is generally too large.

  17. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  18. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  19. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  20. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  1. Digital color acquisition, perception, coding and rendering

    CERN Document Server

    Fernandez-Maloigne, Christine; Macaire, Ludovic

    2013-01-01

    In this book the authors identify the basic concepts and recent advances in the acquisition, perception, coding and rendering of color. The fundamental aspects related to the science of colorimetry in relation to physiology (the human visual system) are addressed, as are constancy and color appearance. It also addresses the more technical aspects related to sensors and the color management screen. Particular attention is paid to the notion of color rendering in computer graphics. Beyond color, the authors also look at coding, compression, protection and quality of color images and videos.

  2. Particle tracing code for multispecies gas

    International Nuclear Information System (INIS)

    Eaton, R.R.; Fox, R.L.; Vandevender, W.H.

    1979-06-01

    Details are presented for the development of a computer code designed to calculate the flow of a multispecies gas mixture using particle tracing techniques. The current technique eliminates the need for a full simulation by utilizing local time averaged velocity distribution functions to obtain the dynamic properties for probable collision partners. The development of this concept reduces statistical scatter experienced in conventional Monte Carlo simulations. The technique is applicable to flow problems involving gas mixtures with disparate masses and trace constituents in the Knudsen number, Kn, range from 1.0 to less than 0.01. The resulting code has previously been used to analyze several aerodynamic isotope enrichment devices

  3. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)

  4. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  5. Code of ethics for dental researchers.

    Science.gov (United States)

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined, and number 12, instead of the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful.

  6. User's manual for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    Energy Technology Data Exchange (ETDEWEB)

    Dougan, K D; Wojciechowski, L C

    1995-06-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is represented by the CC3 (Canadian Concept, generation 3) models comprising a design-specific vault, a site-specific geosphere and a biosphere typical of the Canadian Shield. (author). 23 refs., 7 tabs., 21 figs.

  7. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O'Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)]

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
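
    As a generic illustration of the Monte Carlo approach mentioned above (not the TRANSURANUS implementation, and with a made-up stand-in model), uncertain inputs are sampled from assumed distributions and the model is re-run many times to build up an output distribution:

      # Generic Monte Carlo uncertainty propagation sketch; the model and the
      # input distributions are fictitious placeholders.
      import random
      import statistics

      def toy_model(gap_conductance, linear_power):
          """Stand-in for a fuel rod model: returns a fictitious temperature."""
          return 500.0 + linear_power / gap_conductance

      samples = []
      for _ in range(10_000):
          h_gap = random.gauss(0.5, 0.05)       # assumed distribution, arbitrary units
          q_lin = random.gauss(25.0, 2.0)       # assumed distribution, arbitrary units
          samples.append(toy_model(h_gap, q_lin))

      print("mean =", round(statistics.mean(samples), 1),
            " std dev =", round(statistics.stdev(samples), 1))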

  8. General Monte Carlo code MONK

    International Nuclear Information System (INIS)

    Moore, J.G.

    1974-01-01

    The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described; geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)

  9. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures for the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.

  10. Organization of Risk Analysis Codes for Living Evaluations (ORACLE)

    International Nuclear Information System (INIS)

    Batt, D.L.; MacDonald, P.E.; Sattison, M.B.; Vesely, E.

    1987-01-01

    ORACLE (Organization of Risk Analysis Codes for Living Evaluations) is an integration concept for using risk-based information in United States Nuclear Regulatory Commission (USNRC) applications. Portions of ORACLE are being developed at the Idaho National Engineering Laboratory for the USNRC. The ORACLE concept consists of related databases, software, user interfaces, processes, and quality control checks allowing a wide variety of regulatory problems and activities to be addressed using current, updated PRA information. The ORACLE concept provides for smooth transitions between one code and the next without pre- or post-processing. (orig.)

  11. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  12. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful tool for productivity allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding the domain specific language into an existing language allows easy interoperability with non-domain-specific code and use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain specific code that has access to all the low-level functionality that C and C++ offer as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from using high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.

  13. Ethical Code Effectiveness in Football Clubs: A Longitudinal Analysis

    OpenAIRE

    Constandt, Bram; De Waegeneer, Els; Willem, Annick

    2017-01-01

    As football (soccer) clubs are facing different ethical challenges, many clubs are turning to ethical codes to counteract unethical behaviour. However, both in- and outside the sport field, uncertainty remains about the effectiveness of these ethical codes. For the first time, a longitudinal study design was adopted to evaluate code effectiveness. Specifically, a sample of non-professional football clubs formed the subject of our inquiry. Ethical code effectiveness was...

  14. Transcranial Direct Current Stimulation Targeting Primary Motor Versus Dorsolateral Prefrontal Cortices: Proof-of-Concept Study Investigating Functional Connectivity of Thalamocortical Networks Specific to Sensory-Affective Information Processing.

    Science.gov (United States)

    Sankarasubramanian, Vishwanath; Cunningham, David A; Potter-Baker, Kelsey A; Beall, Erik B; Roelle, Sarah M; Varnerin, Nicole M; Machado, Andre G; Jones, Stephen E; Lowe, Mark J; Plow, Ela B

    2017-04-01

    The pain matrix is comprised of an extensive network of brain structures involved in sensory and/or affective information processing. The thalamus is a key structure constituting the pain matrix. The thalamus serves as a relay center receiving information from multiple ascending pathways and relating information to and from multiple cortical areas. However, it is unknown how thalamocortical networks specific to sensory-affective information processing are functionally integrated. Here, in a proof-of-concept study in healthy humans, we aimed to understand this connectivity using transcranial direct current stimulation (tDCS) targeting primary motor (M1) or dorsolateral prefrontal cortices (DLPFC). We compared changes in functional connectivity (FC) with DLPFC tDCS to changes in FC with M1 tDCS. FC changes were also compared to further investigate its relation with individual's baseline experience of pain. We hypothesized that resting-state FC would change based on tDCS location and would represent known thalamocortical networks. Ten right-handed individuals received a single application of anodal tDCS (1 mA, 20 min) to right M1 and DLPFC in a single-blind, sham-controlled crossover study. FC changes were studied between ventroposterolateral (VPL), the sensory nucleus of thalamus, and cortical areas involved in sensory information processing and between medial dorsal (MD), the affective nucleus, and cortical areas involved in affective information processing. Individual's perception of pain at baseline was assessed using cutaneous heat pain stimuli. We found that anodal M1 tDCS and anodal DLPFC tDCS both increased FC between VPL and sensorimotor cortices, although FC effects were greater with M1 tDCS. Similarly, anodal M1 tDCS and anodal DLPFC tDCS both increased FC between MD and motor cortices, but only DLPFC tDCS modulated FC between MD and affective cortices, like DLPFC. Our findings suggest that M1 stimulation primarily modulates FC of sensory networks

  15. On Coding Non-Contiguous Letter Combinations

    Directory of Open Access Journals (Sweden)

    Frédéric Dandurand

    2011-06-01

    Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations) and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity.

  16. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
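
    For orientation, the classical Blahut-Arimoto iteration for the ordinary rate-distortion function (without actions or decoder side information) can be sketched as follows; the algorithm in the paper extends this kind of alternating update to the action-dependent setting.

      # Classical Blahut-Arimoto sketch for a binary source with Hamming distortion;
      # this is the standard algorithm, not the action-dependent variant of the paper.
      import numpy as np

      def blahut_arimoto(p_x, dist, beta, n_iter=200):
          """Return one (rate, distortion) point parameterised by the multiplier beta."""
          n_x, n_y = dist.shape
          q_y = np.full(n_y, 1.0 / n_y)                   # output marginal, start uniform
          for _ in range(n_iter):
              q_y_given_x = q_y * np.exp(-beta * dist)    # reweight by distortion
              q_y_given_x /= q_y_given_x.sum(axis=1, keepdims=True)
              q_y = p_x @ q_y_given_x                     # update the output marginal
          distortion = np.sum(p_x[:, None] * q_y_given_x * dist)
          rate = np.sum(p_x[:, None] * q_y_given_x * np.log2(q_y_given_x / q_y[None, :]))
          return rate, distortion

      p_x = np.array([0.5, 0.5])                          # uniform binary source
      dist = np.array([[0.0, 1.0], [1.0, 0.0]])           # Hamming distortion matrix
      print(blahut_arimoto(p_x, dist, beta=3.0))          # one point on the R(D) curve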

  17. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  18. Electrical, instrumentation, and control codes and standards

    International Nuclear Information System (INIS)

    Kranning, A.N.

    1978-01-01

    During recent years numerous documents in the form of codes and standards have been developed and published to provide design, fabrication and construction rules and criteria applicable to instrumentation, control and power distribution facilities for nuclear power plants. The contents of this LTR were prepared by NUS Corporation under Subcontract K5108 and provide a consolidated index and listing of the documents selected for their application to procurement of materials and design of modifications and new construction at the LOFT facility. These codes and standards should be applied together with the National Electrical Code, the ID Engineering Standards and LOFT Specifications to all LOFT instrument and electrical design activities

  19. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  20. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated; each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  1. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with the neutron transport equation and covers one-dimensional plane geometry problems, one-dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers the problems which can be solved, eigenvalue problems, the outer iteration loop, the inner iteration loop, and finite difference solution procedures. The input and output data for ANISN are also discussed. Two-dimensional problems, such as those treated by the DOT code, are given. Finally, an overview of Monte Carlo methods and codes is also presented.
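
    For reference, the steady-state, one-speed transport equation in one-dimensional plane (slab) geometry that such a chapter typically starts from can be written in the standard form (not quoted from the chapter itself)

      \[
        \mu \,\frac{\partial \psi(x,\mu)}{\partial x} + \Sigma_t(x)\,\psi(x,\mu)
        = \frac{1}{2}\int_{-1}^{1} \Sigma_s(x,\mu' \to \mu)\,\psi(x,\mu')\,d\mu'
        + \frac{S(x)}{2} ,
      \]

    where ψ is the angular flux, μ the direction cosine, Σ_t and Σ_s the total and scattering cross sections, and S an isotropic source.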

  2. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables

  3. Description and application of the AERIN Code at LLNL

    International Nuclear Information System (INIS)

    King, W.C.

    1986-01-01

    The AERIN code was written at the Lawrence Livermore National Laboratory in 1976 to compute the organ burdens and absorbed dose resulting from a chronic or acute inhalation of transuranic isotopes. The code was revised in 1982 to reflect the concepts of ICRP-30. This paper will describe the AERIN code and how it has been used at LLNL to study more than 80 cases of internal deposition and obtain estimates of internal dose. A comparison with the computed values of the committed organ dose is made with ICRP-30 values. The benefits of using the code are described. 3 refs., 3 figs., 6 tabs

  4. Practicing the Code of Ethics, finding the image of God.

    Science.gov (United States)

    Hoglund, Barbara A

    2013-01-01

    The Code of Ethics for Nurses gives a professional obligation to practice in a compassionate and respectful way that is unaffected by the attributes of the patient. This article explores the concept "made in the image of God" and the complexities inherent in caring for those perceived as exhibiting distorted images of God. While the Code provides a professional standard consistent with a biblical worldview, human nature impacts the ability to consistently act congruently with the Code. Strategies and nursing interventions that support development of practice from a biblical worldview and the Code of Ethics for Nurses are presented.

  5. Computer Code for Interpreting 13C NMR Relaxation Measurements with Specific Models of Molecular Motion: The Rigid Isotropic and Symmetric Top Rotor Models and the Flexible Symmetric Top Rotor Model

    Science.gov (United States)

    2017-01-01

    The flexible symmetric top rotor model superimposes an effective correlation time, τe, onto a symmetric top rotor to account for internal motion. The purpose is to specifically describe how simple 13C relaxation theory is used to quantitatively describe simple molecular motions. More detailed accounts of nuclear magnetic relaxation can be found in a number of basic textbooks (e.g., Farrar and Becker, 1971; Fukushima and Roeder, 1981; Harris, 1986).

  6. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input using a graphical user interface. Calls to platform-dependent (e.g., IMSL) functions were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable library structure by implementing new materials library data sets and repositioning the library values to external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  7. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  8. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  9. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  10. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  11. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)
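
    The power spectral density comparison mentioned above can be reproduced in spirit on any sampled load signal with a standard Welch estimate. The sketch below uses a synthetic signal and an assumed sampling rate, not ESI-80 data, and assumes numpy and scipy are available.

      import numpy as np
      from scipy.signal import welch

      fs = 50.0                                  # sampling rate in Hz (assumed)
      t = np.arange(0, 120, 1 / fs)              # two minutes of synthetic load data
      # Synthetic flap-bending-like load: a 1 Hz rotor component plus broadband noise.
      load = 10.0 * np.sin(2 * np.pi * 1.0 * t) + np.random.normal(0.0, 2.0, t.size)

      freq, psd = welch(load, fs=fs, nperseg=1024)   # averaged-periodogram PSD estimate
      print(f"dominant frequency ~ {freq[np.argmax(psd)]:.2f} Hz")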

  12. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
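
    The levelization property described above, packages forming a directed acyclic graph with each package using only lower-level packages, can be checked mechanically. Below is a minimal sketch with a hypothetical dependency map; the package names are illustrative and are not taken from the EAP code base.

      from graphlib import TopologicalSorter, CycleError

      # Hypothetical map: package -> set of lower-level packages it uses.
      uses = {
          "driver": {"hydro", "mesh"},
          "hydro": {"mesh", "utils"},
          "mesh": {"utils"},
          "utils": set(),
      }

      def levelize(dep_map):
          """Group packages by level, raising CycleError if the graph is not a DAG."""
          ts = TopologicalSorter(dep_map)
          ts.prepare()                        # detects cycles up front
          levels = []
          while ts.is_active():
              ready = sorted(ts.get_ready())  # packages whose dependencies are all done
              levels.append(ready)
              ts.done(*ready)
          return levels

      try:
          for depth, group in enumerate(levelize(uses)):
              print(f"level {depth}: {group}")
      except CycleError as err:
          print("package set is not levelized:", err)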

  13. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
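
    The final step described above, scanning the two design choices and selecting a minimum-cost combination, can be pictured as a simple grid search. The cost function below is a toy stand-in, not the code's actual costing of machined parts, raw materials and components.

      import numpy as np

      def system_cost(rise_time_ns, aspect_ratio):
          # Toy trade-off: fast rise times drive pulsed-power cost, extreme core
          # aspect ratios drive ferrite material cost (arbitrary illustrative numbers).
          pulsed_power = 80.0 / rise_time_ns + 0.4 * rise_time_ns
          core_material = 5.0 * (aspect_ratio - 2.5) ** 2 + 10.0
          return pulsed_power + core_material

      rise_times = np.linspace(10.0, 100.0, 46)     # ns, assumed design range
      aspect_ratios = np.linspace(1.5, 4.0, 26)     # assumed design range
      costs = np.array([[system_cost(rt, ar) for ar in aspect_ratios] for rt in rise_times])

      i, j = np.unravel_index(np.argmin(costs), costs.shape)
      print(f"minimum cost {costs[i, j]:.1f} at rise time {rise_times[i]:.0f} ns, "
            f"aspect ratio {aspect_ratios[j]:.2f}")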

  14. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  15. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  16. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  17. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  18. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  19. Sparsity in Linear Predictive Coding of Speech

    DEFF Research Database (Denmark)

    Giacobello, Daniele

    of the effectiveness of their application in audio processing. The second part of the thesis deals with introducing sparsity directly in the linear prediction analysis-by-synthesis (LPAS) speech coding paradigm. We first propose a novel near-optimal method to look for a sparse approximate excitation using a compressed...... one with direct applications to coding but also consistent with the speech production model of voiced speech, where the excitation of the all-pole filter can be modeled as an impulse train, i.e., a sparse sequence. Introducing sparsity in the LP framework will also bring to de- velop the concept...... sensing formulation. Furthermore, we define a novel re-estimation procedure to adapt the predictor coefficients to the given sparse excitation, balancing the two representations in the context of speech coding. Finally, the advantages of the compact parametric representation of a segment of speech, given...

  20. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. Those concepts are very new to most of the nuclear power plant designers who are naturally obliged to design to current codes and standards; the application of the concepts of the System Based Code to design will lead to an entire change of the practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design, both in terms of reliability and cost effectiveness, by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  1. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  2. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.

  3. A method for scientific code coupling in a distributed environment

    International Nuclear Information System (INIS)

    Caremoli, C.; Beaucourt, D.; Chen, O.; Nicolas, G.; Peniguel, C.; Rascle, P.; Richard, N.; Thai Van, D.; Yessayan, A.

    1994-12-01

    This guide book deals with the coupling of big scientific codes. First, the context is introduced: big scientific codes devoted to a specific discipline are coming to maturity, and there are more and more needs in terms of multi-discipline studies. Then we describe different kinds of code coupling and an example of code coupling: the 3D thermal-hydraulic code THYC and the 3D neutronics code COCCINELLE. With this example we identify the problems to be solved to realize a coupling. We present the different numerical methods usable for the resolution of coupling terms. This leads to defining two kinds of coupling: with weak coupling, we can use explicit methods, and with strong coupling we need to use implicit methods. In both cases, we analyze the link with the way the codes are parallelized. For the translation of data from one code to another, we define the notion of a Standard Coupling Interface based on a general structure for data. This general structure constitutes an intermediary between the codes, thus allowing a relative independence of the codes from a specific coupling. The proposed method for the implementation of a coupling leads to a simultaneous run of the different codes, while they exchange data. Two kinds of data communication with message exchange are proposed: direct communication between codes with the use of the PVM product (Parallel Virtual Machine) and indirect communication with a coupling tool. This second way, with a general code coupling tool, is based on a coupling method, and we strongly recommend using it. This method is based on the two following principles: re-usability, which means few modifications to existing codes, and the definition of a code usable for coupling, which leads to separating the design of a code usable for coupling from the realization of a specific coupling. This coupling tool, available from the beginning of 1994, is described in general terms. (authors). figs., tabs

  4. Overall simulation of a HTGR plant with the gas adapted MANTA code

    International Nuclear Information System (INIS)

    Emmanuel Jouet; Dominique Petit; Robert Martin

    2005-01-01

    Full text of publication follows: AREVA's subsidiary Framatome ANP is developing a Very High Temperature Reactor nuclear heat source that can be used for electricity generation as well as cogeneration, including hydrogen production. The selected product has an indirect cycle architecture which is easily adapted to all possible uses of the nuclear heat source. The coupling to the applications is implemented through an Intermediate Heat Exchanger. The system code chosen to calculate the steady-state and transient behaviour of the plant is based on the MANTA code. The flexible and modular MANTA code, originally a system code for all non-LOCA PWR plant transients, has been the subject of new developments to simulate all the forced-convection transients of a nuclear plant with a gas-cooled High Temperature Reactor, including specific core thermal-hydraulics and neutronics models, gas and water-steam turbomachinery, and the control structure. The gas-adapted MANTA code version is now able to model a complete HTGR plant with a direct Brayton cycle as well as indirect cycles. To validate these new developments, a MANTA model of a real plant with a direct Brayton cycle has been built, and steady states and transients compared with recorded thermal-hydraulic measurements. Finally, a comparison with the RELAP5 code has been made for transient calculations of the AREVA indirect-cycle HTR project plant. Moreover, to improve user-friendliness so that MANTA can be used as a system conception and design optimization tool as well as a plant simulation tool, a Man-Machine Interface is available. Acronyms: MANTA Modular Advanced Neutronic and Thermal hydraulic Analysis; HTGR High Temperature Gas-Cooled Reactor. (authors)

  5. Material report in support to RCC-MRX code 2010 stainless steel parts and products

    International Nuclear Information System (INIS)

    Ancelet, Olivier; Lebarbe, Thierry; Dubiez-Le Goff, Sophie; Bonne, Dominique; Gelineau, Odile

    2012-01-01

    This paper presents the Material Report dedicated to stainless steel parts and products issued by AFCEN (Association Francaise pour les regles de Conception et de Construction des Materiels des Chaudieres Electro-Nucleaires) in support of the RCC-MRx 2010 Code. The RCC-MRx Code is the result of the merger of the RCC-MX 2008, developed in the context of the Jules Horowitz research reactor project, into the RCC-MR 2007, which sets up rules applicable to the design of components operating at high temperature and to the Vacuum Vessel of ITER (a presentation of the RCC-MRx 2010 Code is the subject of another paper proposed in this Congress; it explains in particular the status of this Code). This Material Report is part of a set of Criteria of RCC-MRx (this set of Criteria is under construction). The Criteria aim at explaining the design and construction rules of the Code. They cover analysis rules as well as part procurement, welding, methods of tests and examination, and fabrication rules. The Material Report in particular provides justifications and explanations of requirements and features dealing with parts and products proposed in the Code. The Material Report contains the following information: introduction of the grade(s), with the codes and standards and Reference Procurement Specifications covering parts and products, applications and experience gained; physical properties; mechanical properties used for design calculations (base metal and welds), including basic, creep and irradiated mechanical properties; fabrication, covering experience gained and metallurgy; welding, covering weldability and experience gained during welding and repair procedure qualifications; non-destructive examination; and in-service behaviour. In the article, examples of the data supplied in the Material Report dedicated to stainless steels are presented. (authors)

  6. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn' in computing provides the base for place-specific computing as a suggested new genre of interaction design. In the REcult project place-specific computing is explored through design oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented indicating potentials, possibilities and problems as directions for future ...

  7. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  8. RADTRAN II: revised computer code to analyze transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.

    1982-10-01

    A revised and updated version of the RADTRAN computer code is presented. This code has the capability to predict the radiological impacts associated with specific schemes of radioactive material shipments and mode specific transport variables

  9. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  10. The computer code SEURBNUK-2

    International Nuclear Information System (INIS)

    Yerkess, A.

    1984-01-01

    SEURBNUK-2 has been designed to model the hydrodynamic development in time of a hypothetical core disruptive accident in a fast breeder reactor. SEURBNUK-2 is a two-dimensional, axisymmetric, eulerian, finite difference containment code. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method. SEURBNUK has a full thin shell treatment for tanks of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. An important feature of SEURBNUK is that the thin shell equations are solved quite separately from those of the fluid, and the time step for the fluid flow calculation can be an integer multiple of that for calculating the shell motion. The interaction of the shell with the fluid is then considered as a modification to the coefficients in the implicit pressure equations, the modifications naturally depending on the behaviour of the thin shell section within the fluid cell. The code is limited to dealing with a single fluid, the coolant, whereas the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK-2 calculations, and nine sample problems of varying degrees of complexity highlight the code capabilities. After explaining the output facilities, information is included to aid those unfamiliar with SEURBNUK-2 to avoid the common pit-falls experienced by novices.

  11. Examining moral thinking of adolescents through intermediate concepts

    Directory of Open Access Journals (Sweden)

    Frichand Ana

    2011-01-01

    Full Text Available This study examines moral thinking of adolescents through intermediate concepts. Intermediate concepts describe a level of analysis that falls between the general default schemas defined in Kohlberg's theory and specific ethical codes. They are related to the ability to identify good and bad actions and justifications in solving specific moral dilemmas. Participants were adolescent males and females in early, middle and late adolescence. The type of education, expressed antisocial behaviour and the primary group of socialization (family) were analyzed as well. The results indicate that the ability to identify good and bad actions and justifications is increasing with age. Female adolescents have higher scores on this ability than male adolescents. Individuals in late adolescence, who are concentrating more on moral values and principles during education, show higher ability in identifying bad actions and justifications. In middle adolescence those who exhibit antisocial behaviour have lower ability in identifying intermediate concepts, compared to their peers who do not show this type of behaviour. Similar results are true for those living in institutions for children without parents and parental care when compared to adolescents who are living with their parents.

  12. The integral fast reactor concept

    International Nuclear Information System (INIS)

    Chang, Yoon I.; Marchaterre, J.F.

    1987-01-01

    The Integral Fast Reactor (IFR) is an innovative liquid metal reactor concept being developed at Argonne National Laboratory. It seeks to specifically exploit the inherent properties of liquid metal cooling and metallic fuel in a way that leads to substantial improvements in the characteristics of the complete reactor system. The IFR concept consists of four technical features: (1) liquid sodium cooling, (2) pool-type reactor configuration, (3) metallic fuel, and (4) an integral fuel cycle, based on pyrometallurgical processing and injection-cast fuel fabrication, with the fuel cycle facility collocated with the reactor, if so desired. This paper gives a review of the IFR concept

  13. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Inc., Paso Robels, CA; Fluer, Larry (Fluer, Inc., Paso Robels, CA); Middleton, Bobby

    2009-03-01

    The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  14. SCORCH - a zero dimensional plasma evolution and transport code for use in small and large tokamak systems

    International Nuclear Information System (INIS)

    Clancy, B.E.; Cook, J.L.

    1984-12-01

    The zero-dimensional code SCORCH determines number density and temperature evolution in plasmas using concepts derived from the Hinton and Hazeltine transport theory. The code uses the previously reported ADL-1 data library

  15. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of a DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that the EDW code gives much better performance than the Hadamard and MFH codes.

  16. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
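
    For small parameters, the minimum distance of a linear code over $GF(q)$ can be verified by brute force, enumerating all $q^k$ codewords of a generator matrix. The sketch below uses the well-known ternary $[4,2,3]$ Hamming (tetra-)code as an illustrative example; the 22 new codes of the paper are not reproduced here.

      import itertools
      import numpy as np

      q = 3
      G = np.array([[1, 0, 1, 1],     # generator matrix of the ternary [4, 2, 3] code
                    [0, 1, 1, 2]])

      def minimum_distance(G, q):
          """Enumerate all q**k codewords and return the smallest nonzero Hamming weight."""
          k, n = G.shape
          best = n
          for msg in itertools.product(range(q), repeat=k):
              weight = np.count_nonzero(np.mod(np.array(msg) @ G, q))
              if 0 < weight < best:
                  best = weight
          return best

      print(minimum_distance(G, q))   # -> 3, confirming the [4, 2, 3]_3 parameters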

  18. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    Science.gov (United States)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

    The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.
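
    The tuning mechanism can be illustrated numerically: two identical cosinusoidal phase profiles shifted by equal and opposite amounts add to a single cosinusoidal phase whose amplitude, and hence the defocus sensitivity it induces, depends on the shift. The parameters below are arbitrary, and the sketch only demonstrates that trigonometric identity, not the Fisher-information optimization of the paper.

      import numpy as np

      x = np.linspace(-1.0, 1.0, 1001)    # normalized pupil coordinate
      a, f = 2.0, 3.0                     # phase amplitude (rad) and spatial frequency (assumed)

      def combined_phase(shift):
          """Sum of two identical cosinusoidal phase components moved by +/- shift."""
          return a * np.cos(2 * np.pi * f * (x + shift)) + a * np.cos(2 * np.pi * f * (x - shift))

      for d in (0.0, 0.05, 1.0 / (4.0 * f)):    # the last shift cancels the modulation entirely
          amplitude = 0.5 * np.ptp(combined_phase(d))   # equals 2*a*|cos(2*pi*f*d)|
          print(f"shift {d:.3f} -> effective phase amplitude {amplitude:.2f} rad")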

  19. Development of system of computer codes for severe accident analysis and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H S; Jeon, M H; Cho, N J. and others [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1992-01-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examination for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, the ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions for mitigation can lead to more disastrous results, and thus uncertain severe accident phenomena must be well recognized. There must be further research for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts.

  20. Development of system of computer codes for severe accident analysis and its applications

    International Nuclear Information System (INIS)

    Jang, H. S.; Jeon, M. H.; Cho, N. J. and others

    1992-01-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examination for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, the ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions for mitigation can lead to more disastrous results, and thus uncertain severe accident phenomena must be well recognized. There must be further research for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts.

  1. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  2. Neural Elements for Predictive Coding

    Directory of Open Access Journals (Sweden)

    Stewart SHIPP

    2016-11-01

    Full Text Available Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backwards in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forwards and backwards pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made

  3. Neural Elements for Predictive Coding.

    Science.gov (United States)

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural

  4. Improving the accuracy of operation coding in surgical discharge summaries

    Science.gov (United States)

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  5. An introduction to using QR codes in scholarly journals

    Directory of Open Access Journals (Sweden)

    Jae Hwa Chang

    2014-08-01

    Full Text Available The Quick Response (QR) code was first developed in 1994 by Denso Wave Incorporated, Japan. From that point on, it came into general use as an identification mark for all kinds of commercial products, advertisements, and other public announcements. In scholarly journals, the QR code is used to provide immediate direction to the journal homepage or to specific content such as figures or videos. Producing a QR code and printing it in the print version or uploading it to the web is very simple. Using a QR code-producing program, an editor can encode simple information such as a website address, and the QR code is then produced. A QR code is very stable, such that it can be used for a long time without loss of quality. Producing and adding QR codes to a journal costs nothing; therefore, to increase visibility, it is time for editors to add QR codes to their journals.
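
    As a concrete illustration of how little effort is involved, the widely used third-party Python package qrcode (an assumption for this sketch, not a tool named in the article) turns a journal URL into a printable image in a few lines.

      # pip install qrcode[pil]   (third-party package, assumed; needs Pillow for image output)
      import qrcode

      # Hypothetical landing page; in practice this would be the article DOI or journal homepage.
      url = "https://www.example-journal.org/doi/10.1234/example"

      img = qrcode.make(url)        # high-level helper returning a PIL image
      img.save("article-qr.png")    # ready to drop into the print layout or web page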

  6. A new concept of equivalent homogenization method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Pogoskekyan, Leonid; Kim, Young Il; Ju, Hyung Kook; Chang, Moon Hee [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The offered concept covers both those of K. Koebke and K. Smith; both of them can be simulated within the framework of the new concept. Also, the offered concept covers the Siemens KWU approach for baffle/reflector simulation, where the equivalent homogenized reflector XS are derived from the conservation of the response matrix at the interface in 1D semi-infinite slab geometry. The IM and XS of the new concept satisfy the same assumption about response matrix conservation in 1D semi-infinite slab geometry. It is expected that the new concept provides a more accurate approximation of a heterogeneous cell, especially in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation into existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO{sub 2}/MOX core simulation. The offered model has been incorporated in the finite difference code and in the nodal code PANDOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions. 9 figs., 7 refs. (Author).

  7. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  8. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  9. Radionuclide daughter inventory generator code: DIG

    International Nuclear Information System (INIS)

    Fields, D.E.; Sharp, R.D.

    1985-09-01

    The Daughter Inventory Generator (DIG) code accepts a tabulation of radionuclides initially present in a waste stream, specified as amounts present either by mass or by activity, and produces a tabulation of radionuclides present after a user-specified elapsed time. This resultant radionuclide inventory characterizes wastes that have undergone daughter ingrowth during subsequent processes, such as leaching and transport, and includes daughter radionuclides that should be considered in these subsequent processes or for inclusion in a pollutant source term. Output of the DIG code also summarizes radionuclide decay constants. The DIG code was developed specifically to assist the user of the PRESTO-II methodology and code in preparing data sets and accounting for possible daughter ingrowth in wastes buried in shallow-land disposal areas. The DIG code is also useful in preparing data sets for the PRESTO-EPA code. Daughter ingrowth in buried radionuclides, and in radionuclides that have been leached from the wastes and are undergoing hydrologic transport, is considered, and the quantities of daughter radionuclides are calculated. Radionuclide decay constants generated by DIG and included in the DIG output are required in the PRESTO-II code input data set. The DIG code accesses some subroutines written for use with the CRRIS system and accesses files containing radionuclide data compiled by D.C. Kocher. 11 refs
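
    The kind of ingrowth calculation DIG performs can be illustrated with the classic Bateman solution for a linear decay chain. The two-member chain and half-lives below are hypothetical, and the sketch is not drawn from DIG's own source or data files.

      import math

      def bateman(n1_0, lambdas, t):
          """Atoms of each chain member at time t, starting from n1_0 parent atoms and no daughters."""
          result = []
          for i in range(1, len(lambdas) + 1):
              lam = lambdas[:i]
              coeff = math.prod(lam[:-1])      # product of lambda_1 .. lambda_(i-1)
              total = 0.0
              for j in range(i):
                  denom = math.prod(lam[k] - lam[j] for k in range(i) if k != j)
                  total += math.exp(-lam[j] * t) / denom
              result.append(n1_0 * coeff * total)
          return result

      # Hypothetical chain: parent half-life 30 y, daughter half-life 5 y; decay constants in 1/y.
      lams = [math.log(2) / 30.0, math.log(2) / 5.0]
      print(bateman(1.0e20, lams, t=10.0))     # parent and ingrown daughter atoms after 10 years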

  10. Certification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Toffer, H.; Crowe, R.D.; Ades, M.J.

    1990-05-01

    A certification plan for computer codes used in Safety Analyses and Probabilistic Risk Assessment (PRA) for the operation of the Savannah River Site (SRS) reactors has been prepared. An action matrix, checklists, and a time schedule have been included in the plan. These items identify what is required to achieve certification of the codes. A list of Safety Analysis and Probabilistic Risk Assessment (SA ampersand PRA) computer codes covered by the certification plan has been assembled. A description of each of the codes was provided in Reference 4. The action matrix for the configuration control plan identifies code specific requirements that need to be met to achieve the certification plan's objectives. The checklist covers the specific procedures that are required to support the configuration control effort and supplement the software life cycle procedures based on QAP 20-1 (Reference 7). A qualification checklist for users establishes the minimum prerequisites and training for achieving levels of proficiency in using configuration controlled codes for critical parameter calculations

  11. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
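
    The encoder's syndrome-forming step can be pictured as multiplying the source block by a binary parity-check matrix. The tiny random H below is purely illustrative and is not the LDPC code, doping schedule or sum-product decoder used in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      H = rng.integers(0, 2, size=(4, 8))     # toy parity-check matrix: 4 syndrome bits per 8 source bits
      x = rng.integers(0, 2, size=8)          # source block seen at the encoder

      syndrome = (H @ x) % 2                  # bits sent to the decoder
      doping = x[:2]                          # a few uncoded "doping" bits, as the paper describes

      # The decoder would combine the syndrome, the doping bits and the correlated side
      # information in an iterative sum-product decoder; that part is omitted here.
      print("syndrome:", syndrome, "doping bits:", doping)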

  12. Sensory overload: A concept analysis.

    Science.gov (United States)

    Scheydt, Stefan; Müller Staub, Maria; Frauenfelder, Fritz; Nielsen, Gunnar H; Behrens, Johann; Needham, Ian

    2017-04-01

    In the context of mental disorders sensory overload is a widely described phenomenon used in conjunction with psychiatric interventions such as removal from stimuli. However, the theoretical foundation of sensory overload as addressed in the literature can be described as insufficient and fragmentary. To date, the concept of sensory overload has not yet been sufficiently specified or analyzed. The aim of the study was to analyze the concept of sensory overload in mental health care. A literature search was undertaken using specific electronic databases, specific journals and websites, hand searches, specific library catalogues, and electronic publishing databases. Walker and Avant's method of concept analysis was used to analyze the sources included in the analysis. All aspects of the method of Walker and Avant were covered in this concept analysis. The conceptual understanding has become more focused, the defining attributes, influencing factors and consequences are described and empirical referents identified. The concept analysis is a first step in the development of a middle-range descriptive theory of sensory overload based on social scientific and stress-theoretical approaches. This specification may serve as a fundament for further research, for the development of a nursing diagnosis or for guidelines. © 2017 Australian College of Mental Health Nurses Inc.

  13. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding both for advanced students as well as professionals with a background in speech audio and or digital signal processing. It provides a clear connection between the whys hows and whats thus enabling a clear view of the necessity purpose and solutions provided by various tools as well as their strengths and weaknesses in each respect Equivalently this book sheds light on the following perspectives for each technology presented Objective What do we want to achieve and especially why is this goal important Resource Information What information is available and how can it be useful and Resource Platform What kind of platforms are we working with and what are their capabilities restrictions This includes computational memory and acoustic properties and the transmission capacity of devices used. The book goes on to address Solutions Which solutions have been proposed and how can they be used to reach the stated goals and ...

  14. GASP: A computer code for calculating the thermodynamic and transport properties for ten fluids: Parahydrogen, helium, neon, methane, nitrogen, carbon monoxide, oxygen, fluorine, argon, and carbon dioxide. [enthalpy, entropy, thermal conductivity, and specific heat]

    Science.gov (United States)

    Hendricks, R. C.; Baron, A. K.; Peller, I. C.

    1975-01-01

    A FORTRAN IV subprogram called GASP is discussed which calculates the thermodynamic and transport properties for 10 pure fluids: parahydrogen, helium, neon, methane, nitrogen, carbon monoxide, oxygen, fluorine, argon, and carbon dioxide. The pressure range is generally from 0.1 to 400 atmospheres (to 100 atm for helium and to 1000 atm for hydrogen). The temperature ranges are from the triple point to 300 K for neon; to 500 K for carbon monoxide, oxygen, and fluorine; to 600 K for methane and nitrogen; to 1000 K for argon and carbon dioxide; to 2000 K for hydrogen; and from 6 to 500 K for helium. GASP accepts as input conditions any two of pressure, temperature, and density, or pressure together with either entropy or enthalpy. The properties available in any combination as output include temperature, density, pressure, entropy, enthalpy, specific heats, sonic velocity, viscosity, thermal conductivity, and surface tension. The subprogram design is modular so that the user can choose only those subroutines necessary to the calculations.

  15. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system

  16. Advanced thermionic reactor systems design code

    International Nuclear Information System (INIS)

    Lewis, B.R.; Pawlowski, R.A.; Greek, K.J.; Klein, A.C.

    1991-01-01

    An overall systems design code is under development to model an advanced in-core thermionic nuclear reactor system for space applications at power levels of 10 to 50 kWe. The design code is written in an object-oriented programming environment that allows the use of a series of design modules, each of which is responsible for the determination of specific system parameters. The code modules include a neutronics and core criticality module, a core thermal hydraulics module, a thermionic fuel element performance module, a radiation shielding module, a module for waste heat transfer and rejection, and modules for power conditioning and control. The neutronics and core criticality module determines critical core size, core lifetime, and shutdown margins using the criticality calculation capability of the Monte Carlo Neutron and Photon Transport Code System (MCNP). The remaining modules utilize results of the MCNP analysis along with FORTRAN programming to predict the overall system performance

  17. QR Code: An Interactive Mobile Advertising Tool

    Directory of Open Access Journals (Sweden)

    Ela Sibel Bayrak Meydanoglu

    2013-10-01

    Easy and rapid interaction between consumers and marketers, enabled by mobile technology, has prompted an increase in the use of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is the QR code (Quick Response code). Interactive advertising brings several advantages to the companies that apply it. For example, interaction with consumers provides significant information about consumers' preferences. Marketers can use information obtained from consumers for various marketing activities such as customizing advertisement messages, determining the target audience, and improving future products and services. QR codes used in marketing campaigns can provide links to specific websites in which, through various tools (e.g. questionnaires, voting), information about the needs and wants of customers is collected. The aim of this basic research is to illustrate the contribution of QR codes to the realization of the advantages gained by interactive advertising.
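
    The campaign pattern described above, a printed QR code that routes the consumer to a feedback page, can be sketched in a few lines. The example assumes the third-party Python "qrcode" package (with Pillow) is installed; the URL and file name are placeholders, not part of the study.

```python
# Minimal sketch: encode a link to a campaign survey page in a QR code image
# that can be placed on a printed advertisement.
import qrcode  # third-party package; assumed installed with Pillow support

survey_url = "https://example.com/campaign/feedback-survey"  # placeholder URL
img = qrcode.make(survey_url)   # build the QR symbol for the link
img.save("campaign_qr.png")     # image file to drop into the ad artwork
print("QR code written to campaign_qr.png ->", survey_url)
```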

  18. Advocacy: exploring the concept.

    Science.gov (United States)

    Mardell, A

    1996-10-01

    The concept of the nurse as the patient's advocate is one that has become popular in the last fifteen years or so in both North America and the United Kingdom, having its basis in nursing theory. The UKCC first embraced the concept, stating in the Code of Professional Conduct that nurses must 'act always in such a manner as to promote and safeguard the interests and well-being of patients and clients'. This is a laudable principle and one that nurses cannot dispute, as there are many members of our society who are weak and vulnerable and may be unable to speak up for themselves. But are nurses always in a position to be an advocate for their patients? As the nature of nursing is so diverse, the nature of advocacy will differ across the multifarious settings in which nurses practise. Can theatre nurses ever be in a position to act as an advocate for a patient who is often anaesthetised? What precisely is advocacy, and is the Concise Oxford Dictionary definition of 'one who pleads for another' appropriate in the nursing context? Then there is the position of nurses within the healthcare organisation in which they practise. In advocating for their patients, nurses may find themselves pleading a case for a patient, or a group of patients, that brings them into conflict with their medical colleagues or with the management of the organisation by which they are employed. Additionally, they may not possess the skills and knowledge to advocate effectively under such circumstances. Nursing is littered with the casualties of such conflicts over the years, the most publicised of whom, in the UK, was probably Graham Pink, who lost his job as a charge nurse after drawing public attention to what he considered to be an unacceptable standard of care in the hospital in which he worked.

  19. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter radiography in order to enhance the detectability of flaws. This spatial coding method involves positioning a mask with closed and open holes to selectively block or permit the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography.
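
    A toy one-dimensional example of the encode/decode principle is sketched below: a "scene" of point-like flaws is encoded through a binary open/closed mask and then recovered mathematically. The random mask and the Fourier-domain inverse filter are generic illustrations of spatial coding, not the specific aperture pattern or decoding algorithm used in the paper.

```python
# Toy 1-D coded-aperture demonstration: encode a scatter profile with a binary
# mask, then decode it by inverse filtering in the Fourier domain.
import numpy as np

rng = np.random.default_rng(0)
n = 64
scene = np.zeros(n)
scene[[10, 31, 45]] = [1.0, 2.0, 0.5]   # a few point-like flaw indications

# Random open(1)/closed(0) mask; redraw until its spectrum has no near-zero
# bins so the inverse filter below stays well conditioned.
while True:
    mask = rng.integers(0, 2, n).astype(float)
    M = np.fft.fft(mask)
    if np.min(np.abs(M)) > 1e-3:
        break

# Encoding: the detector sees the scene circularly convolved with the mask.
measured = np.real(np.fft.ifft(np.fft.fft(scene) * M))

# Decoding: divide by the mask spectrum to recover the scene.
decoded = np.real(np.fft.ifft(np.fft.fft(measured) / M))
print("max reconstruction error:", np.max(np.abs(decoded - scene)))
```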

  20. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which comprises mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, the recirculation system, dynamic pressure and level, and the control system. The Aztheca code is validated against plant data, as well as against the manufacturer's predictions, for steady-state reactor operation. To demonstrate that the model is also applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)
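
    The kind of coupling the abstract describes, neutron kinetics and heat transfer advanced together with feedback, can be illustrated with a deliberately simple stand-in: one-delayed-group point kinetics tied to a lumped fuel heat balance. None of the models or parameter values below are taken from the Aztheca code; they are generic textbook forms chosen only to show the coupled time-stepping structure.

```python
# Illustrative coupled kinetics / heat-transfer time loop (generic parameters,
# not Aztheca's models): a reactivity step drives power up until fuel
# temperature feedback limits the rise.
BETA, GEN_TIME, DECAY = 0.0065, 1.0e-4, 0.08  # delayed fraction, generation time (s), precursor decay (1/s)
ALPHA_T = -2.0e-5                             # fuel temperature reactivity coefficient (1/K)
P0, MCP, HA, T_COOL = 1.0e6, 5.0e4, 2.0e3, 560.0  # rated power (W), fuel heat capacity (J/K), conductance (W/K), coolant T (K)

n = 1.0                           # normalized power
c = BETA / (GEN_TIME * DECAY)     # steady-state precursor concentration
t_fuel = T_COOL + P0 / HA         # steady-state fuel temperature (K)
rho_ext, t_ref = 0.001, t_fuel    # external reactivity step and reference temperature

dt = 1.0e-4                       # time step (s)
steps = 100_000                   # 10 s of transient
for i in range(steps):
    rho = rho_ext + ALPHA_T * (t_fuel - t_ref)            # temperature feedback
    dn = ((rho - BETA) / GEN_TIME * n + DECAY * c) * dt   # point kinetics
    dc = (BETA / GEN_TIME * n - DECAY * c) * dt
    dT = (n * P0 - HA * (t_fuel - T_COOL)) / MCP * dt     # lumped fuel heat balance
    n, c, t_fuel = n + dn, c + dc, t_fuel + dT
    if (i + 1) % 20_000 == 0:                             # report every 2 s
        print(f"t = {(i + 1) * dt:4.1f} s  power = {n:.3f}  fuel T = {t_fuel:.1f} K")
```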